Works with both MySQL and MariaDB (see compatibility below)
A mysqldump wrapper that creates full backups of a configured database; different backup scenarios can be configured.
- Highly configurable
  - Specify the mysqldump binary to be used
  - File output location
  - Dump data or schema config templates
  - Include or exclude tables per database
- Can make use of `~/.my.cnf`
- Optional retention (by count of old backups)
- Optional compression with gzip
- Optional upload to AWS S3
- Optional metrics push to a Prometheus Pushgateway
A sample configuration is provided in `app/conf.sample.json`. It demonstrates the following setup:
- Create a backup of a Zabbix DB with the following scenarios (jobs):
  - Schema
    - All tables
    - Backup enabled
    - Compression disabled
  - Config data
    - Excludes some heavy metric-data and other non-essential tables
    - Backup enabled
    - Compression enabled
  - Metric data
    - Includes some heavy metric-data-bearing tables
    - Backup disabled
- Use a custom mysqldump binary
- Rely on `~/.my.cnf` for the DB connection password
- Keep the last 3 backups locally
- Prometheus metric push enabled
- AWS upload enabled
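For orientation, here is a sketch of what `app/conf.sample.json` might look like, based on the description above. The key names come from the configuration reference below; the exact nesting of the connection settings and all concrete values (including the Zabbix table names) are illustrative assumptions, not a verbatim copy of the sample file. The `password` key is omitted because this scenario relies on `~/.my.cnf`:

```json
{
  "host": "localhost",
  "db": "zabbix",
  "user": "zabbix",
  "mysqldump_bin": "/usr/local/mysql/bin/mysqldump",
  "output_dir": "/data/db/backups",
  "keep_local_backups": 3,
  "prometheus": {
    "enabled": true,
    "host": "prometheus-pushgateway.pvc.com",
    "job": "zabbix_db_backup"
  },
  "aws": {
    "enabled": true,
    "bucket": "backup",
    "path": "zabbix-db",
    "access_key": "REPLACE_ME",
    "secret_key": "REPLACE_ME"
  },
  "jobs": [
    {
      "name": "db_schema",
      "type": "schema",
      "enabled": true,
      "compression": false
    },
    {
      "name": "db_config",
      "type": "data",
      "enabled": true,
      "compression": true,
      "exclude": ["history", "history_uint", "trends", "trends_uint"]
    },
    {
      "name": "db_metrics",
      "type": "data",
      "enabled": false,
      "compression": true,
      "include": ["history", "history_uint", "trends", "trends_uint"]
    }
  ]
}
```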
Configuration is in JSON, and the script expects it to be in the same directory, named `conf.json`. Any unspecified connection setting will be taken from the default `my.cnf` file locations.
- `host`: Optional. DNS/IP of the database connection. Example: `localhost`.
- `db`: Optional. Name of the database.
- `user`: Optional. User for the database connection.
- `password`: Optional. Password for the database connection.
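Taken together, a connection block might look like this. The placement of these keys at the top level of `conf.json` is an assumption (check `app/conf.sample.json` for the authoritative layout), and all values are placeholders:

```json
{
  "host": "localhost",
  "db": "zabbix",
  "user": "backup_user",
  "password": "REPLACE_ME"
}
```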
The `prometheus` key dict configures the optional metrics push to a Prometheus Pushgateway. More info: https://github.com/prometheus/pushgateway. All keys below are mandatory:

- `enabled`: JSON Boolean type. The `prometheus` key dict can be configured, but disabled this way.
- `host`: DNS/IP of the Prometheus Pushgateway. Example: `prometheus-pushgateway.pvc.com`.
- `job`: Job title for the Prometheus Pushgateway. Used in registry creation.
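A minimal sketch, assuming the dict nests under a top-level `prometheus` key; the host value is a placeholder and the job name mirrors the sample log below:

```json
{
  "prometheus": {
    "enabled": true,
    "host": "prometheus-pushgateway.pvc.com",
    "job": "zabbix_db_backup"
  }
}
```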
The `aws` key dict configures the optional upload to AWS S3. All keys below are mandatory:

- `enabled`: JSON Boolean type. The `aws` key dict can be configured, but disabled this way.
- `bucket`: AWS bucket name.
- `path`: AWS path within the bucket.
- `access_key`: AWS access key.
- `secret_key`: AWS secret key.
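A minimal sketch, assuming the dict nests under a top-level `aws` key. The bucket and path values mirror the sample log below; the credentials are placeholders:

```json
{
  "aws": {
    "enabled": true,
    "bucket": "backup",
    "path": "zabbix-db",
    "access_key": "REPLACE_ME",
    "secret_key": "REPLACE_ME"
  }
}
```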
- `mysqldump_bin`: Optional. Full file path of the `mysqldump` binary. Defaults to the `mysqldump` available in `$PATH`.
- `output_dir`: Mandatory. Specifies the output directory. Each run will have its own timestamped directory within.
- `keep_local_backups`: How many of the latest backups to keep (including the current one).
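For example (the binary path is a placeholder; the output directory matches the sample log below):

```json
{
  "mysqldump_bin": "/usr/local/mysql/bin/mysqldump",
  "output_dir": "/data/db/backups",
  "keep_local_backups": 3
}
```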
- `jobs`: Mandatory. JSON array of dicts, each describing a backup job. See the example above.
- `jobs.name`: Mandatory. Name for the backup job. This is later used in the Prometheus push, if enabled.
- `jobs.type`: Mandatory. See below for more info.
- `jobs.enabled`: Mandatory. JSON Boolean type. A job can be configured, but disabled this way.
- `jobs.compression`: Mandatory. JSON Boolean type. Compression of the dump for this job can be either enabled or disabled.
- `jobs.include`: JSON array of string values representing the table names to be included in the backup. Only these tables will be backed up!
- `jobs.exclude`: JSON array of string values representing the table names to be excluded from the backup. These tables will NOT be backed up!
This setting tells mysqldump what kind of backup we want. The currently predefined and allowed string values are `schema` and `data`:

- `schema`: will use the following mysqldump options: `--no-data --triggers --routines --events`
- `data`: will use the following mysqldump options: `--no-create-info --skip-triggers`
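For illustration, a `jobs` array with one job of each type. The job names mirror the sample log below; the excluded table names are typical heavy Zabbix history/trends tables and are assumptions:

```json
{
  "jobs": [
    {
      "name": "db_schema",
      "type": "schema",
      "enabled": true,
      "compression": true
    },
    {
      "name": "db_config",
      "type": "data",
      "enabled": true,
      "compression": true,
      "exclude": ["history", "history_uint", "trends", "trends_uint"]
    }
  ]
}
```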
Different log levels can be enabled with the optional `-l|--log-level <level>` argument. The default is the `info` level, which will log everything that is "info" or more severe, i.e. warnings, errors, etc.

Available levels:
- debug
- info
- warning
- error
- critical
Sample log with debug level:

```
*/nepms-backup-mysql/venv/bin/python */nepms-backup-mysql/app/app.py --log-level debug
2019-03-25 15:51:22,296.296 INFO Reading conf file "*/nepms-backup-mysql/app/conf.json"
2019-03-25 15:51:22,297.297 INFO Running dump for job: db_schema
2019-03-25 15:51:22,297.297 INFO Creating dump directory: /data/db/backups/2019-03-25_155122
2019-03-25 15:51:22,301.301 INFO Creating dump file: '/data/db/backups/2019-03-25_155122/db_schema.sql'
2019-03-25 15:51:24,654.654 INFO Finished creating dump file. Size: 154988 bytes. Duration: 2353 ms
2019-03-25 15:51:24,654.654 INFO Compressing file: '/data/db/backups/2019-03-25_155122/db_schema.sql'
2019-03-25 15:51:24,667.667 INFO Finished compressing file. Size: 15418 bytes. Duration: 13 ms
2019-03-25 15:51:24,667.667 INFO Job has 'exclude' tables
2019-03-25 15:51:24,667.667 INFO Running dump for job: db_config
2019-03-25 15:51:24,667.667 INFO Creating dump directory: /data/db/backups/2019-03-25_155122
2019-03-25 15:51:24,672.672 INFO Creating dump file: '/data/db/backups/2019-03-25_155122/db_config.sql'
2019-03-25 15:51:47,364.364 INFO Finished creating dump file. Size: 244031995 bytes. Duration: 22692 ms
2019-03-25 15:51:47,365.365 INFO Compressing file: '/data/db/backups/2019-03-25_155122/db_config.sql'
2019-03-25 15:51:53,162.162 INFO Finished compressing file. Size: 59153792 bytes. Duration: 5797 ms
2019-03-25 15:51:53,162.162 INFO Running AWS upload for '/data/db/backups/2019-03-25_155122'
2019-03-25 15:51:53,227.227 INFO Uploading to s3. Bucket: "backup", Source: "/data/db/backups/2019-03-25_155122/db_config.sql.gz", Destination: "zabbix-db/2019-03-25_155122/db_config.sql.gz"
2019-03-25 15:51:59,819.819 INFO Uploading to s3. Bucket: "backup", Source: "/data/db/backups/2019-03-25_155122/db_schema.sql.gz", Destination: "zabbix-db/2019-03-25_155122/db_schema.sql.gz"
2019-03-25 15:51:59,915.915 INFO Successfully uploaded to s3. Time taken: 6753 ms
2019-03-25 15:51:59,915.915 INFO Retention is enabled, checking for old dirs
2019-03-25 15:51:59,916.916 INFO Backup directory has 2 dirs. Starting cleanup.
2019-03-25 15:51:59,916.916 INFO Following directories will be removed: ['/data/db/backups/2019-03-25_155122', '/data/db/backups/2019-03-25_154610']
2019-03-25 15:51:59,920.920 INFO Following directory has been removed: '/data/db/backups/2019-03-25_155122'
2019-03-25 15:51:59,921.921 INFO Following directory has been removed: '/data/db/backups/2019-03-25_154610'
2019-03-25 15:51:59,921.921 INFO Prometheus is enabled
2019-03-25 15:51:59,921.921 INFO Sending stats to prometheus gateway
2019-03-25 15:51:59,921.921 INFO Sending data to Prometheus host: "prometheus-pushgateway.vpc2.mnw.no", job: "zabbix_db_backup"
2019-03-25 15:51:59,929.929 INFO Successfully sent data to Prometheus. Time taken: 0.008221864700317383 seconds

Process finished with exit code 0
```
Tested with MariaDB 5.5. Newer versions should work, but there is no guarantee at the moment.
- Add an option to specify a string of options to be used with the mysqldump binary
- Test with newer releases
Pretty straightforward.
- Clone the repo
- ???
- Profit
This repo also includes a pre-commit config in `.pre-commit-config.yaml`. If you know what it is and how to use it, you will also find `requirements-dev.txt` useful; it includes the packages you need to run the included pre-commit config.
Happy dumping!