Recently I developed a set of Perl modules and scripts to take backups (files and databases), and I uploaded the project to GitHub. In short, these modules can:
- Take full and incremental backups with tar, rotated each week, of an indicated directory on a remote machine. This requires configuring a user that can log in to the remote machine without a password (e.g. via SSH keys) and run the tar command there with sudo.
- Take mysqldumps: by default it dumps all the databases the user has permissions on, and you can exclude the ones you indicate.
- Take pg_dumps: it backs up the indicated databases, and you can exclude specific tables.
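To make the three backup types concrete, here is a rough sketch of the kind of commands such a tool runs under the hood. All paths, flags, and hostnames below are illustrative assumptions, not the project's exact invocations; the tar part is shown locally so it can actually be run.

```shell
# Start from a clean demo directory (illustrative paths).
rm -rf /tmp/bkdemo
mkdir -p /tmp/bkdemo/src /tmp/bkdemo/out
echo "v1" > /tmp/bkdemo/src/a.txt

# Full backup: the GNU tar snapshot file records what was archived.
tar --listed-incremental=/tmp/bkdemo/out/snap.snar \
    -czf /tmp/bkdemo/out/full.tar.gz -C /tmp/bkdemo/src .

# Incremental backup: only files changed since the snapshot are archived.
echo "v2" > /tmp/bkdemo/src/b.txt
tar --listed-incremental=/tmp/bkdemo/out/snap.snar \
    -czf /tmp/bkdemo/out/incr.tar.gz -C /tmp/bkdemo/src .

# Against a remote machine this would run over passwordless SSH with sudo,
# something along the lines of (hypothetical user/host/paths):
#   ssh backup@host "sudo tar --listed-incremental=/var/lib/backup/snap.snar \
#       -czf - /data" > data.tar.gz

# Database dumps (require the clients and valid credentials):
#   mysqldump --all-databases --ignore-table=somedb.excluded_table > mysql.sql
#   pg_dump -d somedb --exclude-table=big_table > somedb.sql
```

The snapshot file is what makes the weekly rotation work: starting a new week means starting from a fresh snapshot, so the first backup of the week is full and the rest are incremental.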
The scripts store the backups in a base directory, which is set by the installation script and in the configuration files (in JSON format), and a retention period in days controls when old backups are deleted. Each time a backup finishes, the tool sends a mail report with how long the backup took, the backups currently available on the system, and whether any backup was rotated. A local MTA must be configured to send these emails. The installation script (install.sh) installs the required Perl dependencies, along with the Perl modules and scripts needed to run the backups.
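As an idea of what a per-machine JSON configuration could look like, here is a minimal hypothetical fragment; the key names are my own invention for illustration and will not match the project's actual schema:

```json
{
  "base_dir": "/backups",
  "retention_days": 30,
  "mail_to": "admin@example.com",
  "machines": {
    "web01": {
      "directories": ["/etc", "/var/www"],
      "mysql": { "exclude_databases": ["test"] },
      "postgres": { "databases": ["appdb"], "exclude_tables": ["audit_log"] }
    }
  }
}
```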
Basically, with this set of modules we can take backups of remote machines: we can add as many machines as we want to back up and customize which databases, directories, and retention time we want for each one. You can access the project at this URL and learn more about its use:
I hope it may be useful for you, and of course you are free to comment if you see any error or possible improvement for this tool. In the future I'll improve the scripts to add more features depending on my needs.