Home backup

One year ago, I did a major cleanup of my home computers and server. Part of that was getting good off-site backup, so a co-worker recommended IDrive. It worked fine, and for a good price, but for Ubuntu there was only a web-based UI and it didn't save previous versions of the backed-up files. Not horrible, but annoying.

A week or so ago I discovered that a crucial folder wasn't in the backup set (!), so I ran their local Perl scripts to add it. The script spat out a generic "Script error" along with a message to upgrade to a new version. I installed the new version, but it was even more broken:

Use of uninitialized value in string ne at Helpers.pm line 265.

I contacted them, and after a few days (holiday season) they emailed back that I could schedule a time with their back-end team. Again, not horrible, but it pushed me to look for a more home-grown solution.

Step 1: Copy local files to a network drive

Web server pages and wiki content get copied daily to a backup folder on a Drobo DAS volume: four 2 TB drives RAIDed to ~5.5 TB usable (check their handy capacity calculator). I'll need to upgrade to an actual NAS someday.
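The copy itself can be a one-liner per folder; here's a minimal sketch using rsync (the rsync approach, paths, and log tag are my illustration, matching the log format shown further down):

#!/bin/bash
# www_backup: mirror web and wiki content to the Drobo volume.
# Paths are illustrative; --delete keeps the mirror exact, which is
# safe here because the versioned history lives in the duplicity step.
log() { echo "$(date '+%Y-%m-%d %H:%M:%S') [www_backup] $*"; }

log "Backing up /var/www/site1 to /drobo/site1"
rsync -a --delete /var/www/site1/ /drobo/site1/
log "Finished www backup"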

Step 2: Export databases to a network drive

MySQL databases are exported with mysqldump every few hours, each dump named with the hour and gzipped (hour_database.db.gz). The hourly naming was a workaround for IDrive not keeping file history, so it can probably be dropped now, but it's nice to have quick access to a recent dump if needed.
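A sketch of that export, assuming credentials in ~/.my.cnf and made-up database names and paths:

#!/bin/bash
# db_backup: gzipped mysqldump of each database, named by hour so a
# day's worth of dumps rotates in place. Assumes ~/.my.cnf credentials.
log() { echo "$(date '+%Y-%m-%d %H:%M:%S') [db_backup] $*"; }

HOUR=$(date +%H)
for DB in database-a database-b; do
    OUT="${HOUR}_${DB}.db.gz"
    log "Exporting $DB to $OUT"
    mysqldump --single-transaction "$DB" | gzip > "/drobo/db/$OUT"
done
log "Finished database backup"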

Step 3: Generate versioned backups

I use duplicity to back up the web server folders, database exports, and various document, photo, development project, and music recording folders to another location on the Drobo volume. The first full backup is slow; subsequent incremental ones are quick.
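The duplicity invocation is basically a source directory and a file:// target; the extra flags here (periodic full backups, pruning old chains) are my guesses at sensible settings, not necessarily the ones I'd keep forever:

#!/bin/bash
# duplicity_backup: versioned backup to another spot on the Drobo.
# PASSPHRASE encrypts the archives; paths and retention are illustrative.
export PASSPHRASE="change-me"
log() { echo "$(date '+%Y-%m-%d %H:%M:%S') [duplicity_backup] $*"; }

log "Starting duplicity backup from /source to file:///dest"
duplicity --full-if-older-than 1M /source file:///dest
# Keep the two most recent full chains, pruning anything older.
duplicity remove-all-but-n-full 2 --force file:///dest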

Step 4: Copy versioned backups offsite

This is the goal. Duplicity can back up directly to an offsite location, but my initial setup uses rclone to do the copying. Rclone supports a billion types of storage (Amazon, FTP, Azure, …), but I decided to upgrade my Google Drive to 100 GB for ~$24 a year and use rclone's Google Drive backend.
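Assuming a remote named gdrive has already been set up with rclone config (the remote name, destination folder, and log tag are mine), the offsite copy is one command:

#!/bin/bash
# offsite_backup: push the duplicity archives to Google Drive.
# "gdrive" is whatever remote name you gave `rclone config`.
log() { echo "$(date '+%Y-%m-%d %H:%M:%S') [offsite_backup] $*"; }

log "Copying /drobo/duplicity to gdrive:backups"
rclone copy /drobo/duplicity gdrive:backups
log "Finished offsite copy"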

Each step is a crontab-ed script with tagged log output:

2019-01-02 05:00:29 [db_backup] Exporting database-a to 00_database-a.db.gz
2019-01-02 05:00:29 [db_backup] Finished database backup
2019-01-02 05:05:16 [www_backup] Backing up /var/www/site1 to /drobo/site1
2019-01-02 05:10:32 [www_backup] Finished www backup
2019-01-02 06:00:01 [duplicity_backup] Starting duplicity backup from /source to file:///dest
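The crontab behind those entries would look something like this (times chosen to match the log excerpt; script names and paths are illustrative):

# m h dom mon dow  command
# databases every six hours, web copy daily, duplicity after both
0 5,11,17,23 * * * /usr/local/bin/db_backup.sh >> /var/log/backup.log 2>&1
5 5 * * * /usr/local/bin/www_backup.sh >> /var/log/backup.log 2>&1
0 6 * * * /usr/local/bin/duplicity_backup.sh >> /var/log/backup.log 2>&1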

There’s room for improvement, but as a New Year’s upgrade I’m pretty happy.
