Everyone has heard the mantra "make regular backups", but how many people really do it? And how is it done? I've just started using duplicity and the ftp space provided by my web host, Strato.de, to make my backups. I've also experimented with mondorescue, which was very handy for making bootable iso images of my server. Here's the recipe for duplicity + Debian Etch + ftp. What's your recipe?
Making full and incremental backups of a Debian Etch webserver using ftp and duplicity
These instructions are only slightly modified from this article on the lovely Drupal-powered Go2Linux.org.
- Install gnupg (it provides the gpg command used below):

apt-get update
apt-get install gnupg

- Install duplicity:

apt-get install duplicity

- Generate a key:

gpg --gen-key
This command will take you through a series of questions which you must answer. It will ask you for a passphrase. Remember it! Write it down somewhere so that you never forget it; otherwise you won't be able to use your backup to restore any data.
Furthermore, during key generation gpg uses system entropy to generate random numbers. An idle system doesn't generate enough entropy to satisfy the program's need for randomness. I had to open a second shell and run grep -r "Hello World" / in order to make the system active enough to generate entropy. If you run into this problem, just open a shell, type that grep command, and wait for gpg to finish. It will eventually notice and keep going.
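If you want to check whether entropy really is the bottleneck before resorting to the grep trick, the Linux kernel exposes its current entropy estimate under /proc. A quick, Linux-specific check:

```shell
# Read the kernel's current entropy estimate (in bits). Values in the
# low hundreds or below usually mean gpg --gen-key will stall.
entropy=$(cat /proc/sys/kernel/random/entropy_avail)
echo "Available entropy: $entropy bits"
```

Run it a couple of times while gpg is waiting; if the number stays low, generate some disk or keyboard activity.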
Look at your available keys.
gpg --list-key
The results will resemble this:
/root/.gnupg/pubring.gpg
------------------------
pub 1024D/67F85ABC 2007-07-04
uid Robert Douglass (Hello World) <robert@example.com>
sub 1024g/6C
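The pub line carries the key ID (67F85ABC in this listing). If you'd rather extract it programmatically, say for use in a script, here is a small sed sketch; the sample listing is hard-coded to mirror the output above, and in real use you'd pipe gpg --list-key into it instead:

```shell
# Pull the key ID out of a gpg --list-key style listing.
# The sample text here stands in for real gpg output.
gpg_output='pub   1024D/67F85ABC 2007-07-04
uid   Robert Douglass (Hello World) <robert@example.com>'
key_id=$(printf '%s\n' "$gpg_output" | sed -n 's|^pub *[0-9]*[A-Za-z]/\([0-9A-F]*\).*|\1|p')
echo "$key_id"
```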
You're interested in the 67F85ABC bit: that's the ID of the key duplicity will use to encrypt and sign the backups. Now create a backup script, e.g. /usr/local/sbin/duplicity-backup:
#!/bin/sh
# Passphrase for the gpg key generated above, and the ftp account password.
# Since secrets are hard-coded here, the script should be readable by root only.
export PASSPHRASE=Ididntforgetmypassphrase
export FTP_PASSWORD=s3kret
# Back up everything from / except virtual and temporary file systems,
# encrypting and signing with the key from above, then mail the report.
duplicity --encrypt-key "67F85ABC" --sign-key "67F85ABC" \
--exclude /sys \
--exclude /mnt \
--exclude /tmp \
--exclude /proc \
--exclude /dev \
/ ftp://username@ftpserver/directory \
| mail -s "duplicity backup report" robert@example.com
Note that the second-to-last line of the script holds the from => to part of the command. In my case, since I want to back up the entire server, the from part is / (the root of the file system) and the to part is the ftp connection string for my ftp server. One GOTCHA I ran into here was the directory bit: I tried without it and got an error. At a minimum you have to provide / as the remote directory.
The last line of the script pipes duplicity's output to the mail program, which sends an email to robert@example.com with the results of the backup. This is how you can keep track of whether your backups are running smoothly.
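By default duplicity makes an incremental backup when a full backup already exists at the target, and a full one otherwise; that's what makes this script suitable for a nightly job. You can also force a full backup explicitly with the full action keyword. A sketch, reusing the same placeholder credentials, key ID and URL as the script above:

```shell
# Force a full backup (instead of the default incremental) from time to
# time, so the incremental chain doesn't grow forever.
# Placeholder passphrase, password, key ID and URL -- use your own values.
export PASSPHRASE=Ididntforgetmypassphrase
export FTP_PASSWORD=s3kret
duplicity full --encrypt-key "67F85ABC" --sign-key "67F85ABC" \
  --exclude /sys --exclude /mnt --exclude /tmp \
  --exclude /proc --exclude /dev \
  / ftp://username@ftpserver/directory
```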
Make sure your script is executable:
chmod u+x /usr/local/sbin/duplicity-backup
Then run it once by hand:

cd /usr/local/sbin
./duplicity-backup
It took a while to run, but I was able to confirm that duplicity was sending things to ftp by logging into the ftp server and looking around. The next step would be for me to restore something... I'll do that next.

Automate it! Since I don't want to run this command every time I want a backup made, I've set up a cron task to run the script for me every night at 3:15 AM.
crontab -e
... and then enter something like this:
15 3 * * * /usr/local/sbin/duplicity-backup > /dev/null 2>&1
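Before trusting the cron job, it's worth testing a restore. With duplicity you reverse the arguments: the backup URL becomes the source and a local directory the destination (duplicity refuses to write into an existing directory unless you pass --force). A sketch with the same placeholder values as above:

```shell
# Pull the most recent backup down from the ftp server into /tmp/restore.
export PASSPHRASE=Ididntforgetmypassphrase
export FTP_PASSWORD=s3kret
duplicity ftp://username@ftpserver/directory /tmp/restore

# A single file can be fetched with --file-to-restore; the path is
# relative to the root of the backup.
duplicity --file-to-restore etc/hostname \
  ftp://username@ftpserver/directory /tmp/hostname
```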
I'll get back with updates on how it's working out for me. In the meantime, has anyone got a successful Amazon S3 webserver backup working? What tools did you use? I spent a ton of time looking at things and nothing worked for me as well as this solution. I'd really like to utilize S3 for this task, though, so please share your experiences.