Back up files to email?
All the files are in one /websites directory. Simple enough. And they're all relatively small.
I'd like to recursively zip or tar.gz that dir and immediately fire that file off to my email account (on gmail) once a day.
I can do the zipping. I can do the scheduling. I just can't do the email sending. Any tips there?
I'm also not sure if what I'm planning is a good idea. I think I'd rather have many duplicates in my Gmail account (I'm allowed 25 GB, and they're easy enough to clean up) than various rsync diffs that need a PhD to restore.
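For the sending step specifically, one old-school approach is to uuencode the archive into the body of a plain mail(1) message. A minimal sketch, assuming sharutils' uuencode is installed, the box has a working local MTA, and the address below is a placeholder:

# Minimal sketch: uuencode the archive and mail it.
# Assumes sharutils (uuencode) and a configured local MTA; address is a placeholder.
tar -zcf /tmp/websites.tar.gz /websites
uuencode /tmp/websites.tar.gz websites.tar.gz | mail -s "daily backup" you@example.com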
6 Replies
<?php
// Reconstructed head (the original post was cut off): assumes PHP's mail()
// with $to, $subject, $message, and $headers defined earlier in the script.
if (mail($to, $subject, $message, $headers)) {
    echo "The file was successfully sent!";
} else {
    die("Sorry but the email could not be sent. Please go back and try again!");
}
?>
#!/bin/bash
tar -zcf /tmp/backup.tar.gz /home/calvin/files
# Recipient was missing in the original post; you@example.com is a placeholder.
echo | mutt -a /tmp/backup.tar.gz -s "daily backup of data" -- you@example.com
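To run that once a day, a crontab entry along these lines should do it; the script path here is hypothetical, just for illustration:

# Hypothetical: assumes the script above was saved as /usr/local/bin/daily-backup.sh.
0 3 * * * /usr/local/bin/daily-backup.sh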
Just save it to /usr/local/bin/sendEmail and make it executable. It will give you usage if you run it without arguments.
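For reference, a sendEmail invocation through Gmail's SMTP might look something like this; every address and credential below is a placeholder, so substitute your own:

# Sketch only: placeholder addresses and credentials throughout.
sendEmail -f you@gmail.com -t you@gmail.com \
  -u "daily backup" -m "archive attached" \
  -a /tmp/backup.tar.gz \
  -s smtp.gmail.com:587 -o tls=yes \
  -xu you@gmail.com -xp yourpassword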
I think I won't be doing the GmailFS route for one main reason: I don't really want my server being able to access my Gmail. That's one of the main reasons I don't host my own email on my VPS; I don't want my entire life rooted if one of my VPSs gets rooted.
Edit: fixed the attachment.
Here's the command I'm firing off:
# rm -f so the chain doesn't abort on the first run, when sites.7z doesn't exist yet.
rm -f sites.7z &&
7z a -t7z -m0=lzma -mx=9 -xr@ex sites.7z /websites &&
echo | mutt -a sites.7z -s "VPS Backup" -- me@email-address.com
And I've got a file called ex that ignores things I couldn't care less about losing (for fairly obvious reasons):
.bzr
.svn
.git
*.pyc
amedia
logs
*.log
Switching from gzip to 7z cut the archive size by about 60%, getting the attachment well under sendmail's size limit.
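If the archive ever grows past the limit again, a size check before the mutt step would fail loudly instead of silently bouncing. A minimal sketch, assuming GNU stat and a 25 MB cap (Gmail's attachment limit):

# Guard: skip the mail step if the archive exceeds the attachment limit.
# Assumes GNU stat; the 25 MB cap matches Gmail's attachment limit.
max=$((25 * 1024 * 1024))
size=$(stat -c%s sites.7z)
if [ "$size" -gt "$max" ]; then
  echo "sites.7z is $size bytes, over the $max limit; not mailing" >&2
  exit 1
fi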