Manual backups
I am planning to back up my site using a regular shell. How should I download the backups to my local computer? Is there a better way to do it than scp?
Thank you
13 Replies
Does it affect your site while the files are being downloaded?
I don't think you can directly limit the bandwidth usage with rdiff-backup (it is possible, but impractical), but with rsync you have this option:
--bwlimit=KBPS          limit I/O bandwidth; KBytes per second
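A pull from the server with a cap might look something like this (user, host, and paths are placeholders, and 100 KB/s is just an example rate):

# copy the server-side backups down to the local machine, capped at 100 KB/s
rsync -avz --bwlimit=100 user@example.com:/home/user/backups/ /local/backups/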
As for what effect the backup will have, I guess it will mostly depend on how many resources the sites you are referring to need during the backup.
To summarize, my guess is: Not a lot, but possibly, yes.
If you Google duplicity and Amazon Web Services, you should be able to find a good example of the cron script to set up. If you can't find it, I can post mine. You can also encrypt the backups, so your private stuff stays private.
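A rough sketch of such a cron script (not mine verbatim; the bucket name, paths, and keys are placeholders you would replace):

#!/bin/sh
# credentials and passphrase; duplicity reads these from the environment
export AWS_ACCESS_KEY_ID=your-key-id
export AWS_SECRET_ACCESS_KEY=your-secret-key
export PASSPHRASE=your-passphrase        # used to GPG-encrypt the backup
# incremental, encrypted backup of the site directory to S3
duplicity /home/user/site s3+http://com.example.backup-bucket
# prune backup sets older than one month
duplicity remove-older-than 1M --force s3+http://com.example.backup-bucket
unset PASSPHRASE AWS_SECRET_ACCESS_KEY AWS_ACCESS_KEY_ID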
Can't you use duplicity with a local PC? Do you need Amazon S3?
Restore would take a few hours, as it's backed up over a DSL line with 512 kbps upload, but since I'm a noncommercial host, that's acceptable.
If you need fast backups, well, look into Linode's offer; S3 can be a double-edged sword.
LATE EDIT: Accidental double negation fixed.
@rsk:
gzip -9 --rsyncable
Whoa, thanks! Had never heard of the --rsyncable flag before.
@Vance:
@rsk: gzip -9 --rsyncable
Whoa, thanks! Had never heard of the --rsyncable flag before.
And it actually works quite well.
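For anyone wondering how it fits into a backup: the flag makes gzip restart its compression dictionary periodically, so a small change in the input only changes a small region of the output, and rsync can transfer just that delta. Something like this (paths and host are made up, and you need a gzip build that includes the flag, e.g. Debian's):

# build an rsync-friendly compressed tarball of the site
tar -cf - /home/user/site | gzip -9 --rsyncable > site-backup.tar.gz
# later runs only transfer the blocks that changed
rsync -av site-backup.tar.gz user@backuphost:/backups/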
I use rsync over ssh.
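For example (user, host, and paths are placeholders):

# mirror the remote site into a local backup directory over ssh
rsync -avz -e ssh user@example.com:/home/user/site/ ~/backups/site/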