Looking for the best way to keep data on servers safe
We know we should all be doing it, but most of us don't do it enough.
I put out a request to my Twitter followers today asking how they keep the data on their servers backed up.
I've had the following services recommended to me by a number of fellow developers whose opinions I have a lot of respect for:
A couple of ex-Headshifters pointed me to Duplicity, a free tool that also looks very promising. Using it looks pretty straightforward:
duplicity /home/my_directory scp://email@example.com//usr/backup
A variant of the same command can be used to back up to Amazon S3 too, and there's a helpful blog post showing how to do this here, if you use Ubuntu or OS X.
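For what it's worth, here's a rough sketch of what the S3 variant looks like from my reading of the Duplicity docs - the bucket name and paths are placeholders, and the s3+http URL scheme and AWS credential variables are as I understand Duplicity expects them, so do check the post above before relying on this:

```shell
# Placeholder credentials and bucket name - substitute your own.
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"

# Back up a local directory to an S3 bucket:
duplicity /home/my_directory s3+http://my-backup-bucket

# Restoring just reverses the arguments - the destination comes second:
duplicity s3+http://my-backup-bucket /home/restored_directory
```

The nice thing is that restores use the same tool and the same syntax, so there's nothing extra to learn when you actually need your data back.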
One service that looks really interesting, though, is Tarsnap, as recommended by [Jon Gilbrath]. It's similarly simple to use, but it takes care of some of the more awkward backup issues and saves you having to set up your own S3 account.
# Create an archive named "mybackup" containing /usr/home and /other/stuff:
tarsnap -c -f mybackup /usr/home /other/stuff
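Getting data back out follows the familiar tar flags - this is a sketch based on the tar-style interface Tarsnap advertises, reusing the "mybackup" name from the example above:

```shell
# List all archives stored under this machine's key:
tarsnap --list-archives

# List the contents of the "mybackup" archive without extracting anything:
tarsnap -t -f mybackup

# Extract the archive into the current directory:
tarsnap -x -f mybackup
```

If you already know tar, there's essentially nothing new to learn here.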
The developer, Colin Percival, has also written quite extensively about how it works on his own blog. He's only making a few cents per gigabyte providing this service, yet it still looks to be viable for him to run - amazing.
It looks like it's almost exactly what I'm after - the only thing missing is the option to store backups inside the EU. This requirement mainly comes from the data protection concerns of previous clients, because the rules for processing and storing data in the EU are different from those in the US. Given the strength of the encryption, though, I'm not sure how much of an issue this really is these days.
I'd really appreciate some light shed on this, actually - technology is moving so much faster than the law these days, and it's frustrating not being able to take advantage of these kinds of services.
Anyway, that's what I've found. What do you use, and why?