amuck-landowner

Help me with my backup plan?

Artie

Member
So, I'd like to get some backups going. I've realized that while I don't really need the data and could restart if I lost every single piece, it would be a pain in the ass to get going again. Seems smarter to avoid all that work by just setting up some backups. It's a gaming community, so it's not really mission critical or anything that could result in income loss for me.

Ideally I'd get a storage VPS from the numerous providers offering one and store all of the backups on it. The question is what technologies could I use to backup this stuff?

Gameservers

I don't have admin access on these machines, so the backups would have to be done via FTP. I'm not sure whether incremental backups are even possible over FTP, and I don't know of any FTP backup tooling.
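(For what it's worth, plain FTP can get you something close to incremental if the client compares timestamps. A minimal sketch using lftp's mirror mode; the host, user, and paths below are made up, but lftp is a real client and `--only-newer` skips files that haven't changed since the last run:)

```shell
#!/bin/sh
# Sketch: incremental-ish FTP pull with lftp (hypothetical host/user/paths).
# "mirror --only-newer" re-downloads only files whose timestamp/size changed,
# which is about as "incremental" as plain FTP gets.
FTP_HOST="${FTP_HOST:-ftp.example.com}"
FTP_USER="${FTP_USER:-gameuser}"
REMOTE_DIR="${REMOTE_DIR:-/serverfiles}"
LOCAL_DIR="${LOCAL_DIR:-/srv/backups/gameserver}"

CMD="lftp -u $FTP_USER $FTP_HOST -e \"mirror --only-newer --verbose $REMOTE_DIR $LOCAL_DIR; quit\""

# Printed rather than run, since credentials and network are site-specific;
# once it works by hand, drop the real command into cron.
echo "$CMD"
```

Run by hand first so you can sort out the password prompt, then cron it nightly.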

Website

I have root access to the machine hosting the website. It's powered by VestaCP, and the panel seems to generate backups. Should I back up its backups or have it done another way?

Databases

These are hosted off the website's machine. They're also included in the VestaCP backups, so I'm not sure if I should back them up separately or go the route of backing up VestaCP's backups, which include them.

Lots of things to think about, suggestions appreciated!
 

HalfEatenPie

The Irrational One
Retired Staff
rsnapshot is the first thing that comes to mind when I think about full-server backups (and I bet @KuJoe would agree?).
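(If you do go the rsnapshot route, the core of the config is only a few lines. A hedged fragment; the paths and host below are examples, and note that rsnapshot.conf requires tabs, not spaces, between fields:)

```
# /etc/rsnapshot.conf (fields separated by TABs, not spaces)
snapshot_root	/srv/snapshots/

# Keep 7 daily and 4 weekly rotations
retain	daily	7
retain	weekly	4

# What to pull; rsnapshot uses rsync over ssh under the hood
backup	root@web.example.com:/home/	webserver/
```

Then cron runs `rsnapshot daily` each night. Because rsnapshot hardlinks unchanged files between snapshots, seven dailies cost little more disk than one full copy.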

Honestly just ask KuJoe, I'd consider him the backup master around here since he has like 200 backups at any time. 

I'd suggest start with the backup "script" on SecureDragon's Knowledgebase: https://securedragon.net/clients/knowledgebase/20/HOW-TO-Automatically-backup-your-user-data-with-rsync.html

Back them up to several VMs.  In addition, have one VM regularly cron-zip the backups or something (and cycle them out).  

And then set up one of the VMs with Crashplan.  Then boom, you have a pretty good system.
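(The "cron-zip and cycle out" idea can be a tiny script. A sketch, safe to run as-is because the paths default to temp demo dirs; point SRC at the real backup directory in cron:)

```shell
#!/bin/sh
# Sketch: tar up the backup dir, keep N days of dated archives, delete older.
# Defaults are a temp demo; in cron you'd set SRC=/home/backup etc.
SRC="${SRC:-$(mktemp -d)}"            # what to archive
ARCHIVES="${ARCHIVES:-$(mktemp -d)}"  # where dated tarballs accumulate
KEEP_DAYS="${KEEP_DAYS:-14}"

echo "demo data" > "$SRC/site.sql"    # stand-in content for the demo

STAMP=$(date +%Y-%m-%d)
tar -czf "$ARCHIVES/backup-$STAMP.tar.gz" -C "$SRC" .

# Cycle out: drop archives older than KEEP_DAYS days.
find "$ARCHIVES" -name 'backup-*.tar.gz' -mtime +"$KEEP_DAYS" -delete

ls "$ARCHIVES"
```

Dated filenames plus `find -mtime +N -delete` is the simplest rotation there is; no state to track beyond the directory listing.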

Or go Googling.  Github has some awesome pre-made scripts (Search "Backup Script site:github.com" or something) that people use.  Go looking around there.  

As for the servers/VMs, since we're going more for redundancy than anything else, I'd suggest a Kimsufi (right now they're provisioning them with 2 TB HDDs instead of 1 TB) plus a RamNode/BuyVM/SecureDragon/Backupsy/NoDisto/Cloudshard/Catalyst/etc. VPS.  

Also check every once in a while to make sure you can recover from the backups.  
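(Checking "can I actually recover" can be as simple as comparing checksums between the live data and a restored copy. A sketch using temp demo dirs; point SRC and DST at your real data and a restored backup:)

```shell
#!/bin/sh
# Sketch: spot-check that a restored backup matches the source, file by file.
# Demo paths; in real use SRC is live data, DST a freshly restored backup.
SRC="${SRC:-$(mktemp -d)}"
DST="${DST:-$(mktemp -d)}"
echo "players table dump" > "$SRC/players.db"
cp "$SRC/players.db" "$DST/players.db"   # stand-in for a restored file

for f in "$SRC"/*; do
    name=$(basename "$f")
    src_sum=$(sha256sum "$f" | cut -d' ' -f1)
    dst_sum=$(sha256sum "$DST/$name" | cut -d' ' -f1)
    if [ "$src_sum" = "$dst_sum" ]; then
        echo "OK $name"
    else
        echo "MISMATCH $name"
    fi
done
```

Any MISMATCH (or missing file) means that backup would have let you down when you actually needed it.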

Worst case scenario, just upload it all to Amazon Glacier.  

Oh and it's not going to be the cheapest, but it should save your ass in the time of need.  

tldr: ramblings of a very tired man who just doesn't care to proofread
 

JahAGR

New Member
Vesta's built-in backup facility is alright. It's nice because you can restore one on a fresh installation and it will rebuild all the web configs and handle the databases for you (and it works -- I've used the backup thing to migrate sites to another server before). The backups are owned by the admin user so theoretically if a site gets compromised they won't be able to blow away the backups.

If the website on vesta is super important then I'd say maybe write your own backup routine just because you have much more control over what happens. Personally the stuff I'm using vesta for isn't mission-critical so I just rsync /home/backup to another server every night.
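(That nightly rsync can be a one-line cron entry. A sketch; the destination host and paths are made up:)

```
# root's crontab: push Vesta's backups offsite at 03:30 every night
30 3 * * * rsync -a --delete /home/backup/ backup@storage.example.com:/srv/vesta-backups/
```

`-a` preserves permissions and timestamps; `--delete` keeps the remote copy from accumulating archives Vesta has already rotated out. Pair it with an ssh key so it runs unattended.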
 

mojeda

New Member
For the VestaCP box I would just rsync the /home/backup directory to another offsite server, whether that's another VPS with large storage or a small dedicated server like a Kimsufi with a 2TB drive in it.

From the secondary box you can then use something like s3cmd, which can "rsync" everything to Amazon S3. You could then tell your S3 bucket to push backups to Glacier for anything older than, say, 2 weeks or 1 month (or however long). Remember, while Glacier lets you store a lot of data cheaply, having to "thaw" your backups is slow. S3 is still pretty cheap in my experience, and there's no real speed penalty if you need to pull down a backup that's only a few days old.
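(The S3 leg is one command once s3cmd is configured. A sketch; the bucket name and paths are made up, but `s3cmd sync` is real and, rsync-style, only uploads changed files:)

```shell
#!/bin/sh
# Sketch: push the local archive dir to S3 with s3cmd (hypothetical bucket).
ARCHIVES="${ARCHIVES:-/var/archives}"
BUCKET="${BUCKET:-s3://my-gameserver-backups}"

CMD="s3cmd sync $ARCHIVES/ $BUCKET/"

# Printed rather than run; needs s3cmd configured with your AWS keys first.
echo "$CMD"
```

The "older than 2 weeks goes to Glacier" part isn't done here: that's a lifecycle rule you set on the bucket itself (S3 console or AWS CLI), after which S3 transitions the objects on its own.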

Amazon S3 is nice for long-term storage, while your Kimsufi or other VPS can be used for immediate restores. It's also a backup of your backups should your secondary backup system fail.

Also remember no backup is a backup until you know it works. Make sure to check your backups and ensure everything is in order.
 

winnervps

New Member
Verified Provider
I heard hubiC is testing its Linux integration, so you might be able to try it via your FTP setup.

25 GB for Free
 

HalfEatenPie

The Irrational One
Retired Staff
Artie said:
"Backing up backups seems overkill for my needs. Also, nothing for the FTP part of my issue?"

Ahh, my bad man!

In terms of FTP, I'd suggest looking at these:

http://serverfault.com/questions/236434/incremental-backup-up-from-a-remote-ftp-box-to-a-windows-server

http://www.iperiusbackup.net/en/make-incremental-ftp-backup-upload-iperius/

http://www.duplicati.com/

Of course you're welcome to do whatever you want :p although the generally accepted rule of thumb for backups is to always have three locations or so. When shit hits the fan and some of your backups turn out to be messed up, you'll still have other sources to go to. It also helps you come back from a major failure faster: if I recall correctly, certain services limit download speed when you're pulling a backup, so a large amount of data could take hours (or even days) to download, whereas another backup location could recover it much faster.
 