Ask Slashdot: It's World Backup Day; How Do You Back Up? 304
MrSeb writes "Today is World Backup Day! The premise is that you back up your computers on March 31, so that you're not an April Fool if your hard drive crashes tomorrow. How do Slashdot users back up? RAID? Multiple RAIDs? If you're in LA, on a fault line, do you keep a redundant copy of your data in another geographic region?"
RAID is not a backup solution (Score:5, Insightful)
Simple. Redundancy backup.
Re:RAID is not a backup solution (Score:4, Funny)
Who needs RAID? My hard drive is so large that I just backup my files in a different directory :P
(Note: this is a joke, but sadly, many actually follow this "strategy")
Automated backup of NAS (Score:4, Insightful)
Re:Automated backup of NAS (Score:5, Funny)
Hello Kitty [google.com] USB flash drives.
Drop a bunch in the parking lot.
Use Google to get the data in a couple of days. Latency is a bit high, but hell, it's a backup.
Re: (Score:2)
As long as it is not the only strategy, it's actually a good one: It's easy enough to perform it very frequently, protects you against most user mistakes (accidentally overwriting an important file, for example), and allows quick access to the backup.
Re: (Score:2)
Re: (Score:2)
for "oops i deleted the file" or "oops, i replaced some paragraph of text with a dumber version when i was drunk" this is quite useful backup. Easy to do, easy to recover. Only hard to remember and not automated. But this can be automated as well ...
but then it's going to be something more advanced, like rsync ./ /backups/$(date +%Y-%m-%d)/ anyway. So you go from a stupid backup solution to inventing something good.
of course it still does not cover drive failure.
Re: (Score:3)
Take a look at rdiff-backup. It's easier to manage.
Re: (Score:3)
Simple. Redundancy backup.
Agreed 100%. Here is my backup setup:
First, my computer setup:
1. I have a home server that doubles as a NAS in the basement. It has a RAID-5 array for redundancy.
2. I have various other computers, including my own, my SO's and one server at a hosting company.
3. Documents usually sit on my NAS.
4. Documents for which I need offline access (or sharing with my SO) sit in a Dropbox.
5. Dropbox is also installed on my home server.
My backup plan:
1. My wife and myself have a "button" on our laptops to run the b
Re: (Score:2)
That sounds really complicated.
I just put my car in reverse and look over my shoulder to make sure I don't hit anything.
Re: (Score:3)
For what it's worth, Crashplan can handle a good deal of what you want to do without any hassle. Won't cost you anything either, other than the download/install time.
Comment removed (Score:5, Insightful)
If it's not off site it's not a backup ! (Score:3, Insightful)
It's a raid.
Re: (Score:3)
I use... (Score:2)
The backup gem (https://github.com/meskyanichi/backup [github.com]) + a dedicated server + some cron processes.
Re:I use... (Score:5, Funny)
Amateur. I take polaroids of my platters and store them in a safe deposit box.
Re:I use... (Score:5, Funny)
That's nothing. You should see my butterfly collection...
Time Machine (Score:5, Informative)
It backs up to a Netgear Readynas configured in RAID 5. Hourly, daily, weekly backups. I've never lost anything thanks to this great system.
In linux I try to approximate this with BackupPC.
http://backuppc.sourceforge.net/
It is really an excellent piece of software, though nowhere near as refined, of course. You pretty much only get daily backups, though, since the Linux kernel does not track filesystem changes, so hourly backups would be prohibitively expensive.
Comment removed (Score:5, Informative)
Re: (Score:3)
Re: (Score:2)
Watching a directory tree or mount point is possible with the new fanotify API. You define a mark -- from the looks of it, basically a pattern against which events are matched.
https://www.linuxquestions.org/questions/linux-kernel-70/fanotify-howto-recursive-mark-directory-changes-918231/ [linuxquestions.org]
https://lwn.net/Articles/360955/ [lwn.net]
https://lwn.net/Articles/339253/ [lwn.net]
This wasn't possible with inotify for an arbitrarily large number of files -- anyone who tried will remember inotify has to recursively scan and registers all f
Re: (Score:3)
An API like gimme_filesystem_changes_since_i_last_checked() is the kind of thing needed for a sane incremental backup program. I suspect Time Machine does something like this (don't know for sure, since I'm a Linux guy...)
The Darwin kernel API for this is pretty nice. It doesn't track individual file changes, but instead just notifies of changes to directory contents. The notifications are sent to fseventsd, which consolidates multiple changes to the same directory that happen in a short period of time, then logs each directory that changed. Apps (like TimeMachine) can then ask which directories saw changes during arbitrary time intervals. It's then on the app to figure out which files changed and in what way, so it does
Re:Time Machine (Score:5, Informative)
I'll second this. I use Time Machine too. I don't have any fancy NAS box for it (due to budget mostly) - I just use an external firewire disk right now, and it has been used once due to a full internal drive failure (restoring the iMac back to the state it was in an hour before the failure) as well as the occasional single file recovery.
Most backup systems work well for full system recovery - Time Machine is not unique there - but it's the single-file recovery tool that really makes it shine. It's very simple and intuitive to use.
It is totally "hands off" though - you have to trust that it actually is doing what you tell it to, beyond the menu item that gives you a summary of what it's up to (total being backed up at that moment, last backup time etc). It doesn't have a "show me a list of files backed up at x time" feature without the use of third party tools, so people who really want peace of mind may find that annoying.
Re: (Score:3)
Put in your disk, wipe the drive, make a new partition, and install by 'restoring from Time Machine'.
It will make your system how it was from the minute that backup was made, almost as if nothing changed.
As mentioned, the tmutil command line utility is great for finding out exactly what is in each backup, what changed(added and removed), what kind of size diff there is between two backups, etc.
You can also incorporate an old sparsebundle int
Re: (Score:2)
I've never lost anything thanks to this great system.
The real question is have you ever actually needed it?
I can say "I never drove my car into a volcano thanks to my awesome GPS!" but if I was never in danger of driving my car into lava, it would be a pretty pointless statement.
Re: (Score:2)
Time machine works fine. I've migrated machines with it, re-installed and restored from it, and retrieved old copies of files from months ago from it.
Time machine is one of THE features of OS X that linux peeps should be copying the shit out of, rather than trying to do some cheap rip off of Aqua as the next UI flavour of the month.
Re: (Score:2)
Windows has had Volume Shadow Copy since XP which does the same kind of thing, it just does not have a flashy interface.
Re: (Score:2)
Don't forget restore, is just as important. (Score:5, Informative)
Backup is only half the problem. Restore is the other half. And indeed that's where I've usually had the most problems. The third problem is validating the restore. You always worry that you are either going to overwrite something on the restore target or miss something on the restore source and end up in an inconsistent state.
Time Machine is revolutionary because it is so simple and seems to be almost flawless. I've had lots of backup systems over the years, including dump 0, but every one has been plagued with issues that arose when things were off normal. I've cobbled together all sorts of things like rsync and cpio, but the only thing that comes close to working as flawlessly as Time Machine is a NetApp.
At work, where I can control the remote servers securely on a closed network, I am able to use Time Machine for a remote backup. But at home I don't have a remote server I can target for the remote backup.
To do a remote backup at home I use Crashplan. I looked at a lot of competitors like Mozy but settled on Crashplan for two killer reasons. The giant problem with all these commercial backups is that while the incremental backups are simple over the net, the restore of a whole hard disk cannot be done over the net. You have to pay them to burn DVDs and send them to you. And that assumes you know what time period you want to recover.
Unlike all the other methods, Crashplan lets you pick a buddy who runs Crashplan, and then you can back up your disks to each other's computers. If you need to do a massive restore you just drive over to your buddy's house and pick up the drive, bring it home, and restore locally. This also solves the problem of the first dump being too large to send over the net. You do it locally, then drop the drive off at your buddy's.
Brilliant! Plus with Crashplan you pay for the app once, not monthly.
I've used it for years now; it works very well and is very easy to set up. All your files are encrypted so buddies can't read each other's drives.
The only flaw with Crashplan is that it runs in Java, so you have this instance of Java running 24/7, and not to put too fine a point on it: Java sucks. I don't know if it is Crashplan or other things that run in the Java VM, but over a week it bloats up to 600MB to 800MB. The workaround is to kill the Java VM every few days. Empirically, Crashplan is robust enough to survive this and restart. But that's a really awful solution.
Re:Time Machine (Score:5, Informative)
This.
I have (and still do) use all sorts of various systems for backups both at home and at work, and Time Machine is by far the best. Completely invisible, automatic and smart. You can turn off your computer mid-backup and it just continues when you turn it back on. It is so much better than the alternatives, I'm surprised how little limelight it gets.
Perhaps just as important as the backups: it has a great UI to access said backups! One click gives access to a file at any date in the past you want.
Re:Time Machine (Score:4, Informative)
I use it with an external USB drive and it has saved my butt a couple of times. Cases where I thought the focus was in a certain Nautilus window, then doing Shift-Delete + Enter very quickly, and a fraction of a second later realizing there was another Nautilus window with focus on some directory which is now nuked... As this is just a frontend to rsync and uses hard links, the backed-up files are available even without the backup program, as normal files within the directory structure on the backup media.
How Do You Back Up? (Score:5, Funny)
With a loud beeping noise.
Comment removed (Score:5, Insightful)
Re: (Score:2)
Multiple routes (Score:3)
I currently sync my files across three computers, each of which does a time machine backup. The files are also backed up via Jungledisk to Amazon S3. Occasionally I do full-disk images of things.
Files that would be inconvenient to lose, but which are not irreplaceable, are stored on a Drobo (redundant drive enclosure). This includes, for instance, my music library which could be reripped from CD.
Poorly. (Score:3)
Between 3 active computers I use, there's enough redundancy since they're rarely in the same place. SpiderOak manages absolutely, completely vital stuff (currently my thesis drafts).
But there's no real, constructive and useful pattern to it yet. The problem is less backups and more change management. Keeping copy-on-write sane on Windows is difficult, and migrating my server's XFS partition to ZFS is problematic since I need just tons of storage to do it, which I presently can't afford.
The issue is far less "backups" and more "making them meaningful". Backing up is useless if I overwrite the media with the important changes, or it takes forever to dissect a working copy of the data.
cron job (Score:5, Funny)
My weekly backups: something like:
0 0 * * 0 /home/me/backup.sh
#### backup.sh #####
cp -r /home/me/* /dev/null
I haven't missed a backup yet :-)
Re: (Score:2)
Goes pretty quick too...
I use (software) RAID-1 for my /home directory, and just reinstall the OS and apps. If my RAID totally dies I guess I'm mostly screwed, but for truly important stuff like tax returns I keep a copy on a different box (also w/ RAID /home) and a paper copy in a fire-resistant safe.
Re: (Score:2)
What do you need RAID for at home? Would it matter if your home computer was down for a few hours?
RAID is not for backups, it provides continuous operation if a hard drive goes down. I really doubt that many many home computers need this.
Since you already have 2 hard drives, a much better use for the second is to remove it from the RAID set, put it in an external enclosure and use it for backups. In an external enclosure,
Re:cron job (Score:4, Funny)
My weekly backups: something like:
0 0 * * 0 /home/me/backup.sh
#### backup.sh #####
cp -r /home/me/* /dev/null
You should make a restore.sh script to match this. Then test it...
Re: (Score:2)
Haha. I worked on a cluster that PXE-booted off the head node once. Would have been great to add it to the PXE script: 40 servers all installing Linux and cluster software, then /dev/null it all :-). "I don't understand, the systems go down for ~20 min, come up and then crash" :-)
Depends (Score:2)
For our personal computers, we use Time Machine - but manually triggered.
For the media server, I've got a second disk and a once-a-day cron job.
Offsite... well, okay we're not really there yet. So we're covered in case of hard drive failure, but not a catastrophic fire.
(I am assuming this question is about home, not about work)
shell script (Score:3)
I basically use this shell script once a week:
drive=/backup/drive
bpaths=/some/paths
for d in $bpaths ; do
  dout=`echo $d|sed -e "s/^.*\///"`
  echo Backing up $d as $dout
  ionice -c3 rm -f "$drive/bkup/$dout".*z
  ionice -c3 tar -c "$d" | gzip -c | ionice -c3 openssl aes-256-cbc -salt -out "$drive/bkup/$dout.tgz.aes" -pass pass:"WouldntYouLikeToKnow"
done
I then copy the data to my USB drive on my keychain if it's plugged in. (Hence the encryption.) I also have a scheduled task on my laptop to copy the data from my desktop the next day.
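For reference, the matching restore step for a tar-gzip-openssl pipeline like the one above would look something like this (a self-contained round trip in temp dirs; the password is the one shown in the script):

```shell
# Round-trip: pack and encrypt like the script above, then decrypt and unpack.
work=$(mktemp -d)
mkdir "$work/data"
echo "secret" > "$work/data/notes.txt"
tar -cf - -C "$work" data | gzip -c \
  | openssl aes-256-cbc -salt -out "$work/data.tgz.aes" -pass pass:"WouldntYouLikeToKnow"
mkdir "$work/restore"
openssl aes-256-cbc -d -in "$work/data.tgz.aes" -pass pass:"WouldntYouLikeToKnow" \
  | tar -xzf - -C "$work/restore"
cat "$work/restore/data/notes.txt"   # file survives the round trip
```

Worth scripting and testing in advance: an encrypted backup you have never decrypted is a backup you only think you have.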
Re:shell script (Score:5, Funny)
I just noticed I needed quotes around the bpaths variable assignment. Furthermore, my backup script has been broken since January!
Thanks, Slashdot, for making me look at my script!
Comment removed (Score:5, Informative)
Suspenders AND belt (Score:2)
My house is full of Macs, so I use Time Machine for on-site backup - each machine has its own Time Machine drive dedicated to it. Each machine also runs nightly image backups using SuperDuper onto yet other drives dedicated to that purpose.
All info is also backed up offsite. I use CrashPlan Pro, which backs up over the net to their servers somewhere in the American Midwest (Milwaukee?) - in the event of a fire or a giant sinkhole opening up under my house, I can get the full contents of all my computers shi
External Hard Drives (Score:2)
Let the government back up for you (Score:2)
Encrypt any important data that you don't want to lose, and keep it close to crazy terrorist rants in plain text files. With all the government snooping going on, in the interests of our security, the government will secretly make backup copies of it for you. Their experts will try to decrypt it, and be baffled by the message hidden in your wedding video.
In case of a disk crash, just ask the government politely to give you a backup copy of your data. They will kindly oblige.
Probably.
Backups are for the weak ! (Score:5, Insightful)
When you boot up in the morning and it takes a little longer than usual, the heart beats a little faster and you think "OMG, is the machine going to fail? My data will be gone". Or perhaps there's an electrical storm to liven your day up - "If that thunder gets any closer I might have to shut down the PC, but if lightning hits then everything's toast!".
These scenarios, and many others, all get the blood pumping in fear. If the computer
Try it - it's fun
BackupPC & CrashPlan+ for teh win (Score:2)
I use a combination of BackupPC and CrashPlan+ Family Unlimited to keep the data on all my systems safe. Works like a charm.
Tahoe LAFS (Score:5, Interesting)
I use a secure distributed grid. The software is an open source tool, Tahoe LAFS (http://tahoe-lafs.org [tahoe-lafs.org]). The grid is composed of ~15 servers contributed by different people all over the world. There are a half dozen servers in various locations in the US, about the same number in Europe, and the remainder in Russia and the Ukraine.
My files are AES256-encrypted on my machine, split into 13 pieces using Reed-Solomon coding, any five of which are sufficient to reconstruct my files, and then those 13 pieces are distributed to the servers in the grid. I run daily backups, but since uploads to the grid are idempotent, only the changed or new files are stored. I also run a bi-weekly "repair" operation which checks all of my files (all versions, from all backup runs) to see if any of their pieces are lost. If so, it reconstructs the missing pieces and deploys them to servers in the grid. The individual servers in the grid are fairly reliable, but problems do happen, so repair is important.
I get about 100 KBps net upload rate, so this isn't a good solution for backing up terabytes, and the occasional "surge" in my data generation (usually caused by a day of heavy photo-taking) often causes my "daily" backup to take a few days to run, but all in all it works very well.
Should my server ever die, I only need two pieces of information to get all of my data back: The grid "introducer" URL, which will allow me to set up a new node connected to the grid, and my root "dircap", which is a ~100-byte string containing the identifier and decryption key for the root directory of my archive. That directory contains the decryption keys for the files and directories it references.
Since this grid is all volunteer-based, the only cost to me for this backup solution is the hardware and bandwidth I provide to my grid (I provide 1 TB of disk and grid usage consumes a fairly small fraction of my Comcast connection), plus the time I spend administering my server and checking to see that my backup and repair processes are running. Oh, and I also contribute (a little) to the Tahoe LAFS project, but that's due to interest, not a requirement.
I'm very, very happy with this solution.
BTW, the grid could use another 20 nodes or so, if anyone is interested. There's a fair amount of trust required of new members to the grid, though, so it might take us a while to vet new members. The trust is required not because other members of the grid might have access to files that are not their own, but because we need to verify that new members will behave appropriately -- providing their fair share of storage and bandwidth, and not consuming too much.
Anyone interested should check out the grid's policies and philosophy at: http://bigpig.org/twiki/bin/view/Main/WebHome [bigpig.org]. If all of that looks good, join the mailing list, introduce yourself and we'll consider allowing you to join the grid.
Screw that (Score:2)
My daddy thought me good! (Score:2)
Multiple places (Score:2)
On top of that my most important text documents(not necessarily important in the way of having personal information, but a lot of work put into) are backed
Thumbdrive (Score:3)
Think through what you're backing up and why. For most people a thumbdrive should be sufficient for personal data; software can be reinstalled as needed. If you have more data than will fit on a thumbdrive you need to look at what's important.
Really large volumes of data are almost always static; usually music, eBooks, or video, which can just be backed up once on a DVD and put away. No need to keep copying that stuff over and over.
Backing up software projects is another issue. A remote versioning site is best. Working in Java you'll need all the space you can afford; for a language like Python an old floppy drive is sufficient.
RAID 5 + external hard drive (Score:2)
I use just a three-level hierarchy:
1. Photos and documents are on my RAID-5 array (4 × 1 TB Hitachi enterprise drives) in my desktop, backed up occasionally (every month or so) to a Toshiba 1 TB eSATA external drive sitting on my desk
2. Music, movies, TV shows, are on the RAID-5 array, not backed up
3. Windows and programs are on my 80 GB SSD, not backed up.
So I'm not protected at all against my house burning down, but this has worked for me for the past 10 years. (For my old system, which ran 2003
Areca (Score:2)
More important (Score:3, Insightful)
World backup day? How about world test your restore day? All the backups in the world don't mean anything unless you test your restores and know your data.
Re: (Score:3)
Re: (Score:2)
If your time and data are valuable enough, the cost of another drive is not a big deal, even post-Thailand flood.
The Tao of Backup (Score:2, Informative)
Killthre... I mean The Tao of Backup [taobackup.com]
My backup schedule (Score:2)
Mon, Wed, Fri: Ghost backup of the OS and C: drive to a separate drive
Tue, Thu: file copy of all required files (pics, MP3s, important data) to the same separate drive
Daily scheduled task that runs and will mirror said separate drive to another drive that is mounted as Y:. This drive is stored at work and is fully encrypted using TrueCrypt. I usually bring it home on Mondays, but if I forget, it will automatically run any other day.
rsnapshot (Score:5, Informative)
rsnapshot [rsnapshot.org] seems to work pretty well for incremental rsync'd backups for me. It uses hard links to maintain the older snapshots, to save on total filesystem usage. It can do rsync over ssh for backing up remote servers/pushing local vital data to a safe remote location.
Local backup server uses Linux software RAID for good measure (5x1TB RAID 5 + 10x2TB RAID 6).
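For anyone curious, a minimal rsnapshot configuration along these lines might look like the excerpt below. Paths, hostnames and retention counts are hypothetical; note that rsnapshot requires tab-separated fields, and older versions spell the `retain` keyword as `interval`.

```
# /etc/rsnapshot.conf excerpt (fields must be TAB-separated)
snapshot_root	/backups/snapshots/
retain	daily	7
retain	weekly	4
backup	/home/	localhost/
backup	root@server.example.com:/etc/	server/

# crontab entries driving the rotation (times are examples)
# 30 3 * * *	/usr/bin/rsnapshot daily
# 0  3 * * 1	/usr/bin/rsnapshot weekly
```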
Backups - and It's also Earth Hour .. Day. (Score:2)
It's also the day of Earth Hour - a day on which, for one hour, people around the world turn off the lights.
http://en.wikipedia.org/wiki/Earth_Hour [wikipedia.org]
I wonder how many here personally partake.
-----
As for backing up goes, I'll just re-paste here what I said in a recent other Ask Slashdot question:
It's not really 'managing' my data.. it's a storage/backup solution. The difference is that if I 'managed' my data, I wouldn't have tens of thousands of digital camera photos in a bunch of folders with meaningless name
Re: (Score:2)
Backing up as we speak (Score:2)
Ironically, a hard drive in my NAS died two days ago, so I have had to do a rebuild, and as a result I bought a new external hard drive that is big enough for all the data, even the easy-to-replace data. So currently my NAS, which rebuilt the array successfully, is copying all of its data to the new external hard drive.
Free Backups! (Score:2)
Chinese espionage hackers do it all for us free. They copy our stuff over to their side. It's as off-site as you can get.
RSync/SSHFS/EncFS (Score:2)
For laptops, I use a scheduled rsync to a central server mounted using sshfs. For offsite, an rsync to an EncFS filesystem on a portable drive. If bandwidth limits ever get reasonable, I'll switch to using DropBox or SpiderOak, but the bandwidth limitations remove that as a solution for all but important data.
AMANDA and Windows Backup (Score:3)
I have two systems I use.
For my servers, I use AMANDA [amanda.org] with encrypted virtual tapes to do nightly backups. Shortly after the backups run, cron calls a shell script in order to copy the virtual tapes to an offsite location via rsync.
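A local stand-in for that offsite-copy step (in practice the target would be an ssh host and the paths real vtape directories; here everything is a temp dir):

```shell
# Mirror a vtape directory; --delete makes the offsite copy track removals too.
vtapes=$(mktemp -d)                        # stand-in for the local vtape dir
offsite=$(mktemp -d)                       # stand-in for the remote target
touch "$vtapes/vtape0" "$vtapes/vtape1"
rsync -a --delete "$vtapes/" "$offsite/"
rm "$vtapes/vtape0"                        # a tape expires locally...
rsync -a --delete "$vtapes/" "$offsite/"   # ...and --delete drops it offsite too
ls "$offsite"
```

Whether you want --delete on an offsite copy is a judgment call: it keeps the mirror exact, but it also means a local deletion propagates.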
For my desktop PC, I don't need to back up as often, so I do a weekly backup via Windows Backup to a TrueCrypt volume on an external hard drive. When it's not being used to back up my PC, I keep the external hard drive at my office. I figure if something happens where both my office and home are destroyed, then at that moment I've got bigger problems to worry about than my data. :-)
Just my $.02...
Rolling 11's fwiw.. (Score:2)
mozy (Score:2)
very little is worth backing up. That which is gets uploaded.
time machine + rsync = win win (Score:2)
I've been using rsync to back up a half dozen machines here for some time now. Great for both local and remote (internet) backups. My mom's iMac is 300 miles away. She has a little 500GB FireWire drive locally backing up with Time Machine, which gives her instant recovery as well as application-specific recalls (get that email back or that address book card back easily, for example) as well as versions. Then a custom-made cron job to rsync to my server here runs nightly, for offsite in case of fire/theft/etc.
I just don't backup (Score:2)
Seriously, who cares about what happened 5 years ago?
When did you last look at that porn clip you downloaded in 2002? How many emails do you really need to keep for more than a month? Would Battlezone 2 even run anymore if you tried it?
When did you last get audited by the IRS? Screw that, my accountant should have copies anyway, and most info was electronically reported anyway.
Between the cloud, my iPhone, 4 synced home boxes, a drop box for some important stuff, Google, I can recover anything non-trivial.
Reverse (Score:2)
Time Machine - it just works (Score:2)
Time Machine to a 4 TB drive on my main computer. For my wife's machine and my laptops, Time Machine to a 2TB Time Capsule. Haters can hate, it just works.
3 layers (Score:2)
Rsync to a local Drobo with Drobo RAID, a Local fiber channel array with RAID 5, and to a remote NAS with RAID 5, every night.
For Linux - Back in Time (Score:2)
It's like the Linux version of Flyback or Timevault.
http://backintime.le-web.org/ [le-web.org]
Multiple HDs w/Limited Data (Score:2)
For a single computer only, it can be fairly inexpensive with 2-3 TB HDs.
1) Clone the entire working drive once every one to two weeks, so I can go back to a working OS if the OS is corrupted, as it takes a day to reinstall the OS, apps & utilities and migrate data back. Hence, this is a recovery point for the entire HD at whatever interval one picks.
With cheap 2-3 TB hard drives, one could actually clone the HD every day for a lot of people & then overwrite such that you had a complete 2 weeks of cl
bup + ssh (Score:2)
Earlier I used backup2l to first make a local backup and then rsync it to a server. The only problem was that it wasted disk space on each host, especially laptops.
Recently I moved to bup [github.com], which provides more efficient backups with very small local storage. Now every laptop, desktop and my email server (all running either Debian or Ubuntu) has this in /etc/cron.daily/bup-backup:
#!/bin/sh
echo Backup starting at $(hostname) $(date)
bup index -u /var/mail /home /var/lib/mysql
bup save -r backups.example.com: -n $(hostn
Re: (Score:2)
And of course I forgot the configuration at the server end; in /home/bupups/.ssh/authorized_keys there is a line for each host:
command="BUP_DEBUG=0 BUP_FORCE_TTY=3 bup server",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-rsa AAAAB3NzaC...
To a time capsule and external drive. (Score:2)
DD Backup - Fastest Recovery in Town (Score:3)
My backup for a multi-boot laptop that other solutions (e.g.: running from one OS) don't seem to work for:
1) Buy a second copy of your main hard-drive + USB Interface (SATA enclosure)
2) Boot Linux on computer using CD
3) Use dd to mirror the entire HD to the external HD. Run it before you go to bed, set it up to shut down when done. Save stdout/stderr somewhere like a USB flash drive.
4) Wake up to a backup.
The advantage of this is that when your hard drive fails, recovery is about 60 seconds away. Swap out one hard drive and you are done. Or you can recover specific files by just using the backup HD like a normal external HD, since everything is just under normal filesystems. If you'll be away on business for a while, take your second hard drive with you (try to store it somewhere it won't get stolen with the laptop).
I actually keep two mirrors, partially because of travel and wanting to have one backup with me. This also makes sure that if your computer fails half-way through doing the mirror due to a power surge it doesn't fry your original + mirror. Keep one at a friends house or similar.
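A toy version of the dd mirror step, using ordinary files instead of block devices (real use would point if=/of= at /dev/sdX devices from the live CD):

```shell
# Byte-for-byte copy of a "disk" image, then verify the mirror matches.
disk=$(mktemp)                    # stand-in for the source drive
mirror=$(mktemp)                  # stand-in for the external drive
printf 'bootsector+filesystem+data' > "$disk"
dd if="$disk" of="$mirror" bs=1M 2>/dev/null   # dd reports stats on stderr
cmp -s "$disk" "$mirror" && echo "identical"
```

The cmp check is the part worth keeping in a real script: a mirror you never verify can fail silently, exactly the half-written-image case the comment warns about.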
btrfs RAID 10 and time machine (Score:2)
I'd like to get time machine going to the ubuntu server as well but so far I'm just using external us
Oh, God (Score:2)
The premise is that you back up your computers on March 31, so that you're not an April Fool if your hard drive crashes tomorrow.
Thanks for reminding me to stay away tomorrow.
git (Score:2)
Everything worth backing up that I am working on is pushed to a git repository. The git repositories are synchronized manually (but pretty much after each important update) across about 5 machines in 3 physical locations (home, work, computing center). Though they are all in the same city.
I don't think I will lose anything important anytime soon. Or if I do, I think I'll have more important concerns...
Simple. I insert the key and push in the clutch (Score:2)
Three Locations, Two Countries (Score:2)
My Linux servers back up automatically with a homebrew Time Machine-like functionality based on rsync. It consists of a script and a configuration file with information of what files and directories to backup/not backup. The structure is fairly simple and has worked well for a very long time now.
The mail servers back u
With rsync and faubackup (Score:2)
time machine.... (Score:2)
Two copies (Score:2)
#1:
rsync -aihxv --inplace --delete root@some-host:/ /backups/@some-host
btrfs subvolume snapshot /backups/@some-host /backups/@some-host-`date +%F--%T`
btrfs subvolume delete /backups/@some-host-(date and time from some cycles ago)
(repeat for all hosts)
#2: Crashplan.com
rdiff-backup (Score:3)
I'm surprised rdiff-backup [nongnu.org] hasn't been mentioned yet. It's a very nice piece of software, does incremental backups, and is easy to automate.
New software: Obnam (Score:3)
I'm very excited (about backup software?) about this new backup program from an old buddy of Linus Torvalds':
http://liw.fi/obnam/ [liw.fi]
It seems like it will be the most featureful, forward-thinking backup software, ever: deduplication across multiple clients, compression, and strong encryption for untrusted storage. He's very keen on unit tests, too, and it has good verify and restore functions. I'm already using it for some things. GPL, of course, so no proprietary lock-in, ever.
Windows Home Server 2011 (Score:3)
Re: (Score:2)
Re: (Score:2)
So my question is: what limit does WinZip have for splitting large files over many disks?
RDiff local, fireproof lockbox in other building (Score:2)
I do rdiff-backup of plain files and cp --sparse=always of iSCSI shares and VM images to internal SATA drives in an eSATA cradle. Those drives are stored in a fire- and waterproof lockbox in our detached workshop. Given our high-ground location (flooding is very unlikely, so are landslides; no underground mines and low sinkhole probability), anything bad enough to destroy the computers in the house and render the contents of the lockbox unusable is probably bad enough that I don't care about the data.
At wor
Re: (Score:2)
I quit using rdiff-backup when I discovered that all my backups had been failing for a while because rdiff-backup did not like a filename (I think it was very long) and, instead of skipping the file and continuing (or even handling the odd filename), it had been hanging at that point. I consider that a giant FAIL for a backup system.
rdiff-backup wasn't doing anything that I could not do with rsync and some scripting anyway, so, what's the point?
Re:ZFS (Score:4, Informative)
Indeed it is not as efficient as it could be. However, using it is only slightly more complicated than "buy a USB hard drive and plug it into the computer".
An efficient, totally ideal process that no one actually bothers to use because it's either too complicated, or because it isn't actually licensed for your platform or whatever, is no backup system at all.
Also, ZFS is a filesystem that can be set up to preserve version information. It's not a backup while it's on the same disk....
Re:ZFS (Score:4, Interesting)
Re: (Score:2)
Another vote for Crashplan. I have the family plan. It's Linux friendly. There are no data limits. It's fairly cheap for what it is (I think that I can backup all of the computers in my "family" for $200 for two years).
But it isn't enough.
My semi-paranoid backup plan (and yes, I know that RAID is not backup):