Ask Slashdot: It's World Backup Day; How Do You Back Up?

MrSeb writes "Today is World Backup Day! The premise is that you back up your computers on March 31, so that you're not an April Fool if your hard drive crashes tomorrow. How do Slashdot users back up? RAID? Multiple RAIDs? If you're in LA, on a fault line, do you keep a redundant copy of your data in another geographic region?"
  • by M1FCJ ( 586251 ) on Saturday March 31, 2012 @11:46AM (#39534405) Homepage

    Simple. Redundancy backup.

    • by Anonymous Coward on Saturday March 31, 2012 @11:50AM (#39534443)

      Who needs RAID? My hard drive is so large that I just back up my files to a different directory :P

      (Note: this is a joke, but sadly, many actually follow this "strategy")

      • by AliasMarlowe ( 1042386 ) on Saturday March 31, 2012 @12:10PM (#39534583) Journal
        All of our important files (even the kids' files) are on the server [synology.com]. It backs itself up automatically 3 times per week to external USB drives. I rotate the USB backup drives every few weeks. So we need do nothing special today, as the backup works fine.
      • As long as it is not the only strategy, it's actually a good one: It's easy enough to perform it very frequently, protects you against most user mistakes (accidentally overwriting an important file, for example), and allows quick access to the backup.

        • by TheLink ( 130905 )
          Yeah, and it's better than nothing - quite often conventional hard drives don't fail catastrophically, so the backups on the same drive might still be accessible. Don't count on this, though.
      • by allo ( 1728082 )

        for "oops i deleted the file" or "oops, i replaced some paragraph of text with a dumber version when i was drunk" this is quite useful backup. Easy to do, easy to recover. Only hard to remember and not automated. But this can be automated as well ...

        but then its going to be something more advanced, as rsync ./ /backups/$(date +%Y-%m-%d)/ anyway. so you got from stupid backup solution to inventing something good.

        of course it still does not cover drive failure.
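
        A minimal sketch of that automated version (hedged: the source and destination paths are placeholders), using rsync's --link-dest so unchanged files are hard-linked against the previous snapshot instead of copied again:

        #!/bin/sh
        # Hypothetical dated-snapshot backup; SRC and DEST are placeholders.
        SRC="$HOME/"
        DEST=/backups
        TODAY=$(date +%Y-%m-%d)
        # Find the most recent earlier snapshot, if there is one
        LAST=$(ls -1d "$DEST"/????-??-?? 2>/dev/null | tail -n 1)
        if [ -n "$LAST" ]; then
                rsync -a --delete --link-dest="$LAST" "$SRC" "$DEST/$TODAY/"
        else
                rsync -a "$SRC" "$DEST/$TODAY/"
        fi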

    • by Pieroxy ( 222434 )

      Simple. Redundancy backup.

      Agreed 100%. Here is my backup setup:

      First, my computer setup:
      1. I have a home server that doubles as a NAS in the basement. It has a RAID-5 array for redundancy.
      2. I have various other computers, including my own, my SO's and one server at a hosting company.
      3. Documents usually sit on my NAS.
      4. Documents for which I need offline access (or which I share with my SO) sit in a Dropbox.
      5. Dropbox is also installed on my home server.

      My backup plan:
      1. My wife and I have a "button" on our laptops to run the b

      • That sounds really complicated.

        I just put my car in reverse and look over my shoulder to make sure I don't hit anything.

      • by IANAAC ( 692242 )
        Sounds like you like playing the role of sysadmin.

        For what it's worth, Crashplan can handle a good deal of what you want to do without any hassle. Won't cost you anything either, other than the download/install time.

  • by Anonymous Coward on Saturday March 31, 2012 @11:46AM (#39534409)

    It's a raid.

  • The backup gem (https://github.com/meskyanichi/backup [github.com]) + a dedicated server + some cron jobs.

  • Time Machine (Score:5, Informative)

    by anethema ( 99553 ) on Saturday March 31, 2012 @11:48AM (#39534423) Homepage
    Apple hate aside, Time Machine is an amazingly good backup system.

    It backs up to a Netgear ReadyNAS configured in RAID 5. Hourly, daily, weekly backups. I've never lost anything thanks to this great system.

    In linux I try to approximate this with BackupPC.

    http://backuppc.sourceforge.net/

    It is really an excellent piece of software, though nowhere near as refined, of course. You pretty much only get daily backups, though, since the Linux kernel does not track filesystem changes, so hourly backups would be prohibitively expensive.
    • Re:Time Machine (Score:5, Informative)

      by jo_ham ( 604554 ) <joham999 AT gmail DOT com> on Saturday March 31, 2012 @12:11PM (#39534589)

      I'll second this. I use Time Machine too. I don't have any fancy NAS box for it (due to budget mostly) - I just use an external firewire disk right now, and it has been used once due to a full internal drive failure (restoring the iMac back to the state it was in an hour before the failure) as well as the occasional single file recovery.

      Most back up systems work well for full system recovery - Time Machine is not unique there - but it's the single file recovery tool that really makes it shine. It's very simple and intuitive to use.

      It is totally "hands off" though - you have to trust that it actually is doing what you tell it to, beyond the menu item that gives you a summary of what it's up to (total being backed up at that moment, last backup time etc). It doesn't have a "show me a list of files backed up at x time" feature without the use of third party tools, so people who really want peace of mind may find that annoying.

      • by anethema ( 99553 )
        I actually find full system recovery another place where TM kicks ass.

        Put in your disk, wipe the drive, make a new partition, and install by 'restoring from Time Machine'.

        It will make your system how it was at the minute that backup was made, almost as if nothing had changed.

        As mentioned, the tmutil command line utility is great for finding out exactly what is in each backup, what changed (added and removed), what kind of size diff there is between two backups, etc.

        You can also incorporate an old sparsebundle int
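
        For reference, a hedged sketch of that kind of tmutil inspection (the volume and snapshot names here are made-up examples):

        tmutil listbackups
        tmutil compare /Volumes/TM/Backups.backupdb/mymac/2012-03-30-090000 \
                /Volumes/TM/Backups.backupdb/mymac/2012-03-31-090000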
    • I've never lost anything thanks to this great system.

      The real question is have you ever actually needed it?

      I can say "I never drove my car into a volcano thanks to my awesome GPS!" but if I was never in danger of driving my car into lava, it would be a pretty pointless statement.

      • by smash ( 1351 )

        Time Machine works fine. I've migrated machines with it, re-installed and restored from it, and retrieved months-old copies of files from it.

        Time Machine is one of THE features of OS X that Linux peeps should be copying the shit out of, rather than trying to do some cheap rip-off of Aqua as the next UI flavour of the month.

    • Windows has had Volume Shadow Copy since XP, which does the same kind of thing; it just does not have a flashy interface.

      • by smash ( 1351 )
        Except it kinda doesn't work anywhere near as well, and doesn't automatically snapshot to a different location.
    • by goombah99 ( 560566 ) on Saturday March 31, 2012 @12:21PM (#39534671)

      Backup is only half the problem. Restore is the other half. And indeed that's where I've usually had the most problems. The third problem is validating the restore. You always worry that you are either going to overwrite something on the restore target or miss something on the restore source and end up in an inconsistent state.

      Time Machine is revolutionary because it is so simple and seems to be almost flawless. I've had lots of backup systems over the years, including dump 0, but every one has been plagued with issues that arose when things were off normal. I've cobbled together all sorts of things like rsync and cpio, but the only thing that comes close to working as flawlessly as Time Machine is a NetApp.

      At work, where I can control the remote servers securely on a closed network, I am able to use Time Machine for a remote backup. But at home I don't have a remote server I can target for the remote backup.

      To do a remote backup at home I use CrashPlan. I looked at a lot of competitors like Mozy, but settled on CrashPlan for two killer reasons. The giant problem with all these commercial backups is that while the incremental backups are simple over the net, the restore of a whole hard disk cannot be done over the net. You have to pay them to burn DVDs and send them to you. And that assumes you know what time period you want to recover.

      Unlike all the other methods, CrashPlan lets you pick a buddy who runs CrashPlan, and then you can back up your disks to each other's computers. If you need to do a massive restore, you just drive over to your buddy's house, pick up the drive, bring it home, and restore locally. This also solves the problem of the first dump being too large to send over the net. You do it locally, then drop the drive off at your buddy's.

      Brilliant! Plus, with CrashPlan you pay for the app once, not monthly.

      I've used it for years now; it works very well and is very easy to set up. All your files are encrypted, so buddies can't read each other's drives.

      The only flaw with CrashPlan is that it runs in Java, so you have this instance of Java running 24/7, and not to put too fine a point on it: Java sucks. I don't know if it is CrashPlan or other things that run in the Java VM, but over a week it bloats up to 600-800 MB. The workaround is to kill the Java VM every few days; empirically, CrashPlan is robust enough to survive this and restart. But that's a really awful solution.

    • Re:Time Machine (Score:5, Informative)

      by digitallife ( 805599 ) on Saturday March 31, 2012 @12:26PM (#39534717)

      This.
      I have used (and still do use) all sorts of systems for backups, both at home and at work, and Time Machine is by far the best. Completely invisible, automatic, and smart. You can turn off your computer mid-backup and it just continues when you turn it back on. It is so much better than the alternatives, I'm surprised how little limelight it gets.

      Perhaps just as important as the backups: it has a great UI to access said backups! One click gives access to a file at any date in the past you want.

    • Re:Time Machine (Score:4, Informative)

      by japa ( 28571 ) on Saturday March 31, 2012 @01:03PM (#39534985)
      Back In Time [le-web.org] is a simple backup tool for Linux inspired by the "flyback project" and "TimeVault". The backup is done by taking snapshots of a specified set of directories.

      I use it with an external USB drive and it has saved my butt a couple of times. Cases where I thought the focus was in a certain Nautilus window, then did Shift-Delete + Enter in very quick fashion, and a fraction of a second later realized there was another Nautilus window with focus on some directory, which is now nuked... As this is just a frontend to rsync and uses hard links, there is the advantage that the backed-up files are available even without the backup program, as normal files within the directory structure on the backup media.

  • by mooingyak ( 720677 ) on Saturday March 31, 2012 @11:48AM (#39534425)

    With a loud beeping noise.

  • by Alan Shutko ( 5101 ) on Saturday March 31, 2012 @11:50AM (#39534445) Homepage

    I currently sync my files across three computers, each of which does a time machine backup. The files are also backed up via Jungledisk to Amazon S3. Occasionally I do full-disk images of things.

    Files that would be inconvenient to lose, but which are not irreplaceable, are stored on a Drobo (redundant drive enclosure). This includes, for instance, my music library which could be reripped from CD.

  • by Electricity Likes Me ( 1098643 ) on Saturday March 31, 2012 @11:52AM (#39534455)

    Between 3 active computers I use, there's enough redundancy since they're rarely in the same place. SpiderOak manages absolutely, completely vital stuff (currently my thesis drafts).

    But there's no real, constructive and useful pattern to it yet. The problem is less backups and more change management. Keeping copy-on-write sane on Windows is difficult, and migrating my server's XFS partition to ZFS is problematic, since I need tons of storage to do it, which I presently can't afford.

    The issue is far less "backups" and more "making them meaningful". Backing up is useless if I overwrite the media with the important changes, or it takes forever to dissect a working copy of the data.

  • cron job (Score:5, Funny)

    by ILongForDarkness ( 1134931 ) on Saturday March 31, 2012 @11:53AM (#39534459)

    My weekly backups: something like:

    0 0 * * 0 /home/me/backup.sh

    #### backup.sh #####
    cp -r /home/me/* /dev/null

    I haven't missed a backup yet :-)

    • Goes pretty quick too...

      I use (software) RAID-1 for my /home directory, just reinstall the OS and apps. If my RAID totally dies I guess I'm mostly screwed, but for truly important stuff like tax returns I keep a copy on a different box (also w/ RAID /home) and a paper copy in a fire-resistant safe.
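
      For anyone curious, a hedged sketch of building that kind of software RAID-1 mirror on Linux with mdadm (device names are placeholders, and this destroys whatever is on those partitions):

      # Create the mirror from two empty partitions, then put a filesystem on it
      mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda2 /dev/sdb2
      mkfs.ext4 /dev/md0
      mount /dev/md0 /home
      # Watch the initial sync progress
      cat /proc/mdstat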

      • I use (software) RAID-1 for my /home directory, just reinstall the OS and apps

        What do you need RAID for at home? Would it matter if your home computer was down for a few hours?

        RAID is not for backups; it provides continuous operation if a hard drive goes down. I really doubt that many home computers need this.

        Since you already have 2 hard drives, a much better use for the second is to remove it from the RAID set, put it in an external enclosure and use it for backups. In an external enclosure,

    • Re:cron job (Score:4, Funny)

      by AliasMarlowe ( 1042386 ) on Saturday March 31, 2012 @12:27PM (#39534723) Journal

      My weekly backups: something like:
      0 0 * * 0 /home/me/backup.sh

      #### backup.sh #####
      cp -r /home/me/* /dev/null

      You should make a restore.sh script to match this. Then test it...

      • Haha. I worked on a cluster that PXE-booted off the head node once. It would have been great to add it to the PXE script: 40 servers all installing Linux and cluster software, then /dev/null it all :-). "I don't understand, the systems go down for ~20 min, come up, and then crash" :-)

  • For our personal computers, we use Time Machine - but manually triggered.

    For the media server, I've got a second disk and a once-a-day cron job.
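
    The once-a-day job can be as small as a single cron line; a hedged sketch, with placeholder paths:

    # /etc/cron.d/media-backup: mirror the media library to the second disk nightly
    30 3 * * * root rsync -a --delete /srv/media/ /mnt/backupdisk/media/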

    Offsite... well, okay we're not really there yet. So we're covered in case of hard drive failure, but not a catastrophic fire.

    (I am assuming this question is about home, not about work)

  • by Ken_g6 ( 775014 ) on Saturday March 31, 2012 @12:01PM (#39534509) Homepage

    I basically use this shell script once a week:


    #!/bin/sh
    drive=/backup/drive
    bpaths="/some/paths /to/backup"    # space-separated list; quoted so the assignment is valid
    for d in $bpaths ; do
            dout=`basename "$d"`       # archive name = last path component
            echo "Backing up $d as $dout"
            ionice -c3 rm -f "$drive/bkup/$dout.tgz.aes"    # the old glob was inside quotes and never matched
            ionice -c3 tar -c "$d" | gzip -c | ionice -c3 openssl aes-256-cbc -salt -out "$drive/bkup/$dout.tgz.aes" -pass pass:"WouldntYouLikeToKnow"
    done

    I then copy the data to my USB drive on my keychain if it's plugged in. (Hence the encryption.) I also have a scheduled task on my laptop to copy the data from my desktop the next day.
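
    The matching restore is just decrypt, then unpack; a hedged sketch using the same placeholder names as above:

    # Decrypt the archive and extract the gzipped tar from stdin
    openssl aes-256-cbc -d -in "$drive/bkup/$dout.tgz.aes" -pass pass:"WouldntYouLikeToKnow" | tar -xzf -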

  • My house is full of Macs, so I use Time Machine for on-site backup - each machine has its own Time Machine drive dedicated to it. Each machine also runs nightly image backups using SuperDuper onto yet other drives dedicated to that purpose.

    All info is also backed up offsite. I use CrashPlan Pro, which backs up over the net to their servers somewhere in the American Midwest (Milwaukee?) - in the event of a fire or a giant sinkhole opening up under my house, I can get the full contents of all my computers shi

  • With two copies for redundancy. Don't use cloud solutions because I don't trust them. Well, unless you count mailing shit to myself and using Gmail as a cloud backup solution, anyway.
  • Encrypt any important data that you don't want to lose, and keep it close to crazy terrorist rants in plain text files. With all the government snooping going on, in the interests of our security, the government will secretly make backup copies of it for you. Their experts will try to decrypt it, and be baffled by the message hidden in your wedding video.

    In case of a disk crash, just ask the government politely to give you a backup copy of your data. They will kindly oblige.

    Probably.

  • by captain_dope_pants ( 842414 ) on Saturday March 31, 2012 @12:04PM (#39534529)
    Live on the edge guys...

    When you boot up in the morning and it takes a little longer than usual, the heart beats a little faster and you think "OMG is the machine going to fail? My data will be gone". Or perhaps there's an electrical storm to liven your day up - "If that thunder gets any closer I might have to shut down the PC, but if lightning hits then everything's toast !".

    These scenarios, and many others, all get the blood pumping in fear. If the computer /does/ boot or you /don't/ get toasted by bolts of electricity then the sense of relief is wonderful !

    Try it - it's fun ;-)
  • I use a combination of BackupPC and CrashPlan+ Family Unlimited to keep the data on all my systems safe. Works like a charm.

  • Tahoe LAFS (Score:5, Interesting)

    by swillden ( 191260 ) <shawn-ds@willden.org> on Saturday March 31, 2012 @12:07PM (#39534551) Journal

    I use a secure distributed grid. The software is an open source tool, Tahoe LAFS (http://tahoe-lafs.org [tahoe-lafs.org]). The grid is composed of ~15 servers contributed by different people all over the world. There are a half dozen servers in various locations in the US, about the same number in Europe, and the remainder in Russia and Ukraine.

    My files are AES256-encrypted on my machine, split into 13 pieces using Reed-Solomon coding, any five of which are sufficient to reconstruct my files, and then those 13 pieces are distributed to the servers in the grid. I run daily backups, but since uploads to the grid are idempotent, only the changed or new files are stored. I also run a bi-weekly "repair" operation which checks all of my files (all versions, from all backup runs) to see if any of their pieces are lost. If so, it reconstructs the missing pieces and deploys them to servers in the grid. The individual servers in the grid are fairly reliable, but problems do happen, so repair is important.

    I get about 100 KBps net upload rate, so this isn't a good solution for backing up terabytes, and the occasional "surge" in my data generation (usually caused by a day of heavy photo-taking) often causes my "daily" backup to take a few days to run, but all in all it works very well.

    Should my server ever die, I only need two pieces of information to get all of my data back: The grid "introducer" URL, which will allow me to set up a new node connected to the grid, and my root "dircap", which is a ~100-byte string containing the identifier and decryption key for the root directory of my archive. That directory contains the decryption keys for the files and directories it references.

    Since this grid is all volunteer-based, the only cost to me for this backup solution is the hardware and bandwidth I provide to my grid (I provide 1 TB of disk and grid usage consumes a fairly small fraction of my Comcast connection), plus the time I spend administering my server and checking to see that my backup and repair processes are running. Oh, and I also contribute (a little) to the Tahoe LAFS project, but that's due to interest, not a requirement.

    I'm very, very happy with this solution.

    BTW, the grid could use another 20 nodes or so, if anyone is interested. There's a fair amount of trust required of new members to the grid, though, so it might take us a while to vet new members. The trust is required not because other members of the grid might have access to files that are not their own, but because we need to verify that new members will behave appropriately -- providing their fair share of storage and bandwidth, and not consuming too much.

    Anyone interested should check out the grid's policies and philosophy at: http://bigpig.org/twiki/bin/view/Main/WebHome [bigpig.org]. If all of that looks good, join the mailing list, introduce yourself and we'll consider allowing you to join the grid.
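
    For the curious, day-to-day use looks roughly like this; a hedged sketch, assuming the node is already configured with the grid's introducer URL, and with a placeholder alias name:

    tahoe create-alias backups          # bind an alias to a new directory cap
    tahoe backup ~/Documents backups:   # idempotent backup run; unchanged files are not re-uploaded
    tahoe deep-check --repair backups:  # verify shares and regenerate any missing pieces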

  • I am far more interested in World IPv6 day. We need to get moving over to that.
  • Each of my computers has a dedicated external hard drive where I push backups on a regular basis. I have one external hard drive stashed away at my parents' where I make a backup of my laptop every time I visit them. My laptop pretty much has all of my important data, so it serves its purpose. All the hard drives involved in this are encrypted, of course.

    On top of that my most important text documents(not necessarily important in the way of having personal information, but a lot of work put into) are backed
  • by tomhath ( 637240 ) on Saturday March 31, 2012 @12:08PM (#39534569)

    Think through what you're backing up and why. For most people a thumbdrive should be sufficient for personal data; software can be reinstalled as needed. If you have more data than will fit on a thumbdrive you need to look at what's important.

    Really large volumes of data are almost always static: usually music, e-books, or video, which can just be backed up once to a DVD and put away. No need to keep copying that stuff over and over.

    Backing up software projects is another issue. A remote versioning site is best. Working in Java you'll need all the space you can afford; for a language like Python an old floppy drive is sufficient.

  • I use just a three-level hierarchy:

    1. Photos and documents are on my RAID-5 array (4 × 1 TB Hitachi enterprise drives) in my desktop, backed up occasionally (every month or so) to a Toshiba 1 TB eSATA external drive sitting on my desk

    2. Music, movies, TV shows, are on the RAID-5 array, not backed up

    3. Windows and programs are on my 80 GB SSD, not backed up.

    So I'm not protected at all against my house burning down, but this has worked for me for the past 10 years. (For my old system, which ran 2003

  • I really like Areca Backup. It has a fairly straightforward GUI and you can easily back up groups of files to different backup locations or media. If you run a differential or incremental backup, the GUI presents a "logical view" of that backup against the last full backup or series of backups. Now, if only I could find some easy way to tag and organize 20,000 mp3s...
  • More important (Score:3, Insightful)

    by elrick_the_brave ( 160509 ) on Saturday March 31, 2012 @12:13PM (#39534615)

    World backup day? How about world test your restore day? All the backups in the world don't mean anything unless you test your restores and know your data.

    • So, how does a home user test the restore? Seems like chicken-and-egg. To test the restore I need to wipe the drive. To wipe the drive I better have a working backup. To know I've got a working backup I need to... test the restore?
      • by TheLink ( 130905 )
        No, all you have to do is buy another drive and restore onto that drive.

        If your time and data are valuable enough, the cost of another drive is not a big deal, even post-Thailand-flood.
  • The Tao of Backup (Score:2, Informative)

    by Anonymous Coward

    Killthre... I mean The Tao of Backup [taobackup.com]

  • Mon, Wed, Fri: Ghost backup of the OS and C: drive to a separate drive.
    Tue, Thu: file copy of all required files (pics, MP3s, important data) to the same separate drive.
    A daily scheduled task mirrors said separate drive to another drive mounted as Y:. That drive is stored at work and is fully encrypted using TrueCrypt. I usually bring it home on Mondays, but if I forget, the task will automatically run on any other day.

  • rsnapshot (Score:5, Informative)

    by Wagoo ( 260866 ) <wagooNO@SPAMdal.net> on Saturday March 31, 2012 @12:19PM (#39534661)

    rsnapshot [rsnapshot.org] seems to work pretty well for incremental rsync'd backups for me. It uses hard links to maintain the older snapshots, which saves on total filesystem usage. It can do rsync over ssh for backing up remote servers/pushing local vital data to a safe remote location.

    Local backup server uses Linux software RAID for good measure (5x1TB RAID 5 + 10x2TB RAID 6).
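
    A hedged sketch of the corresponding /etc/rsnapshot.conf pieces (fields must be separated by actual tabs, and the paths are placeholders), plus the cron entries that drive the rotation:

    snapshot_root   /backup/snapshots/
    retain  hourly  6
    retain  daily   7
    backup  /home/  localhost/
    backup  root@server.example.com:/etc/    remote/

    # crontab entries
    0 */4 * * *   /usr/bin/rsnapshot hourly
    30 3 * * *    /usr/bin/rsnapshot daily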

  • It's also the day of Earth Hour - a day on which, for one hour, people around the world turn off the lights.
    http://en.wikipedia.org/wiki/Earth_Hour [wikipedia.org]
    I wonder how many here personally partake.

    -----

    As far as backing up goes, I'll just re-paste here what I said in another recent Ask Slashdot question:

    It's not really 'managing' my data.. it's a storage/backup solution. The difference is that if I 'managed' my data, I wouldn't have tens of thousands of digital camera photos in a bunch of folders with meaningless name

  • My solution is mirroring data from the computer to a NAS with RAID, plus a hard drive I take offsite with all the impossible/hard-to-replace data.

    Ironically, a hard drive in my NAS died two days ago, so I have had to do a rebuild, and as a result I bought a new external hard drive that is big enough for all the data, even the easy-to-replace data. So currently my NAS, which rebuilt the array successfully, is copying all of its data to the new external hard drive.
  • Chinese espionage hackers do it all for us for free. They copy our stuff over to their side. It's as off-site as you can get.

  • For laptops, I use a scheduled rsync to a central server mounted using sshfs. For offsite, an rsync to an EncFS filesystem on a portable drive. If bandwidth limits ever get reasonable, I'll switch to using Dropbox or SpiderOak, but for now the bandwidth limitations rule those out for all but the most important data.
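
    A hedged sketch of that laptop flow (the host name and paths are placeholders):

    #!/bin/sh
    # Mount the central server over sshfs, push a mirror, unmount
    mkdir -p /mnt/central
    sshfs backup@central.example.com:/backups /mnt/central
    rsync -a --delete "$HOME/" "/mnt/central/$(hostname)/"
    fusermount -u /mnt/central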

  • by D'Arque Bishop ( 84624 ) on Saturday March 31, 2012 @12:28PM (#39534729) Homepage

    I have two systems I use.

    For my servers, I use AMANDA [amanda.org] with encrypted virtual tapes to do nightly backups. Shortly after the backups run, cron calls a shell script in order to copy the virtual tapes to an offsite location via rsync.
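
    That post-backup offsite copy can be a one-line script called from cron; a hedged sketch, with placeholder paths and host:

    #!/bin/sh
    # Ship the encrypted virtual tapes offsite after the nightly amdump run
    rsync -a --delete /var/lib/amanda/vtapes/ backup@offsite.example.com:/backups/vtapes/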

    For my desktop PC, I don't need to back up as often, so I do a weekly backup via Windows Backup to a TrueCrypt volume on an external hard drive. When it's not being used to back up my PC, I keep the external hard drive at my office. I figure if something happens where both my office and home are destroyed, then at that moment I've got bigger problems to worry about than my data. :-)

    Just my $.02...

  • I just wanted to throw out a decent rotation system I used for making backups when I had some pretty important data to keep track of. It works with 11 sets and goes like this: you get 11 sets of whatever media you use (tapes, drives, thumb drives) and label them Mon, Tue, Wed, Thu, Fri1, Fri2, Fri3, Month1, Month2, Month3, Month4 (we didn't do weekends). The rotation is pretty obvious, and it allows you to go back to any day this week, any Friday of the month, and any month for 4 months. We did this so we could roll b
  • Very little is worth backing up. That which is gets uploaded.

  • I've been using rsync to back up a half dozen machines here for some time now. Great for both local and remote (Internet) backups. My mom's iMac is 300 miles away. She has a little 500 GB FireWire drive locally backing up with Time Machine, which gives her instant recovery as well as application-specific recalls (getting that email or that address book card back easily, for example) as well as versions. Then a custom cron job that rsyncs to my server here runs nightly, for offsite in case of fire/theft/etc.

  • Seriously, who cares about what happened 5 years ago?

    When did you last look at that porn clip you downloaded in 2002? How many emails do you really need to keep for more than a month? Would Battlezone 2 even run anymore if you tried it?

    When did you last get audited by the IRS? Screw that, my accountant should have copies, and most info was electronically reported anyway.

    Between the cloud, my iPhone, 4 synced home boxes, a Dropbox for some important stuff, and Google, I can recover anything non-trivial.

  • I back up by putting my car in reverse. Or by walking backwards. Duh.
  • Time Machine to a 4 TB drive on my main computer. For my wife's machine and my laptops, Time Machine to a 2TB Time Capsule. Haters can hate, it just works.

  • Rsync to a local Drobo with Drobo RAID, to a local Fibre Channel array with RAID 5, and to a remote NAS with RAID 5, every night.

  • It's like the Linux version of Flyback or TimeVault.

    http://backintime.le-web.org/ [le-web.org]

  • For a single computer only, it can be fairly inexpensive with 2-3 TB HDs.

    1) Clone the entire working drive once every one to two weeks, so I can go back to a working OS if the OS is corrupted, since it takes a day to reinstall the OS, apps & utilities and migrate data back. Hence, this is a recovery point for the entire HD at whatever interval one picks.

    With cheap 2-3 TB hard drives, one could actually clone the HD every day for a lot of people & then overwrite such that you had a complete 2 weeks of cl

  • Earlier I used backup2l to first make a local backup and then rsync it to a server. The only problem was that it wasted disk space on each host, especially laptops.

    Recently I moved to bup [github.com], which provides more efficient backups with very little local storage. Now every laptop, desktop, and my email server (all running either Debian or Ubuntu) has this in /etc/cron.daily/bup-backup:

    #!/bin/sh
    echo Backup starting at $(hostname) $(date)
    bup index -u /var/mail /home /var/lib/mysql
    # save the same trees that were indexed, to the remote bup server
    bup save -r backups.example.com: -n $(hostname) /var/mail /home /var/lib/mysql

    • by puhuri ( 701880 )

      And of course I forgot the configuration at the server end; in /home/bupups/.ssh/authorized_keys there is a line for each host:

      command="BUP_DEBUG=0 BUP_FORCE_TTY=3 bup server",no-port-forwarding,no-agent-forwarding,no-X11-forwarding,no-pty ssh-rsa AAAAB3NzaC...

  • The machines needing backup do so wirelessly to a Time Capsule whenever they're connected. I also do a full image to an external hard drive every once in a while (I lie to myself and say it's biweekly, but it's really more like bi-whenever-I-remember-it). Even with the recent increase in HD prices, there's no real excuse not to have backups.
  • by c_oflynn ( 649487 ) on Saturday March 31, 2012 @12:50PM (#39534897)

    My backup for a multi-boot laptop, which other solutions (e.g. running from one OS) don't seem to work for:

    1) Buy a second copy of your main hard drive + USB interface (SATA enclosure)
    2) Boot Linux on the computer using a CD
    3) Use dd to mirror the entire HD to the external HD (see the sketch below). Run it before you go to bed, set up to shut down when done. Save stdout/stderr somewhere like a USB flash drive.
    4) Wake up to a backup.

    The advantage of this is that when your hard drive fails, recovery is about 60 seconds away. Swap out one hard drive and you are done. Or you can recover specific files by just using the backup HD like a normal external HD, since everything is under normal filesystems. If you'll be on business for a while, take your second hard drive with you (try to store it somewhere it won't get stolen with the laptop).

    I actually keep two mirrors, partially because of travel and wanting to have one backup with me. This also makes sure that if your computer fails halfway through doing the mirror due to a power surge, it doesn't fry your original + mirror. Keep one at a friend's house or similar.
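
    Step 3, sketched out (hedged: the device names are placeholders -- double-check them with fdisk -l first, since dd will happily overwrite the wrong disk):

    #!/bin/sh
    # Mirror the internal disk (sda) to the USB-attached clone (sdb),
    # log to a flash drive, then power off
    dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync > /media/usbstick/dd.log 2>&1
    poweroff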

  • My Ubuntu 12.04 server runs btrfs using the built-in RAID 10 features (I have 7.2 TB usable on four 2 TB drives, thanks to compression!) and I use Time Machine for the Macs. My Windows machine is Pro, so I'm able to back up to a network drive, with a Samba share running on the Ubuntu server that I mount as a drive letter in Windows. It's been fantastic overall for all machines, including for the odd restore.

    I'd like to get Time Machine going to the Ubuntu server as well, but so far I'm just using external us
  • The premise is that you back up your computers on March 31, so that you're not an April Fool if your hard drive crashes tomorrow.

    Thanks for reminding me to stay away tomorrow.

  • by godrik ( 1287354 )

    Everything I'm working on that's worth backing up is pushed to a git repository. The git repositories are synchronized manually (but pretty much after each important update) across about 5 machines in 3 physical locations (home, work, computing center), though they are all in the same city.

    I don't think I will lose anything important anytime soon. Or if I do, I think I'll have more important concerns...
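
    A hedged sketch of how that manual synchronization might be scripted (the remote names and URLs are placeholders, set up once with git remote add):

    #!/bin/sh
    # Push all branches to each mirror
    for r in home work center; do
            git push "$r" --all
    done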

  • Then I turn the key and crank the engine. Since the transmission is already in reverse, I simply apply gas, and suddenly release the clutch. That said, I try to avoid back ups as much as possible, because the hard drive seems to cause crashes more often than not. (Mod me+1 Flamebait)
  • Some years ago I emigrated and now live abroad, which has given me the opportunity to use servers in three different locations in two countries: at home, at my brother's, and at a friend's.

    My Linux servers back up automatically with homebrew Time Machine-like functionality based on rsync. It consists of a script and a configuration file specifying which files and directories to back up or skip. The structure is fairly simple and has worked well for a very long time now.

    The mail servers back u
  • For years now, I've been making all of my backups with a script I wrote that uses rsync and faubackup, the latter being a disk-based backup solution. All important data is backed up on a daily basis, locally and to other servers across the Internet. All partitions involved (source and target) are on RAID1 or RAID5 arrays (by itself, RAID is not a backup solution, but does increase the reliability of the storage medium). The only way I've ever lost any data with this system is when I forget to add a source t
  • ... plus RAID.
  • #1: rsync -aihxv --inplace --delete root@some-host:/ /backups/@some-host
    btrfs subvolume snapshot /backups/@some-host /backups/@some-host-$(date +%F--%T)
    btrfs subvolume delete /backups/@some-host-(date and time from some cycles ago)

    (repeat for all hosts)

    #2: Crashplan.com

  • by Balinares ( 316703 ) on Saturday March 31, 2012 @02:36PM (#39535633)

    I'm surprised rdiff-backup [nongnu.org] hasn't been mentioned yet. It's a very nice piece of software, does incremental backups, and is easy to automate.

  • by gottabeme ( 590848 ) on Saturday March 31, 2012 @04:06PM (#39536171)

    I'm very excited (about backup software?) about this new backup program from an old buddy of Linus Torvalds':

    http://liw.fi/obnam/ [liw.fi]

    It seems like it will be the most featureful, forward-thinking backup software ever: deduplication across multiple clients, compression, and strong encryption for untrusted storage. He's very keen on unit tests, too, and it has good verify and restore functions. I'm already using it for some things. GPL, of course, so no proprietary lock-in, ever.

  • by FirstTimeCaller ( 521493 ) on Saturday March 31, 2012 @05:36PM (#39536725)
    WHS works a lot like Time Machine (I suppose). All our machines are backed up automatically every day to a server (in another building). And most importantly, the restore works and is surprisingly fast! I've had two machines go belly up and didn't lose any data.

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...