Ask Slashdot: What Do You Use for Backups at Home? 283

"I am curious as to what other Slashdotters use for backing up of home machines," asks long-time Slashdot reader serviscope_minor: I moved away from the "bunch of disks with some off site" method. I found most of the methods generally had one or more of the following problems: poor Linux support, weak security (e.g. leaking file names), outrageously expensive, hard to set up, tied to a single storage supplier I don't fully trust, entirely proprietary (which makes me doubt long term stability), lack of file history, reputation for slowness, and so on.

My current solution is Unixy: separate tools for separate jobs. Borg for backups to a local machine. Rclone for uploading to business cloud storage, versioned cloud storage to provide resistance against bitrot and other corruption.

They're interested in "what other Slashdotters use," as well as "why and what your experience has been given more than superficial testing." So share your own thoughts in the comments.

  • by Joce640k ( 829181 ) on Monday March 08, 2021 @07:37AM (#61135506) Homepage

    On Linux you can use "dd", "gzip" and the | and > operators.
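    A minimal sketch of that pipeline (the device and output path are placeholders):

    ```shell
    # Image a whole disk, compress on the fly, and write to a file.
    # /dev/sdX and /mnt/backup are placeholders - adjust for your system.
    dd if=/dev/sdX bs=1M status=progress | gzip > /mnt/backup/sdX.img.gz

    # Restore by reversing the pipeline:
    gunzip -c /mnt/backup/sdX.img.gz | dd of=/dev/sdX bs=1M
    ```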

    • by LordHighExecutioner ( 4245243 ) on Monday March 08, 2021 @08:42AM (#61135704)
      dd is very effective for backups. Since my personal data exceeded 1 TB, I've been backing all of it up with the command:
      dd if=/dev/sda of=/dev/null
      It is blazing fast, and I stopped wasting money buying new backup hardware.
    • On Linux you can use "dd", "gzip" and the | and > operators.

      I used to do that! More specifically, those plus a tool whose name I can't remember now (there was some FOSS drama about it yonks ago) to dump my hard drives straight to CD, then later DVD.

      I specifically used it to backup and restore my Windows 95 gaming drive: it shat itself often and restoring from a CD took an hour unattended vs a day's worth of reinstalls.

    • That is old school. However, there are replacements. gzip has been replaced by "xz -v9e" which gives markedly better compression.

      I know that is the old-fashioned way, but I've given in to modern luxuries:

      * Pop ZFS snapshots.
      * Mount the snapshots read-only on a temporary filesystem.
      * Have Borg Backup slurp the mounted snapshots and throw them via SSH to another machine that has a repository. (I use SSH to ensure Borg's append only function is enforced, just in case of ransomware.)
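      Those three steps might look roughly like this (dataset, mountpoint, and repository names are made up for illustration; Borg's `{hostname}-{now}` placeholders are real):

      ```shell
      # Hypothetical dataset, mountpoint and repo - adjust to your setup.
      SNAP="tank/home@backup-$(date +%F)"
      zfs snapshot "$SNAP"
      mount -t zfs -o ro "$SNAP" /mnt/snap    # snapshots mount read-only
      borg create --stats ssh://backuphost/./borg-repo::'{hostname}-{now}' /mnt/snap
      umount /mnt/snap
      zfs destroy "$SNAP"
      ```

      The append-only enforcement mentioned above happens on the receiving side, e.g. by restricting the SSH key to run `borg serve --append-only`.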

      Of course, zfs send and

  • Back in Time (Score:4, Informative)

    by davecb ( 6526 ) <davecb@spamcop.net> on Monday March 08, 2021 @07:40AM (#61135510) Homepage Journal
    Built on top of rsync, of course
    • Re:Back in Time (Score:5, Interesting)

      by v1 ( 525388 ) on Monday March 08, 2021 @08:38AM (#61135692) Homepage Journal

      My primary laptop has a Time Machine drive attached to it (which uses rsync under the hood); I replace the drive every 2 years or so, and the removed one goes into cold storage as a long-term incremental backup.

      I also have a sophisticated custom bash script running rsync to an onsite backup server. That server also backs up its own boot drive, as well as several other computers in the building, in addition to all the server volumes.

      The backup server also uses rsync over the internet to back up a few other computers. Last I counted, I had in excess of 25 TB total online storage under the roof.

      So far I've needed to dig into Time Machine on a few occasions for something that got changed or corrupted on my laptop, and I've had to restore a 23 GB photo library container to a computer 300 miles away due to operator error. (At the time, that took almost 3 days to copy back.)

      I have yet to need to do a full system restore. And I still need to improve my offsite storage.

      In all of history I've lost two files. The first was almost two decades ago, when a server drive thoroughly trashed its directory AND the backup drive somehow completely wiped its partition table. I had to rebuild the partition table by hand to restore it, and lost one file. The other time was before I had my laptop backing up properly and had no backup to restore when I rm'd a link using the wrong syntax, leaving the link and deleting the original. (D'oh!) That was an amazing picture of a wild rose on the bike path, which I paid a lot of blood to the mosquitoes to get. I was upset about it, and it's the reason I have such good backups now.

      I'm willing to wager that 80% of the good backups in use today are due to people losing something important and not wanting to lose something again.

      • Re: (Score:3, Insightful)

        by detritus. ( 46421 )

        > I have yet to need to do a full system restore. And I still need to improve my offsite storage.

        In other words, you've never fully tested your backups.

        • by v1 ( 525388 )

          In other words, you've never fully tested your backups.

          You can't fully test your disaster recovery systems until you have an actual disaster occur. Even the best of simulations will be incomplete tests. And things you take for granted "could never fail" will occasionally fail.

          Case in point: At work we have a big diesel generator for the network center, and a large dual UPS inside to cover switch-over. It gets tested weekly, automatically. We had a power failure. Generator turned on automatically and e

  • Rsync (Score:5, Insightful)

    by Artem S. Tashkinov ( 764309 ) on Monday March 08, 2021 @07:43AM (#61135522) Homepage

    1. rsync to a HDD which is stored in the desk drawer.

    2. rsync to a remote HDD (remember that offsite backups are crucial).

    • Seconded. rsync along with its powerful and little known options.
      • Seconded. rsync along with its powerful and little known options

        I did that for a while, when I was using disks. I rather liked the three-way rsync setup where you have the source, the old destination, and the new destination, and it hardlinks to the old destination if there was no change, so you get efficient snapshots.
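        That appears to be rsync's `--link-dest` option; a sketch with made-up paths:

        ```shell
        # Unchanged files in the new snapshot become hardlinks into yesterday's,
        # so each snapshot looks complete but only changed files consume space.
        # Paths are hypothetical.
        rsync -a --delete --link-dest=/backup/2021-03-07 /home/user/ /backup/2021-03-08/
        ```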

    • Also this (well my main backup hard drive is in a server, is encrypted and normally unmounted), only thing that doesn't back up this way is my gaming PC, where I create a disk image to an external hard drive using the Win7 backup tool. My phone also backs up via rsync through a chroot, although it's kind of a messy system and restoring wouldn't be so straightforward.

    • Re:Rsync (Score:4, Interesting)

      by Bigbutt ( 65939 ) on Monday March 08, 2021 @09:33AM (#61135848) Homepage Journal

      Pretty much this. We live in a forested area so there’s fire danger.

      I have an offsite server, a local homelab with 90 TB of disk, and an external 14 TB drive.

      Desktop is rsync’d to a VM on the homelab regularly and to the 14 TB drive about once a week.

      The offsite server gets my websites and photo library so it’s not a full offsite backup of my desktop.

      The 14 TB drive also backs up the VMs on the homelab. This includes my Terraform and Ansible scripts used to build servers.

      Basically, in the event of fire, I top off the 14 TB drive, drop it into my go bag, grab the important papers, wife, and pets, and bug out.

      [John]

    • by AmiMoJo ( 196126 )

      It would be nice to have a slightly more flexible and integrated system set up though, as well as multiple version support. I have found that anything which isn't entirely automatic tends to get neglected.

      By the way, there is a tool called rclone that does something similar to rsync but for cloud services. Handy for syncing off-site backup sets, although you should of course encrypt first. That brings up another issue with many solutions - the encryption prevents effective differential backups.

      • By the way, there is a tool called rclone that does something similar to rsync but for cloud services. Handy for syncing off-site backup sets, although you should of course encrypt first.

        I believe rclone offers encryption so you can make your unencrypted local backups encrypted on the cloud. Also various cloud services have their own encryption systems.
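        That's rclone's `crypt` backend, which wraps another remote; a hypothetical rclone.conf sketch (remote names and bucket are invented, and the passwords are normally generated interactively by `rclone config`):

        ```ini
        [b2store]
        type = b2
        account = <account-id>
        key = <application-key>

        [encrypted]
        type = crypt
        remote = b2store:my-backup-bucket
        filename_encryption = standard
        password = <obscured password from rclone config>
        ```

        `rclone copy /data encrypted:` would then upload with both file contents and file names encrypted.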

        That brings up another issue with many solutions - the encryption prevents effective differential backups.

        Borg can encrypt its database. You provide a key

        • by AmiMoJo ( 196126 )

          Thanks, I hadn't looked at encryption in rsync. I only used it to periodically backup my cloud services. I also use Thunderbird for that, having it automatically start in the middle of the night, sync IMAP mailbox with Gmail and then close.

          Borg seems quite good, similar to Duplicati but no Windows support. How robust is it? As I say occasionally Duplicati uploads fail (due to internet connection problems) and you have to run a manual command to recover, but it's never lost any data or completely trashed a b

  • by carlhaagen ( 1021273 ) on Monday March 08, 2021 @07:45AM (#61135530)
    A simple *sh script I wrote years ago that wraps up pre-defined paths, asks for a password, then spits out a compressed and encrypted tar-ball which is kept both locally and uploaded remotely to a separate location.
    • Much the same here, except that each path in the config can also specify LZMA compression levels. For example, documents and source code can be compressed greatly, but doing it for images and videos is mostly a waste of time.
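      A minimal sketch of such a wrapper (paths are placeholders; `gpg --symmetric` provides the password prompt):

      ```shell
      # Hypothetical paths. The xz level could vary per path as described above.
      OUT="/mnt/backup/home-$(date +%F).tar.xz.gpg"
      tar -cf - "$HOME/documents" "$HOME/src" \
        | xz -9 \
        | gpg --symmetric --cipher-algo AES256 -o "$OUT"
      # Upload the result wherever you like, e.g. with scp or rclone.
      ```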
  • Still tied to the concept of Home? That has a physical computer? And your data is tied to some storage in that machine? So quaint .... So last millennium. ...

    I don't back up anything anymore. Just ask Google. It knows everything.

    • And here I thought you all left your hard drives open to the P2P crowd and let them backup for you.

  • Duplicity (Score:4, Informative)

    by Vegard ( 11855 ) on Monday March 08, 2021 @07:51AM (#61135546)

    I ended up with Duplicity [nongnu.org]. It does locally encrypted difftars, versioning, and relatively easy retention policy stuff, so that I can keep e.g. two full backups plus incrementals on top of the last one.

    Backend? Just choose the one with the best price/performance for you. I use the S3-compatible service from Wasabi [wasabi.com], because it has served me well and is the cheapest S3-compatible service I have found so far. But Duplicity itself is quite flexible, so you might find storage solutions that suit you better.

  • Duplicati (Score:5, Informative)

    by AmiMoJo ( 196126 ) on Monday March 08, 2021 @07:54AM (#61135550) Homepage Journal

    Duplicati is an open-source, cross-platform backup system that I use. It can store files on a local drive or a number of cloud services, as well as standard stuff like SFTP. I personally use Jottacloud because it's cheap and seems reliable.

    It fully encrypts backups so there is no leakage of filenames or other metadata. Does de-duplication and compression too. You can control the handling of old versions of files and it has some quite smart options for handling that.

    There is a bit of a learning curve and occasionally you need to use the CLI to sort out a failed backup, but most of the admin is done via a local web interface. You can use it 100% CLI if you like.

    For reference most failed backup issues can be fixed by simply deleting the last backup and doing another one, which is accomplished with the "delete" command with the version=0 parameter. Version 0 is always the most recent.

    • What's the restore experience like?

      Do you restore a whole snapshot or individual files? I've not tried a whole snapshot restore on Borg yet, I tend to go for individual files when I need them, because I inevitably backup a ton of junk.

      • by cruff ( 171569 )
        You can select a backup date and drill down into the contents to do selective restores.
      • by AmiMoJo ( 196126 )

        You can restore individual files or the whole lot and anywhere in-between. If you use the web interface you can select graphically. You can also restore old versions if they have been kept.

        It has a test mode as well where it just downloads and checks all the files, useful to run once in a while.

    • ah, yes, the VAX legacy;-) i remember those daze fondly:-) layered logicals, automatic versioning...VMS was noted as a developer's environment, so much that the commies copied it;-)

      i had just started on my 1st unix project: CCATS configuration control & tracking system (it did neither;-) and made the greatest of my many CLMs:

      the project manager was droning on in an orientation meeting, in a warm, dark room, on a friday, after lunch...i was just nodding off when i heard him say: the latest version is 0,

  • by boner ( 27505 ) on Monday March 08, 2021 @07:55AM (#61135552)

    I use ZFS for my user and data directories: a striped two-way mirror each for the home (7.25T) and data (3.62T) volumes. These are backed up continuously via Crashplan/Code42 with a small business plan. As a secondary measure I have a backup server sitting in the garage to which I send a monthly snapshot. The backup server retains all monthly snapshots, while my primary machine releases them.

    With Crashplan I can go back several months and years, depending on the latest change of layout. It is there primarily for peace of mind on the big disasters, and cheaper than S3 for my data volume.

    My backup server has triple mirrored stripes to store all the snapshots from all my machines (11T) and my first ZFS snapshot is from 2009.
    All my other machines run rsync continuously against a Raspberry Pi with a JBOD (3.6T) in my utility closet. The Raspberry Pi syncs with the ZFS data volume once every 24 hours.

    Restoring a file is a chore, but not impossible. I did stop doing minute- and hour-based snapshots. ZFS provides protection against bitrot and other nastiness; I just recently had a disk die on me, which I easily replaced.

    • by ccool ( 628215 )

      Similar to this guy.

      I use FreeNAS in a "RAID 10" equivalent (a stripe of 2 mirrors). This machine gets backed up once a month to another FreeNAS box with a single 8 TB drive. I also back up critical data to Backblaze B2 once a week, using rclone and encryption. I used to back up to my folks' house too, but supporting hardware from afar is getting problematic.

      To keep data in sync across every PC, and keep things accessible, I am now using Nextcloud on my machine. The data storage is the FreeNAS server.

      I'm still pl

    • by hjf ( 703092 )

      Have you considered Backblaze B2? In storage size they are (or were) the cheapest. I pay around $1.50/mo for 300 GB. It's an S3-style object storage service. I use it along with rclone to sync.

    • Similar but less intense -

      Local machine has software RAID-5 going for my /home directory.

      My only truly important files are my yearly taxes and such, plus pictures from my phone and so on. All of which gets copied to an external drive, and swapped with a friend. I keep his in my fire-rated gun safe at my house, he keeps mine in his at his house. We meet up every 3-6 weeks to go shooting or fishing, and we exchange drives every 2 or 3 meet ups.

      The wife's laptop with her camera pics, etc. gets rsynced over

  • by gnasher719 ( 869701 ) on Monday March 08, 2021 @07:55AM (#61135554)
    Time Machine with two drives: One hard disk, locked into a drawer with just the cable coming out, and one largish SD card plugged into my Mac permanently. Both encrypted obviously. So if one drive goes down, I'm still safe (it would mean that a new drive gets ordered ASAP); I'm safe if my MacBook is stolen when I'm on the road, I'm safe if burglars steal my MacBook because the backup drive is locked away and burglars are stupid.

    If my house burns down I have bigger problems.
    • by Wolfrider ( 856 )

      For Mac: Carbon Copy Cloner has saved my ass multiple times - full bare metal backup. Well worth the registration $$ and you can boot from the backup drive. I also try to save an image to ZFS once in a while.

      For Linux: fsarchiver with custom scripts to restore (also restores to VM.) Got this method from the Crunchbang forum a few years ago and it works really well.

      For Windows: Veeam has a free bare-metal backup and restore that works well with Samba and can restore to different hardware - but I don't really

  • My desktop and laptop use Time Machine to back up to a couple of SMB3 shares on a Synology DS1515+.

    The whole Synology (including the Time Machine shares) is backed up to 2 portable USB3 HDDs. One is kept off-site, one on-site. Each week, the one which was off-site becomes on-site and vice versa.

    Only drawback is that my total storage is currently limited to 20TB (the biggest HDDs available for backups at the moment). Since my needs are much less than 20TB, that's not an issue yet.

    Easy peasy.

    PS: Here in Venezuela, upload s

  • by chrish ( 4714 ) on Monday March 08, 2021 @07:59AM (#61135558) Homepage

    I might be a little paranoid (or just paranoid enough), but I've killed so many disks over the years that this seems reasonable.

    I back up my home drive thusly:

    • Frequent backup to my second internal hard drive.
    • Less frequent backup to my NAS.
    • Less frequent backup to Backblaze's B2 cloud.

    For large, mostly-static data like my music collection:

    • Daily backup to my NAS.
    • Daily backup to Backblaze's B2 cloud.

    The music collection is also backed up to two different external drives, mostly so I can listen to music at work back when we lived in the office. One of them lives with my work machine at home now because my local network used to be too flakey/slow for streaming music sanely.

    When I was on Windows, I used FreeFileSync, Bvckup, and Backblaze, all excellent products (not affiliated, just a happy customer).

    Now that I'm on Linux, I use FreeFileSync, Duplicati and rclone. I use rclone for my music collection because it's much simpler and my music doesn't need to be encrypted or packed up into archives or anything.

    If you're on Mac/Windows and want simplicity, Backblaze is an excellent choice. Just set it up and forget about it until your hard drive dies or you delete something you needed.

    If you're on Linux, Duplicati is extremely flexible, and rclone is a great choice for command-line usage. rclone supports a fairly ridiculous number of cloud back-ends and can be used locally as well.

    The most important thing is to actually be doing backups. You can get an external 2TB USB drive for under $100 CAD these days, and cloud backup services can be surprisingly affordable (I think Backblaze is something like $60/year for unlimited storage).

  • by thegarbz ( 1787294 ) on Monday March 08, 2021 @08:03AM (#61135568)

    I can't help but feel like the OP moved from a tried and tested system with many benefits to some fancy new thing only to identify some shortcomings.

    The discs offsite method has some great benefits including:
    - Large storage capacity
    - No need to rely on any tools, use what software you want.
    - Encryption and security is 100% within your control.
    And the really big one: HUGE amounts of bandwidth. Cloud services sound great right until the moment you suffer a complete and total loss. Then you're left with the fun activity of downloading terabytes of data to recover your system.

    For versioning / file management you could also look at a hybrid solution such as running ownCloud, Seafile, or a similar cloud system which can maintain versions, or using a filesystem with support for file versions and a backup methodology that doesn't nuke this data.

    • I can't help but feel like the OP moved from a tried and tested system with many benefits to some fancy new thing only to identify some shortcomings.

      I used to store discs off-site in my office. I've not been there in a year, so I found the off-site method a bit less practical than it used to be :/

      The discs offsite method has some great benefits including:
      - Large storage capacity
      - No need to rely on any tools, use what software you want.

      I think a variety of software works. I use borg+rclone. Others use dupli

    • by hjf ( 703092 )

      Not everyone has an offsite location to store their data. Offsite, to be truly effective, has to be "in a different climate" basically. I live in a city that may or may not flood some day. Floods are common and the city is built in the "extended" river bank. Even if I took my data to an offsite location, unless it was a tall building, it would probably flood if my house does (because my house is built on higher ground).

      "A friend's house" is not really an option: you need a friend you can trust a hard drive

  • by h33t l4x0r ( 4107715 ) on Monday March 08, 2021 @08:11AM (#61135578)
    I just avoid writing to disk whenever the ssd seems temperamental.
  • I rsync to an external drive, with daily snapshots using the hard link trick and then weekly backups are full independent backups. I send a copy out of the house to another server I have in a different city, but that's really just because I can.

    I have a perl script that manages all this.

    I used to also have a 5-minute backup system which was great when I was writing, but Emacs' undo-tree more or less made it obsolete.

  • I develop apps on Apple, so I don't know how useful this will be. But my income is derived solely from my apps, so I can't lose anything.

    So on my Mac laptop I typically copy a source library sideways for safety while developing. I also run git locally.

    I have a 2011 Apple Time Capsule in which I recently had to replace the hard drive because the backups started dying, but it now reliably backs up the Mac automatically, going back several months.

    I also run Backblaze on the laptop for a remote backup.

    There is also

  • I have not moved away from anything, so it just keeps piling up. That kind of redundancy is exactly what I want to have in backups, and maybe something you should want as well.

    The last thing I did was buy a couple of WD Purple drives that I use in a NAS, where I do my backups manually. The drives weren't cheap and the process takes some additional time out of my schedule, but it gives me some peace of mind. That makes it worth it to me; whether it's worth it to you is up to you to decide.
  • rdedup supports encryption (including public key), compression and deduplication.

  • by Tx ( 96709 )

    Scheduled wbadmin full image backup to internal backup drives in my two main machines. I don't have anything that needs backing up on my personal laptops. No offsite backups, so if I get robbed or the house burns down, I guess I'll just have to rebuild my porn stash from scratch.

  • First: I have a NAS with several TB of disk space. The Libraries on Windows (Documents, Downloads and Pictures) are mounted from the NAS. That way, e.g. save games and Word documents are always stored on the NAS. There are some other directories I also sync to the NAS.

    Second: every two months I make a complete backup from the NAS onto a 10TB disk. I put that disk into safe storage at my bank.

  • Occasional rsync from the SSD to one of my large/slow/cheap/local spinning hard drives. Simple, primitive, and effective.

  • I have used an Apple Time Capsule wifi router for several years; it backs up without my needing to worry or even think about it. In addition, iCloud helps as well, but is especially good for syncing devices.
    • I use an Apple Time Capsule wifi router

      I wanted that; Apple killed the product. It's a shame; it seemed like such a great solution.

      • The idea was good. The hardware overheated and erased its settings once in a while. I never owned one but helped recover settings on a few. These days, Time Machine works pretty well over any SMB3 share as long as you have the right mDNS broadcast to announce that it's a Time Machine drive.

        For the limited Apple hardware I have, Synology has this built-in.
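        On a generic Linux server, that announcement is usually done with an Avahi service file along these lines (the TXT record values follow the commonly circulated recipe; treat them as an assumption to verify against your Samba/Avahi documentation):

        ```xml
        <?xml version="1.0" standalone='no'?>
        <!DOCTYPE service-group SYSTEM "avahi-service.dtd">
        <service-group>
          <name replace-wildcards="yes">%h</name>
          <service>
            <type>_smb._tcp</type>
            <port>445</port>
          </service>
          <service>
            <type>_adisk._tcp</type>
            <port>9</port>
            <txt-record>sys=waMa=0,adVF=0x100</txt-record>
            <txt-record>dk0=adVN=TimeMachine,adVF=0x82</txt-record>
          </service>
        </service-group>
        ```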

  • I use a Windows tool called SyncBackFree (because it's so easy to use) and a WD MyPassport 2TB external disk that lives in a drawer.

  • I have two Synology NASes on the LAN, each running a different RAID configuration. These along with two external HDDs provide backup space for large and smaller files.

    I use DropBox as off-site backup for crucial files, along with GitHub as a kind of backup for most of my projects. For small files like text files (LibreOffice or source files) I use floppy disks as well.

    Important (document, projects) directories are synchronised between desktop and laptop systems, and I'm considering doing something with
  • I have 2 QNAP NAS servers, a newer TVS-672 and an older TS-670. Both are 6-drive NAS servers and both are configured with 6x12 TB Western Digital "Red" drives... Each of these servers gives me about 43 TB once formatted and set to RAID6.

    When I looked at my data, I discovered that it falls into 2 broad categories: a relatively small volume of highly volatile files [stuff I touch every day, such as documents]; and a relatively huge volume of files with next to zero volatility, such as photographs, music, vide
    • by AmiMoJo ( 196126 )

      With this amount of complexity I do wonder if it would be cheaper to use cloud storage. It's very cheap these days and they take care of all the drive handling for you. Plus you can have more frequent backups, e.g. mine are daily on volatile data.

      If you are paranoid just use two different ones. Encrypt everything of course. The main issue is if your internet connection is very slow.

      • by ytene ( 4376651 )
        Yup...

        But it isn't actually that complex... and it sort-of evolved as I acquired the tech. When I had one NAS I just ran regular external backups. Now I have two, I can be a bit more flexible. Literally the only thing I have to do is remember to load bare drives into the external drive hoppers on Friday and Saturday - hardly a burden.
  • For files you use (and change) regularly, Seafile is the best solution. Hostable either on a machine you own or in the cloud. For larger files (photos, videos) and for backing up the Seafile installation above, rdiff-backup is the way to go. Duplicity would work too, but can be extremely slow even over high speed connections.
    • by gweihir ( 88907 )

      I have had good experiences with rdiff-backup in the past too. I use a pair of Linux vservers as the target though. They also provide DNS, email and web servers for my personal use and are cheaper than online storage, at least here.

  • All drives in PC imaged (Acronis True Image) to external USB drive that's only powered up when performing backup. C drive image additionally copied to NAS and uploaded to cloud (encrypted image file). Folders with important data on other drives than C are backed up to cloud storage (Jottacloud). Google mail is archived using MailStore.

    That's for my primary Windows machine that has data on it that I care about. I also use a Mac laptop that I don't bother backing up at all, as there's no important data stored

  • Going to local drives and cloud locations.

  • I use BackupPC installed on my home server with an external 3TB disk (but an RPi with an external HD also works), and every week I remove the disk, go visit my parents, and leave it there. I pick up the previous week's disk and plug it into the server. I actually made a udev rule to mount the disk and start BackupPC when the disk is plugged in.
    So other than the disk rotation, it is a plug-and-play setup now.
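    Such a udev rule might look roughly like this (the UUID and script path are hypothetical; note that RUN+= programs must exit quickly, so the script would typically just kick off a systemd unit or background job):

    ```
    # /etc/udev/rules.d/99-backup-disk.rules
    ACTION=="add", SUBSYSTEM=="block", ENV{ID_FS_UUID}=="1234-abcd-5678", \
        RUN+="/usr/local/bin/backup-disk-attached.sh"
    ```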

    BackupPC uses rsync+ssh to back up all my Linux machines, even remote ones... and I do not have Windows anymore, but in

  • StoreBackup is my primary tool. I like the fact that it does versioning and hard links for things that haven't changed. That means that you can have (say) a daily backup for the past month, a weekly backup for the past year, and a monthly backup for the past 5 years - and it all fits on one disk. Then do a simple copy of that disk to a removable drive, and you have your offsite backups.

    That's my primary solution. I also do a direct disk-copy of the NAS, onto another removable disk. That's my secondary offsi

  • Whatever the default backup program is for Ubuntu is what we use at home (me/wife/kids all run Ubuntu). These are just "play" machines that get a daily incremental to their local NVMe drive. Once a week, I use rsync to copy everything over to a 4TB USB drive that hangs off a raspberry pi 4 (also running Ubuntu). My "offsite" component is that I have 3 x 4TB drives. Every couple of months when we go to visit my folks, we bring the one that's got data on it to their house for "safe keeping" and swap in th

    • Whatever the default backup program is for Ubuntu is what we use at home

      I looked at that, I think it's the GNOME one. GNOME decided to remove support for "business clouds" (i.e. things like S3, GCS) and only support things "for normal users", i.e. the much more expensive and weird google drive and so on. I found that deeply irritating especially as the underlying software supports those.

  • I use BackBlaze. It continually backs my MacBook up to a cloud server, and has 30-day versioning. Simple, reliable, and it never slows my computer down.

    I also have a nice big external drive that I keep offline files on, and I allow Time Machine to create backups there as well. That drive is also backed up by BackBlaze.

    In the event of a catastrophe, I can contact BackBlaze and they will send all my data on a hard drive to me. I can send the drive back, or pay for it and keep it. Either way, it's a g

  • by regoli ( 530486 ) on Monday March 08, 2021 @08:56AM (#61135744) Homepage
    External USB3 drive bay, 3TB SATA drive and Carbon Copy Cloner https://bombich.com/ [bombich.com] on my 2017 iMac
    • by Wolfrider ( 856 )

      CCC is the thing to have for Macs. Saved my ass multiple times and paid for itself when I can boot directly off the backup drive and immediately have my entire environment setup. $39 is NOTHING compared to peace of mind and ease of use.

  • iDrive.com (Score:3, Informative)

    by Nondidjos ( 4359161 ) on Monday March 08, 2021 @09:02AM (#61135764)
    I have been using iDrive.com for years. It is ~$70 / year for 5TB of cloud storage. I chose them because at the time they were the only ones I found with linux support. But their linux support turned out to be very good. Icing on the cake: no limit to the number of devices backed up, file sync service, snapshot and versioning (as long as it fits in the 5TB), sector clone and restore option for Windows, full remote management through web interface. (Sorry to sound like advertising but I am very happy with their service)
  • I only have a couple computers worth backing up, so the free Veeam edition works fine. I have a random workstation that does the backups and also copies to an external drive. That being said, I don't think there is anything I have on a home system that would be critical if I lost it. It's more for convenience of not having to reconfigure a replacement system.
  • Two vservers with different providers. Much cheaper than online storage. Also, everything I care about is in a Subversion repository on a Linux server that gets backed up nightly (using cron, svndump, gpg, and scp) to these two vservers.

    In the past, I have had a 2nd automatic solution with rdiff-backup (worked fine for many years), also to these vservers, but I only use that one manually for some larger things these days.
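    A sketch of such a nightly cron job (the repository path, GPG recipient, and host names are invented; `svnadmin dump` is the underlying command):

    ```shell
    #!/bin/sh
    # Dump, encrypt, and ship the repository to two offsite vservers.
    DUMP="/tmp/repo-$(date +%F).svndump.gz"
    svnadmin dump -q /srv/svn/repo | gzip > "$DUMP"
    gpg --batch --yes --encrypt --recipient backup@example.org -o "$DUMP.gpg" "$DUMP"
    scp -q "$DUMP.gpg" vserver1.example.org:/backup/
    scp -q "$DUMP.gpg" vserver2.example.org:/backup/
    rm -f "$DUMP" "$DUMP.gpg"
    ```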

  • My NAS has a backup utility that continuously backs up my data from my PC, and then it does nightly backups of itself to a same-brand NAS at an offsite location (and their NAS backs up to mine).
    Everything is versioned and not directly accessible from the PC so ransomware can't get at it.
    I recently replaced my wife's laptop, and did a full restore from this backup and it worked great.

  • I use a Synology NAS for local storage. It backs up nightly to a USB3 HDD. I then rotate the removable drives to off-site storage.

    I've been thinking about AWS Glacier. Anyone use that?

    • I use Time Machine to a USB disk I keep in a drawer. Once in a while I create a zip of each of ~/Documents and ~/Pictures and copy it to S3 glacier storage using Amazon's command line software. Handy as I live in an earthquake zone. I hope never to need the S3 data but it is there if I need it.

    • I fell out of love with Glacier because:
      - It's slooow
      - Restores are painful (and expensive), and whilst I don't need to restore more than once every couple of years, it's got to be easy otherwise you'll never test it
      - The "UI" is terrible (although with an S3 bucket you can put a nice UI over it, but now you're into stitching AWS services together and of course have to pay for it all)

      If you want archive storage, then Glacier may be okay, but check out the competitors because some are considerably better. Person

  • Two identical NAS, one wakes up and backs up the snapshots from the other, then shuts down (you can do this automatically, but I manually trigger it instead). And then any other devices put anything important on the main NAS. Then anything critical is synced to the cloud or my own personal external server (VPS with lots of storage are dirt-cheap nowadays).

    You'd need a two-drive failure on the primary NAS to have to resort to the secondary NAS, and then a two-drive failure on that to make it unrecoverable.

  • And USB 3.0-attached external drives. Two of them, so even if a high-voltage surge killed my computer while I was backing it up, yesterday's backup would survive.

  • Google Drive/OneDrive linked to my user's Documents folder; oddly enough it's good enough for my home use.

    If the rest of my system goes out, you can always reinstall the OS and the applications; it's just your Documents directory that is the biggest loss in the case of a major problem.

    Now if the poster is actually running a business off his home network, and is trying to ping Slashdot for some cheap and easy backup solutions for his business, my option is not acceptable for a business case. For a business you wil

  • Seagate has nifty backup tools for use with their disks. So I have a large one for incrementals, and a pair of smaller ones for weeklies that get rotated through the fire safe.

    When I was on Mac I used Time Machine (and why hasn't anyone stolen that for Windows or Linux?) and SilverKeeper for the weeklies.

    Back in the Linux days I used, IIRC, rsync, and tar.

  • I’ve gone through two generations of Apple TimeCapsule, and mine is still alive and kicking. This automatic (“fire and forget”) backup device is literally the reason I started buying Macs for my family. Never looked back until recently... now that Apple has discontinued them. :-( ‘Course when my kids are away at college, this system doesn’t work so well. Thank heaven for the Cloud!
    • You can build your own Time Capsule without too much effort. Just need to create a sparse bundle on a network drive, then possibly run a command from the Terminal to set it as your Time Machine backup destination. I’ve been doing it ever since I pulled my Time Capsule from service.
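      A minimal sketch of that procedure, assuming the network share is already mounted at /Volumes/NAS (the size, volume name, and bundle path below are made-up examples; HFS+J is shown since network Time Machine targets have traditionally used it):

      ```shell
      # Create a growable sparse bundle on the mounted network share
      hdiutil create -size 1t -type SPARSEBUNDLE -fs HFS+J \
        -volname "TimeMachine" /Volumes/NAS/mybackup.sparsebundle

      # Mount it, then point Time Machine at the mounted volume
      hdiutil attach /Volumes/NAS/mybackup.sparsebundle
      sudo tmutil setdestination /Volumes/TimeMachine
      ```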

  • Depending on what I'm saving it differs ...

    * multiple VMs on private ESX hosts - Veeam B&R to a NAS
    * folders I want replicated on multiple hosts - stored in Seadrive w/ history
    * Win10 notebook - Veeam Agent also to NAS

    Still have a couple Linux folders backed up via BoxBackup

  • I have nearly a couple TB of personal data, documents, family pictures, family videos.

    Today, I use Duplicati which backs up to Google Drive (GD was the best option at the time I started). Duplicati supports tons of cloud storage providers, runs on all the major platforms, and well-maintained Docker containers exist (highly recommend the Docker solution). It supported everything I was looking for: good list of supported cloud providers, good price (free), open source, encrypted, deduplication, emails on co

  • I use BackInTime to a local USB hard disk, so I get versioning. https://github.com/bit-team/ba... [github.com]

    I don't have offsite, though I know I should.

  • Most of my important work is on one machine. It has RAID-1 to guard against disk failure. I then back it up onto a set of RAID-1 USB drives attached to the box. And then a second RAID-1 USB pair because I happen to have them, and why not?

    Next, it gets rsync'd daily to a machine a floor below my office in my house, also with a RAID-1 pair.

    Finally, I have a Raspberry Pi at my sister's place. At night, everything gets rsync'd there onto an encrypted filesystem on a USB-attached RAID-1 pair. That's my
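    That mirror step can be dry-run with throwaway local paths before pointing it at a remote box; the paths below are examples, and the real nightly job would just swap the destination for an ssh target like pi@offsite:/mnt/raid/backup/ :

    ```shell
    # Local demo of the rsync mirror step (throwaway example paths)
    mkdir -p /tmp/rsync_demo/src /tmp/rsync_demo/dst
    echo "important data" > /tmp/rsync_demo/src/notes.txt
    # -a preserves times/permissions; --delete makes dst an exact mirror
    rsync -a --delete /tmp/rsync_demo/src/ /tmp/rsync_demo/dst/
    cat /tmp/rsync_demo/dst/notes.txt
    ```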

  • Carbon Copy Cloner (Score:4, Informative)

    by MikeMo ( 521697 ) on Monday March 08, 2021 @10:01AM (#61135956)
    CCC is a rock-solid, full-featured “cloning” tool which uses rsync (and I think psync), but it’s much more than a scripting tool. It will run scans for corruption, mount and unmount drives, run pre- and post-scripts and much more.

    Even though it maintains a “clone” image on the second drive, it also optionally keeps a folder containing deleted items, so it also works well as a backup.
  • I've been around the block on home backups a few times. Nowadays I do the following:

    - laptops and phones back up to the NAS (TimeMachine on my mac, File and Directory doodah on Windows, SyncMe on android phones)
    - NAS is a 4-bay QNAP, which can run containers. The QNAP is massively over-complicated, but the containers are pretty neat. A container runs Bacula and pulls data from the NAS (which can run the File Daemon natively) and a couple of cloud VMs I own (bacula can "run before" it backs up, so a script ta

  • This question is asked, oh, somewhere between once every 6-12 months on slashdot, and it always gets the same groups of answers ("I use my own hardware because NSA", "I use my own hardware and have an offsite mirror", "I use cloud provider X and trust their disaster recovery", and "I use a combination of five different hard drive vendors and a cascaded set of backups onsite and offsite and in an abandoned nuclear missile silo"). Having read these responses probably half a dozen times in the years I've been
  • I use Syncthing. It works on Linux, macOS, Windows. There is no "master", but it is P2P so any machine can disappear and it does not matter. It can "relay" when both machines are behind firewalls. When I install a new machine I just pick what folders from syncthing I want (Music, Work, Pictures...), it pulls them, and then it is effectively another backup node. With a machine off-line I am protected against deleting my own data too.

  • I use the NextCloud desktop client to back up to my NextCloud server. That server stores the data on zfs disks with snapshots enabled for versioning. That entire server is backed up to tape once a week.
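    Snapshot versioning on the ZFS side is typically just a dated snapshot from cron; a sketch, with a hypothetical dataset name (the \% is cron's escape for the percent sign):

    ```shell
    # /etc/cron.d/zfs-snap -- one dated snapshot per night (dataset name is an example)
    15 1 * * * root zfs snapshot tank/nextcloud@daily-$(date +\%F)
    # inspect existing snapshots:  zfs list -t snapshot -r tank/nextcloud
    # recover a file by copying it out of .zfs/snapshot/<name>/ inside the dataset
    ```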
  • I use burp on an external server (run by my employer), backing up all my data files. The backup time can of course be scheduled but I have set it for manual activation, to prevent large data transfers at an unsuitable time. My chief complaint about burp is that it's not resilient against a dropped connection. It will attempt to resume an interrupted session at the next start, but will not retry automatically. We've never tried to restore from my backup so I can't comment on that procedure.

  • I use manual rsync at home, but on my AWS server I use Rsnapshot which is a wrapper around rsync that mostly manages incremental backups. For my mom I set her up with Duplicati which uploads to AWS S3.
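    For comparison, an rsnapshot setup is mostly one small config file plus cron; a minimal sketch with example paths (rsnapshot insists on literal TAB characters between fields):

    ```shell
    # /etc/rsnapshot.conf (excerpt) -- fields must be TAB-separated
    snapshot_root	/srv/rsnapshot/
    retain	daily	7
    retain	weekly	4
    backup	/home/	localhost/
    # then schedule from cron:  rsnapshot daily  (and, less often, rsnapshot weekly)
    ```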
  • I use Audio CDs, storing my data using a steganography tool like Deep Sound. I then just write the name of an album on them and put them in my Audio CD binder.

  • I'm a bit ashamed to admit this, but I use "Google Backup and Sync" on my Win10 user folder, and additionally my free Dropbox account for the very important bits.

    I don't mind paying Google a bit per month for oodles of storage, and I actually like the Google ecosystem. Dropbox is nice, because it has rudimentary versioning in an emergency.

    I develop via a self-made ACE editor directly on my servers, and those programs are rsynced between servers each night.
  • Time Machine and FreeNAS. FreeNAS gets synched to Dropbox nightly.

  • by I75BJC ( 4590021 ) on Monday March 08, 2021 @10:56AM (#61136196)
    https://bombich.com/ [bombich.com]

    Used it for years (maybe a decade now) and it works great! Bombich has added nice and functional features down through the years. Carbon Copy Cloner will make bootable copies of a HDD/SSD and do other types of backups.

    I recommend it to all my family and friends.

    Additionally, I use cloud storage for my files.
