
Ask Slashdot: Open Source Back-Up Tool For Business?

New submitter xerkot writes: I am looking for a tool to make backups of PCs in a big company, to replace the one we are using at the moment. The tool will be used to back up PCs (mainly Windows, and a few Linux), and we want to manage these backups centrally from a console and be able to automate the backup process. The company's servers are backed up with another tool, so they are out of scope. In the company we are being encouraged more and more to use open source software, so I would like to ask you: what are the best open source tools for backing up PCs? Are they mature enough for a big company?
  • by i.r.id10t ( 595143 ) on Tuesday November 10, 2015 @09:40AM (#50900065)

    What exactly are you backing up? Entire disk images? Or just user files?

    If disk images, then something like Clonezilla, perhaps set up to boot from a TFTP server. Boot the machine via WOL, kick off the TFTP boot, and automatically dump the image out to a server, using the machine name or MAC address or something as a unique identifier (a rough sketch below).

    For user files only (i.e., My Documents or whatever), can you set up network-based home directories? Then just back up the server they live on.
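
    A minimal sketch of that unattended imaging flow, assuming a PXE/TFTP-booted Clonezilla live session and the wakeonlan utility; the MAC address, image naming, and target disk are hypothetical:

      # On the backup server: wake the target machine, which is set to
      # PXE-boot into a Clonezilla live environment (hypothetical MAC).
      wakeonlan 00:11:22:33:44:55

      # Inside the Clonezilla live session: an unattended whole-disk save,
      # named after hostname and date, powering off when finished.
      /usr/sbin/ocs-sr -q2 -j2 -z1p -batch -p poweroff \
          savedisk "$(hostname)-$(date +%F)" sda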

    • by Feneric ( 765069 )
      Agree, in my experience Clonezilla [clonezilla.org] does as well in this arena as the commercial offerings. Be wary, however. All of these products can behave a little funny sometimes in certain environments. Whatever you choose, test it out all the way through a backup and recovery before relying on it.
    • For user files only (i.e., My Documents or whatever), can you set up network-based home directories? Then just back up the server they live on.

      If you do this, you had better have an insanely reliable network with at least gigabit speeds. Where I work we have all of our *nix boxes set up with the user home directories on NFS, and when the network goes out we have a lot of employees who can't do their jobs. Even a few hours of outage can be extremely costly. On top of that, no one who works with large files uses their home directory for anything, due to the limited space and slow (100Mbps) network speeds. Instead everything is stored locally and backed up separately.

      • Yes, but once you have a Windows network to the point where anyone can sit down at any workstation, with all user data stored on a file server, you no longer have to back up the user workstations. I managed several networks of between 50 and 200 workstations, each with a single Samba-based DC and file server that was backed up off-site via rsnapshot (a config sketch below). The workstations were imaged via FOG. Once we had a working image for a workstation (which could take a couple of days to prepare from scratch), restoring a broken machine was just a quick re-image.
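
        A minimal sketch of the rsnapshot side, assuming the off-site host pulls from the file server over SSH; host names, paths, and retention counts are hypothetical. Note that rsnapshot.conf fields must be separated by TABs (shown as spaces here):

          # /etc/rsnapshot.conf on the off-site backup host
          snapshot_root   /srv/rsnapshot/
          retain          daily    7
          retain          weekly   4
          backup          root@dc1.example.com:/srv/samba/   dc1/

          # Driven from root's crontab:
          # 30 1 * * *   /usr/bin/rsnapshot daily
          # 0 3 * * 1    /usr/bin/rsnapshot weekly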
      • Learn about "Offline Files". It will alleviate the problem you describe.
    • we want to manage these backups centrally from a console

      You don't want to do that; you only think you do. It needlessly creates a security problem by intentionally installing a remote-control vector. What you want is a central notification system, so that if a machine that is expected to back up does not, you find out about it and can send in the IT staff.

      What i.r.id10t alludes to is also correct: figuring out how you can restore a machine to operation is step one. That determines what form the backups can be stored in, which, in turn, determines what backup tools you can use.

    • Or, just set up CrashPlan and be done with it, and not make the mistake of having only on-site backups.
  • Just look for the best tool for the job, and don't worry about whether it's open source or not.

    • by martiniturbide ( 1203660 ) on Tuesday November 10, 2015 @09:51AM (#50900127) Homepage Journal
      Open source is not only an ideology; it is a tool to reduce the risk of depending on a single vendor. His question is very valid if he wants to reduce vendor risk.
  • amanda (Score:2, Informative)

    by Anonymous Coward

    You can use Amanda if you want to back up files. Amanda is production grade and has clients for Windows and Linux,
    and possibly other Unix-likes. -- mallah

    • Re:amanda (Score:4, Informative)

      by rklrkl ( 554527 ) on Tuesday November 10, 2015 @12:28PM (#50901503) Homepage

      Amanda is great if you're backing up only Linux clients, but the Windows Amanda client is a total abomination. It hasn't been updated in over 2 years (don't believe the version numbering; check the timestamp of setup.exe), it pointlessly uses a MySQL DB on the client side (ridiculous!) that neither the Linux client nor the server uses, it regularly times out, regularly crashes, produces byzantine error codes without any description or documentation, is much slower than the equivalent Linux client, and produces ZIP64 backups that are pretty well impossible to extract on Linux (which is annoying, because the Amanda server side is Linux-only).

      We have Ultrium tape drives with multi-slot autoloaders and barcodes, so there are very few open source backup solutions that can handle this (including tape-spanning if needed) out of the box. Bacula was another potential solution, but it's horrific to set up (makes the dreadful Oracle DB install process look like a breeze); just reading the Bacula install docs brings me out in a cold sweat :-) I guess business-level backups just aren't sexy enough to warrant a decent open source solution...

  • Don't cut corners (Score:4, Informative)

    by cloud.pt ( 3412475 ) on Tuesday November 10, 2015 @09:50AM (#50900123)
    I was once at a crossroads of choosing between tools like Clonezilla and Bacula for small business purposes. Bottom line: they add a lot of complexity for little to no flexibility. I ended up building my own tar/move script, with cron triggers during after-hours downtime (a sketch below); I would then move the archives around network locations so that no single point of failure could take out the backups. Adding your own exceptions to the backup is a plus. In the end I had something reliable and fast, whose only overhead was re-installing Debian before the actual restore, then an update-grub and a change in fstab for the new disk replacing the broken one's UUID. You don't really do that many restores, so it's a fair trade-off, and you save an enormous amount of time by not backing up the entire OS. A good starting point is http://www.aboutdebian.com/tar... [aboutdebian.com]
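
    A minimal sketch of such a tar/cron setup, assuming a Debian box; the destinations, exclusions, and schedule are hypothetical:

      #!/bin/sh
      # Nightly tar backup of the interesting parts of the system,
      # skipping the OS itself (re-installed on restore instead).
      DEST=/mnt/backup1
      STAMP=$(date +%F)
      tar --exclude=/proc --exclude=/sys --exclude=/dev \
          -czpf "$DEST/full-$STAMP.tar.gz" /etc /home /var/www

      # Copy to a second network location so one failure can't
      # take out both copies.
      rsync -a "$DEST/full-$STAMP.tar.gz" backup2:/srv/backups/

      # Cron entry for after-hours downtime:
      # 30 2 * * * /usr/local/bin/nightly-backup.sh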
    • Looking at your question again, this might not be your best bet, given the scale and the use of Windows. Still a good choice for a Linux-only shop of five machines, tops.
      • Triple post, but just noting that my compressed backups for a 100-mailbox server, a JBoss instance with heavy app deployment and traffic, and a fully configured ISPConfig would range between 2 and 4 GB. I had a 30-day rotation in place, then every 6 months I would run some cleanup tools to get back down to 2 GB-ish archives.
    • by Anonymous Coward

      I was once at a crossroads of choosing between tools like Clonezilla and Bacula for small business purposes. Bottom line: they add a lot of complexity for little to no flexibility.

      Talk about not having a clue!

      Bacula is highly, highly customizable: for example, I wrote a shell program that connects Bacula with Oracle RMAN, and now Bacula jobs kick off RMAN database backups every day, then siphon the RMAN backups off to tape. And Bacula controls the tape library and the barcode reader through other external programs. Such a thing would not be possible were Bacula not designed to be flexible enough to interface with anything and everything, including emulating virtual tapes as plain disk files.

      • While you do present very fair arguments, you failed, like me, to address the OP's real requirements. Not to mention the inconsistency in your examples.

        For instance, you argue for customization with an example that hinges on triggering a proprietary recovery system. So you can kick off a remote process with Bacula; I have a one-liner shell command for that, nice job. Especially with a proprietary system, which the OP specifically excluded in the title (Open. Source.). And from the clues I got, Bacula doesn't look like a fit for the OP's case anyway.

  • Bacula I think would do it, but you have to pay for it. If you are looking for free, you will have to learn how to script your own agents. For a small business (25-50 PCs) Windows Server Essentials is boss! It boots a TFTP server for recovery and nicely manages all the machines it backs up. In any case, you should centralize user data, so that if a PC goes down the user doesn't lose any work. Then just keep a custom Windows PE image of the PCs from when they were built fresh, and you can simply do an image restore.
  • Bacula (Score:4, Interesting)

    by Trevin ( 570491 ) on Tuesday November 10, 2015 @09:53AM (#50900145) Homepage

    I use Bacula for my home computer; it feels powerful enough for a small office, and is very versatile.

    It has three main components: a client daemon that you install on the computers you want to back up, a storage daemon that you install on the computer that will write the backup files and/or tapes, and a director daemon which controls the backups. The director and storage daemons only run on unix-like operating systems (BSD, Linux, Solaris) but the client daemon has also been built for MS-Windows.

    http://blog.bacula.org/ [bacula.org]
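
    Once the three daemons are configured, day-to-day operation can be driven from bconsole on the director host. A minimal sketch; the job and client names are hypothetical:

      # Kick off a backup job immediately:
      echo "run job=BackupWs01 yes" | bconsole

      # Check on a Windows client's file daemon:
      echo "status client=ws01-fd" | bconsole

      # Review recent job history:
      echo "list jobs" | bconsole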

    • by mridoni ( 228377 )

      I regularly use Bacula to back up a network of about 20 workstations and a couple of servers, plus a remote one (connected through a VPN). It's fast and reliable; if I really have to nitpick, there are some quirks, though they stem mostly from a lack of disk space for the backups (this is a non-profit with a tight budget) and a lack of time on my side for fine-tuning the configuration. I'm really satisfied: since 2008 Bacula has saved our ass a number of times with minimal effort and maintenance.

  • by cerberusss ( 660701 ) on Tuesday November 10, 2015 @10:01AM (#50900201) Journal

    I worked at a scientific institute, and they simply installed OwnCloud everywhere. It's got a client for most platforms, syncs to a server, and allows you to back up the server in the usual fashion.

    It worked so well that when I started doing consulting (at the client site), I got my own VPS with Debian and installed the OwnCloud server on that. Then I installed the client on my private laptop and on the laptop I got from the client. It works beautifully, because communication is over HTTPS, and company firewalls don't block that. I tried other things like BitTorrent Sync, but those use special ports.
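
    For scripted or headless use there is also the command-line sync client that ships with the desktop client packages. A minimal sketch; the server URL, account, and paths are hypothetical:

      # One-shot sync of a local folder against an ownCloud server
      # (owncloudcmd comes with the owncloud-client package).
      owncloudcmd --user alice --password 's3cret' \
          /home/alice/Documents \
          https://cloud.example.com/remote.php/webdav/Documents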

    • This seems like the "kill a fly with a cannon" solution. But it does look like it could work if money and infrastructure are no object, with full-fledged system images as a plus.
  • by Anonymous Coward

    They use this around here and seem to like it. Seems apt to your situation.

    http://backuppc.sourceforge.net/

  • by Anonymous Coward

    I have been using FOG and OwnCloud:
    FOG (free, open-source Ghost) for desktop backup and re-imaging, and OwnCloud for data backup.
    http://sourceforge.net/projects/freeghost/

  • FOG [fogproject.org] can back up disk images, do inventory and scheduled re-images, and keep images around for things like AV scans and whatnot.

    We mostly just use it to wipe machines back to clean slates ourselves, but it's supposed to have quite a rich feature set. It does have some bugs, though.

    I set one up at my school that has re-imaged 1000+ student netbooks, 7 at a time. The school district also has all of our images centrally located and available throughout the district for all models of laptops and desktops that we use.

  • Amanda and S3 (Score:2, Informative)

    by mi ( 197448 )

    Use AMANDA [sourceforge.net] to do the back-ups. Use Amazon's S3 to actually store the dumps compressed and encrypted at the source — AMANDA has had the S3 back-end [zmanda.com] for a while. No, you do not need "Amanda Enterprise".

    Having set just such a thing up at my last job, I'd be happy to help you out for a regular consulting fee. It should not take more than a week or two, even for a large organization.
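
    A minimal sketch of the S3 "virtual tape" wiring, assuming Amanda's S3 device driver; the config name, bucket, slot layout, and keys are hypothetical, and the changer syntax follows the pattern documented in amanda-changers(7), so verify it against your Amanda version:

      # Fragment of /etc/amanda/daily/amanda.conf
      define changer s3_vtapes {
          tpchanger "chg-multi:s3:my-backup-bucket/daily/slot-{01,02,03,04}"
          device-property "S3_ACCESS_KEY" "AKIA..."
          device-property "S3_SECRET_KEY" "..."
          device-property "S3_SSL" "YES"
      }
      tpchanger "s3_vtapes"

      # Driven from the backup user's crontab:
      # 45 0 * * 1-5   amcheck -m daily   # pre-flight check, mails problems
      # 15 1 * * 1-5   amdump daily       # run the night's dumps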

  • by enjar ( 249223 ) on Tuesday November 10, 2015 @10:43AM (#50900551) Homepage

    Set up home drives on file servers and back those up. Teach users that those are the only locations that are backed up. Set up the PCs to use that as the default home location. You can do this on Windows and Linux just fine. Invest in the server -- redundant power supplies, RAID arrays, failover, etc. You could even look at various open source NAS devices, or whatever works for your environment.

    Why?

    Backing up user PCs doesn't scale well and becomes a thankless task for some poor employee who has to keep up with broken backup clients. It's far easier to scale when you only have to keep up with the file servers: each server has some number of clients saving to it, and that's that many backup clients you don't need to deal with. This frees up IT staff for other, more useful tasks.

    It also allows you to replace end-user PCs with a simple re-image rather than trying to recover or fix anything. An end user calls and says their PC is going whacko; you pull a spare off the shelf and lay down a fresh install, show up, take the malfunctioning equipment away, and diagnose it on your own time while they get back to work. Since all the files are on the server, they can just get back to it rather than waiting on you to fix whatever might be going wrong.

    • by swb ( 14022 )

      I agree with this, but I can't tell you the number of times, even where it was reasonably well implemented, that it resulted in A User of Some Organizational Clout losing some data that wasn't saved where it belonged. Mostly this was ultimately consequence-free, but getting to that state often required some kind of extensive recovery.

      It's also iffy for mobile users, or at least was kind of iffy prior to the proliferation of cloud sync options.

      I kind of wish you could do some kind of automated desktop backup of the user's local files anyway, as a safety net.

      • by enjar ( 249223 )

        Fortunately we have buy-in from the top on this strategy, and they also eat their own dogfood. So people bitching about it can complain all the way to the CEO, and he will say "too damn bad, that's how we do things, and you were told about it on multiple occasions."

        For mobile users there are far better options available now than there ever used to be. For Windows mobile clients we implemented DirectAccess and some other stuff that automatically takes care of a lot of the syncing. We also keep all email on the server.

  • CrashPlan can be used entirely for free, with encrypted backups; the company makes its money on the cloud backup features. You can back machines up to a single machine or to multiple machines on your network that have CrashPlan installed on them. You can run it in a scheduled mode or do near-realtime (every 15 minutes?) backups.

  • As much as is feasible, store files on the servers you have already.

    I realize this may not be feasible if your "daytime bandwidth" or latency makes it impossible, but do it if you can.

    I'll leave it up to others who know more than I do to answer your original question about open-source, centrally-managed, business-grade (read: vendor-supported and hack-resistant) solutions.

    Oh, one more thing: this is a business. Unless you are going to dedicate a programming team to bug-fixing this and a security team to reviewing it, you will want something with real vendor support behind it.

  • Anyway, I use BackupPC [sourceforge.net] to back up user files from Windows machines on the company network. Works just fine. I tell my users it's "no guarantee", as they should store things on the network shares anyway.
  • Rsync is simple, lightweight, has been around forever, and gives you incredible power. Assuming that by "manage centrally from a console" you mean you have remote admin access to all the computers in scope, it's as simple as a cron job running your rsync script. You can trivially make several versions for different use cases (Linux vs. Windows) and only have to configure the setup once, in the cron job. After that, you only need to touch it if you make changes.

    Rsync can push deltas to any remote server you control; a minimal sketch below.
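
    A minimal sketch of a pull-style rsync backup with hardlinked daily snapshots via --link-dest; the host names, paths, and schedule are hypothetical:

      #!/bin/sh
      # Pull one client's home directories into a dated snapshot,
      # hardlinking unchanged files against yesterday's snapshot.
      SRC=user@ws01.example.com:/home/
      ROOT=/srv/backups/ws01
      TODAY="$ROOT/$(date +%F)"
      YESTERDAY="$ROOT/$(date -d yesterday +%F)"

      rsync -a --delete --link-dest="$YESTERDAY" "$SRC" "$TODAY"

      # Cron entry on the backup server:
      # 0 3 * * * /usr/local/bin/backup-ws01.sh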

    • Is there any free rsync SERVER capable of running under 64-bit Windows 7 Pro and working with any known Android rsync client over wifi (to an ext2-formatted hard drive mounted using ext2fsd)?

  • What about using ZFS and essentially foregoing traditional backups?

    I know that is heresy to traditional data-management methodology; but with ZFS's resilvering and anti-bit-rot self-healing capabilities, it would seem that, other than a fire or tornado hitting the server closet, or outright theft, ZFS totally answers the need for traditional backup. And if you combine that with incremental backup to an offsite data store (also ZFS?), sketched below, then how wonderful would that be?

    I admit my interest in ZFS FAR outstrips my hands-on experience with it.
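
    A minimal sketch of that snapshot-plus-incremental-send pattern; the pool, dataset, and host names are hypothetical:

      #!/bin/sh
      # Take today's snapshot, then replicate only the delta since
      # yesterday's snapshot to an off-site pool over SSH.
      TODAY=$(date +%F)
      YESTERDAY=$(date -d yesterday +%F)

      zfs snapshot tank/home@"$TODAY"
      zfs send -i tank/home@"$YESTERDAY" tank/home@"$TODAY" \
          | ssh backup.example.com zfs receive -F offsite/home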
  • rsync and a few scripts. Perl scripts.

    • I would extend this answer a little. At a small business with a mix of Windows and Linux boxes, I use BackupPC. It uses rsync, and does de-duplication and compression on the server side.
      • by rongten ( 756490 )

        BackupPC is exceptional for Linux, but for Windows and the infamous PST files it's better to use the client/server architecture of the BURP software.

        The same server can run both BackupPC and BURP, and you save the whole backup FS to tape every 6 months.

        You can use Bacula for that last step if you wish; we use a straight copy to LTO tapes.

  • Look, I know that "open source solution" is in the title. The low-hanging fruit is already camping out in the thread: Bacula, Clonezilla, and script/cron/rsync are the major solutions there.

    If the business is okay with "free, even for commercial use", Veeam Endpoint Backup [veeam.com] is excellent. It will either back up to a Samba share or a Veeam B&R if you have one in the environment somewhere. It's legit freeware, and works very well.

    Even if not for this particular case, it works well for laptops. It's the only freeware option of its kind I've found that works this well.

  • As usual.
    My company uses Bacula for all our server backups, and it works pretty well, once you beat the configuration into doing what you want.
    Some things about Bacula that I've noticed:
    1) Its scheduling is more than a little rigid. I'm not using it on desktop PCs for that reason (the PC pretty much needs to be there when Bacula wants it to be, or you miss that backup cycle; as near as I can tell, anyway).
    2) Trying to configure the retention times for Bacula is NOT for the faint of heart. Get someone to help you.

  • I'll just throw this out there: I was tasked with the same requirement of backing up people's desktops 6-7 years ago, and the solution I went with was some home-grown PowerShell scripts using the built-in VSS service on the workstations.

    I grab the bare minimum of files needed to rebuild a workstation and then dump the backups to the user's home directory on the server (which is then automatically backed up). Take a look at the scripts I wrote [thebutlerfamily.org]

  • I used Bacula to back up a company of ~25 users. It doesn't have all the bells and whistles of TSM, but once you figure out how it works, it's nice (and it's free). I ran the Bacula server on GNU/Linux (Red Hat, I think). There are clients for GNU/Linux, Windows, and Mac. Backups can also be encrypted.
  • Highly recommend UrBackup. It can do both file backups and system image backups. The backup server can run on either Linux or Windows. Client software for the PCs to be backed up is also available for both Linux and Windows. For Windows clients, system image backup is made using the Volume Shadow Copy service; I'm not sure whether system image backup is available for the Linux client. Overall it is a very powerful, yet easy to use, backup system, available at http://www.urbackup.org/ [urbackup.org]
  • The first thing I have to say to everyone who asks me to design a backup solution is: "What's your recovery solution? What are your recovery needs?" Then design your backups around that. Don't back up anything that won't be restored. You have to provide for both disaster recovery (total loss of a system) and operational recovery (file deletion, corruption, historical trails). DR for a PC is usually from a stock image; OR for a PC can be managed through much better methods than PC backup.

    The only thing left worth backing up on a PC, then, is the user data, and that's better kept on a server anyway.
