


Ask Slashdot: Open Source Back-Up Tool For Business? 118
New submitter xerkot writes: I am looking for a tool to make backups of PCs in a big company. We want to replace the tool we are using at the moment with a new one. The tool will be used to do backups of PCs (mainly Windows, and a few Linux), and we want to manage these backups centrally from a console, being able to automatize the backup process. The servers of the company are backed up with another tool, so they are out of scope. In the company we are being encouraged more and more to use open source software, so I would like to ask you: what are the best open source tools to do backups of PCs? Are they mature enough for a big company?
What exactly are you backing up? (Score:5, Informative)
What exactly are you backing up? Entire disk images? Or just user files?
If disk images, then something like Clonezilla, perhaps set up to boot from a TFTP server. Boot the machine via WOL, kick off the TFTP boot, and automatically dump the image out to a server, using the machine name or MAC address or something as a unique identifier.
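A minimal sketch of the wake-up step (the MAC address is a placeholder, and `xxd`/`socat` availability is an assumption; many distros instead ship a ready-made `wakeonlan` or `etherwake` utility):

```shell
#!/bin/sh
# Build a Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by the
# target MAC address repeated 16 times (102 bytes total).
# The MAC address below is a placeholder.
MAC="00:11:22:33:44:55"
hex=$(printf '%s' "$MAC" | tr -d ':-')
payload="ffffffffffff"
i=0
while [ $i -lt 16 ]; do payload="$payload$hex"; i=$((i + 1)); done
# Convert the hex string to raw bytes.
printf '%s' "$payload" | xxd -r -p > magic.pkt
# Send via UDP broadcast, e.g. with socat (assumed to be installed):
# socat - UDP-DATAGRAM:255.255.255.255:9,broadcast < magic.pkt
```

Once the machine wakes and PXE-boots, the Clonezilla image dump can run unattended.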
For user files only (i.e., My Documents or whatever), can you set up network-based home directories? And then just back up the server they live on.
Re: (Score:2)
Re: (Score:3)
For user files only (i.e., My Documents or whatever), can you set up network-based home directories? And then just back up the server they live on.
If you do this, you better have an insanely reliable network with at least gigabit speeds. Where I work we have all of our *nix boxes set up with the user home directories on NFS and when the network goes out, we have a lot of employees who can't do their jobs. Even a few hours of outage can be extremely costly. On top of that, no one who works with large files uses their home directory for anything due to the limited space and slow (100Mbps) network speeds. Instead everything is stored locally and back
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
we want to manage these backups centrally from a console
You don't want to do that; you only think you do. That needlessly creates a security problem by intentionally installing a remote-control vector. What you want is a central notification system, so that if a machine that is expected to back up does not, you find out about it and can send in the IT staff.
What i.r.id10t alludes to is also correct: figuring out how you can restore a machine to operation is step one. That determines what form the backups can be stored in, which, in turn, determines what backup tools you c
Re: (Score:1)
Never mix business and ideology (Score:2, Offtopic)
Just look for the best tool for the job, and don't worry about whether it's open source or not.
Re:Never mix business and ideology (Score:5, Interesting)
Re: (Score:1)
Because there's no such thing as abandoned OSS? If anything, you're more likely to see development stop and a project become orphan software with some OSS freeware. Also, things like operating system upgrades can cause OSS to suddenly stop functioning or start behaving erratically, often with little or no warning until the time comes when you need it. Often it is months before new versions are finally released to solve these problems. Backup software is not the area where your company should be trying to save a few dollars.
As long as the source code is still available, someone else can take over the project. I hate F/LOSS projects that shut down and have their source code removed from the Web; at least move it to GitHub and post a link to the repository, keeping only a redirector page active at the original domain with information about how to contact the now-former maintainer to take over the domain and development.
Re: (Score:1)
You can pay any vendor to support your open source solution. It's illegal for a third party to fix the bugs in your proprietary software.
Re: (Score:1)
You can pay any vendor to support your open source solution. It's illegal for a third party to fix the bugs in your proprietary software.
I would believe it is only illegal if the vendor then turns around and tries to sell the fixed code to another.
Oh, and if the company that wrote it is out of business, who then has the standing to sue? The government might (probably does), under Copyright laws; but won't. So who?
Re: (Score:1)
I see you've been living under a rock and never heard of these things called patent trolls, or SCO, or Oracle.
And how, pray tell, are any of those entities going to find out about your private little contract with a vendor to fix your particular instance of whatever code?
Re: (Score:3)
I see you've been living under a rock and never heard of these things called patent trolls, or SCO, or Oracle.
And how, pray tell, are any of those entities going to find out about your private little contract with a vendor to fix your particular instance of whatever code?
Software audits. Oracle and Microsoft are famous for going on-site and sifting through computers looking for non-compliance. Non-compliance doesn't only mean using unauthorized copies of software, it also means unauthorized use of authorized software. And if that isn't enough, the DMCA can be used on you for reverse-engineering a product in order to modify or augment it. It also tends to "void the warranty" on your legitimate uses.
Open Source isn't necessarily the Solution to Everything, but most OSS has a
Re: (Score:1)
Non-compliance doesn't only mean using unauthorized copies of software, it also means unauthorized use of authorized software.
Depends on what is considered "Unauthorized use".
the DMCA can be used on you for reverse-engineering a product in order to modify or augment it.
Show me ONE case of that ACTUALLY happening. NO Jury would Convict, if the "reverse-engineering" was simply for the private use of an end-user.
I agree that this MIGHT be technically illegal; but in a practical sense, it would NEVER fly in Court; nor would you find a Prosecutor with enough time on their hands to mess with it.
The Feds will typically only get involved if the "damages" exceed $100k. Most software that isn't Government-Contracted or Oracle-bas
Re: (Score:2)
IANAL, but I'm sure that someone could enumerate specific cases where things like that have been prosecuted. Consider what the RIAA and MPAA have done to people.
You're assuming a sane and reasonable legal system, and a lot of us doubt that such a thing exists. If there's a law, someone will abuse it.
Re:Never mix business and ideology (Score:4, Insightful)
There is as much abandoned OSS as abandoned closed commercial software.
The entire point of OSS licenses is that all software will be abandoned at some point.
OSS guarantees that you have a fighting chance to keep it working yourself, or at least migrate.
Re: (Score:1)
Really? "as much"? Wasn't there a /. story two days ago about an OSS dev not interested in supporting his own app unless he got paid?
There's got to be way more OSS abandonware than closed source, because the proprietary people get paid for their work and therefore work hard to make sure the money keeps flowing in. Whereas the OSS guy quits after he's satisfied scratching his itch and there's nothing new left to explore/clone.
Re: (Score:3)
No, that link was about a company throwing a fit and demanding that their bug be solved *immediately*
Re: (Score:2)
FOSS didn't start until the mid-'80s, and Linux (which gave FOSS real legs) wasn't a thing until the early '90s. There was so much abandoned commercial software by the time FOSS became popular that FOSS will never be able to catch up. There are literally entire companies and classes of software that are just gone, lost to humanity entirely because copyright prevented the retention of the code.
Have you ever even looked at an abandonware site?
Re: (Score:3)
Really? "as much"? Wasn't there a /. story two days ago about an OSS dev not interested in supporting his own app unless he got paid?
There's got to be way more OSS abandonware than closed source, because the proprietary people get paid for their work and therefore work hard to make sure the money keeps flowing in. Whereas the OSS guy quits after he's satisfied scratching his itch and there's nothing new left to explore/clone.
I doubt it. I have shelves full of commercial "abandonware". Mostly just in case I need to fire up an antique system for the sole purpose of dealing with some dust-ridden data. At least with OSS, you can usually get the source and do something with it. If a commercial product is truly abandoned and the vendor is defunct, you've got to reverse-engineer it and hope patent/copyright laws aren't going to get you. If it's something worse - like an OS/2 version of a Norton/Symantec product, you can basically give
amanda (Score:2, Informative)
You can use Amanda in case you want to back up files. Amanda is production grade and has clients for Windows and Linux, and
possibly Unix-likes. -- mallah
Re:amanda (Score:4, Informative)
Amanda is great if you're backing up only Linux clients, but the Windows Amanda client is a total abomination. It hasn't been updated in over 2 years (don't believe the version numbering - check the timestamp of setup.exe), pointlessly uses a MySQL DB on the client side (ridiculous!) which neither the Linux client nor the server uses, regularly times out, regularly crashes, produces byzantine error codes without any description/documentation of them, is much slower than the equivalent Linux client, and produces ZIP64 backups that are pretty well impossible to extract on Linux (which is annoying, because the Amanda server side is Linux-only).
We have Ultrium tape drives with multi-slot autoloaders/barcodes, so there are very few Open Source backup solutions that can handle this (including tape-spanning if needed) out of the box. Bacula was another potential solution, but it's horrific to set up (makes the dreadful Oracle DB install process look like a breeze) - just reading the Bacula install docs brings me out in a cold sweat :-) I guess business-level backups just aren't sexy enough to warrant a decent Open Source solution...
Don't cut corners (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
I was once at a crossroads, choosing between tools like Clonezilla and Bacula for small-business purposes. Bottom line is they add a lot of complexity for little to no flexibility.
Talk about not having a clue!
Bacula is highly, highly customizable: for example, I wrote a shell program which connected Bacula with Oracle RMAN, and now Bacula jobs kick off RMAN database backups every day, then siphon the RMAN backups off to tape. And Bacula controls the tape library and the barcode reader through other external programs. Such a thing would not be possible were Bacula not designed to be so flexible as to interface with everything and anything, including emulating virtual tapes as entire
You and what army? (Score:1)
While you do present very fair arguments, you failed, like me, to address important issues of the OP. Not to mention the inconsistency in your examples.
For instance, you argue for customization with an example based on triggering a proprietary recovery system. So you can trigger a remote process with Bacula. I have a one-liner shell command for that, nice job. Especially with a proprietary system, which the OP specifically excluded in the title (Open. Source.). And from the clues I got, Bacula
Re: (Score:2)
Re: (Score:2)
There is no such thing as a free sandwich. (Score:1)
Bacula (Score:4, Interesting)
I use Bacula for my home computer; it feels powerful enough for a small office, and is very versatile.
It has three main components: a client daemon that you install on the computers you want to back up, a storage daemon that you install on the computer that will write the backup files and/or tapes, and a director daemon which controls the backups. The director and storage daemons only run on unix-like operating systems (BSD, Linux, Solaris) but the client daemon has also been built for MS-Windows.
http://blog.bacula.org/ [bacula.org]
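As a rough illustration of how the director learns about each machine, a Client resource goes into bacula-dir.conf, one per workstation. This is a hedged sketch, not a working configuration; every name, the address, and the password here are placeholders:

```
# Hypothetical Client resource in bacula-dir.conf; all values are
# placeholders for illustration only.
Client {
  Name = workstation1-fd
  Address = workstation1.example.com
  FDPort = 9102
  Catalog = MyCatalog
  Password = "changeme"        # must match the client daemon's own config
  File Retention = 30 days
  Job Retention = 3 months
}
```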
Re: (Score:1)
I regularly use Bacula to back up a network of about 20 workstations and a couple of servers, plus a remote one (connected through a VPN). It's fast and reliable, if I really have to nitpick there are some quirks, though they depend mostly on lack of disk space for the backups (this is a non-profit with a tight budget) and lack of time on my side for fine-tuning the configuration. I'm really satisfied, since 2008 Bacula has saved our ass a number of times with minimal effort and maintenance.
OwnCloud, and back up that server (Score:5, Interesting)
I worked at a scientific institute, and they simply installed OwnCloud everywhere. It's got a client for most platforms, syncs to a server, and allows you to back up the server in the usual fashion.
It worked so well, that when I started doing consulting (at the client site), I got my own VPS with Debian, and installed OwnCloud server on that. Then installed the client on my private laptop and the laptop that I got from the client. Works beautifully, because communication is over HTTPS. Company firewalls don't block that. I tried other things like BitTorrent Sync, but these use special ports.
Re: (Score:2)
BackupPC (Score:1)
They use this around here, seem to like it. Seems apt to your situation.
http://backuppc.sourceforge.net/
Re: (Score:2)
This has an additional problem, the Windows backups aren't encrypted. Not good if you have sensitive information.
<plug>Throw rsyncrypto [lingnu.com] into the mix</plug>
This has the downside of being a preprocessing step (i.e. - you need local storage for the encrypted form of the files), but solves the encryption problem better than your suggestion (which encrypts the transit, but not the actual backup).
Shachar
Awesome backup solution. (Score:1)
I have been using Fog and OwnCloud.
Fog (Free open ghost) for desktop backup and reimaging and OwnCloud for data backup.
http://sourceforge.net/projects/freeghost/
Free Opensource Ghost (Score:2)
FOG [fogproject.org] can back up disk images and do inventory and scheduled re-images and keep images for things like AV scan and whatnot.
We mostly just use it to wipe machines back to clean slates ourselves but it's supposed to have quite a rich feature set. It does also have some bugs though.
I set one up at my school that has re-imaged 1000+ student netbooks 7 at a time, the school district also has all of our images centrally located and available throughout the district for all models of laptops/desktops that we use and
Amanda and S3 (Score:2, Informative)
Use AMANDA [sourceforge.net] to do the back-ups. Use Amazon's S3 to actually store the dumps compressed and encrypted at the source — AMANDA has had the S3 back-end [zmanda.com] for a while. No, you do not need "Amanda Enterprise".
Having set just such a thing up at my last job, I'd be happy to help you out for a regular consulting fee. It should not take more than a week or two, even for a large organization.
Don't back up end user PC's at all. (Score:5, Insightful)
Set up home drives on file servers and back those up. Teach users that those are the only locations that are backed up. Set up the PCs to use that as the default home location. You can do this on Windows and Linux just fine. Invest in the server -- redundant power supplies, RAID arrays, failover, etc. You could even look at various open source NAS devices, or whatever works for your environment.
Why?
Backing up user PCs doesn't scale well and becomes a thankless task for some poor employee who has to keep up with broken backup clients. It's far easier to scale when you only have to keep up with the file servers. You have some number of clients saving to each server, but that's that many backup clients you don't need to deal with. This frees up IT staff for other, more useful tasks.
It also allows you to replace end-user PCs with a simple re-image rather than trying to recover or fix anything. End-user calls and says their PC is going whacko, you pull a spare off the shelf and lay down a fresh install. Show up, take the malfunctioning equipment away and diagnose it on your time, while they get back to work. Since all the files are on the server they can just get back to it rather than waiting on you to try and fix whatever might be going wrong.
Re: (Score:2)
I agree with this, but I can't tell you the number of times, even where reasonably well implemented, it resulted in A User of Some Organizational Clout losing some data that wasn't saved where it belonged. Mostly this was ultimately consequence-free, but getting to that state often required some kind of extensive recovery.
It's also iffy for mobile users, or at least was kind of iffy prior to the proliferation of cloud sync options.
I kind of wish you could do some kind of automated desktop backup of the user
Re: (Score:2)
Fortunately we have buy-in from the top on this strategy, and they also eat their own dogfood. So people bitching about it can complain all the way to the CEO and he will say "too damn bad, that's how we do things, and you were told about it on multiple occasions"
For mobile users there are far better options available now than there ever used to be. For Windows mobile clients we implemented DirectAccess and some other stuff that automatically takes care of a lot of the syncing stuff. We also keep all email
CrashPlan (Score:2)
It can be used entirely for free, with encrypted backups. CrashPlan makes money from its cloud backup features. You can back machines up to a single machine or to multiple machines on your network that have CrashPlan installed on them. You can run on a schedule or do near-realtime (every 15 minutes?) backups.
Re: (Score:2)
Sorry, not open source, but free. :(
First off, store most data on servers (Score:1)
As much as is feasible, store files on the servers you have already.
I realize this may not be feasible if your "daytime bandwidth" or latency makes it impossible, but do it if you can.
I'll leave it up to others who know more than I do to answer your original question about open-source, centrally-managed, business-grade (read: vendor-supported and hack-resistant) solutions.
Oh, one more thing: this is a business. Unless you are going to dedicate a programming team to bug-fixing this and a security team to r
Did you even do any basic research? (Score:1)
Rsync Script + Cron job (Score:2)
Rsync is simple, lightweight, has been around forever, and gives you incredible power. Assuming by "manage centrally from a console" you mean that you have remote admin access to all the computers in the scope, it's as simple as a cron job running your Rsync script. You can trivially make several versions for different use cases (Linux vs. PC) and only have to configure the setup once in the cron job. After that, you only need to touch it if you make changes.
Rsync can push deltas to any remote server you
Re: (Score:2)
Is there any free rsync SERVER capable of running under 64-bit Windows 7 Pro and working with any known Android rsync client over wifi (to an ext2-formatted hard drive mounted using ext2fsd)?
What about ZFS? (Score:1)
I know that is heresy to traditional data-management methodology; but with ZFS' resilvering and anti-bit-rot self-healing capabilities, it would seem that, other than a fire or tornado hitting the server closet, or outright theft, ZFS totally answers the need for traditional backup. And if you combine that with incremental backup to an offsite data store (also ZFS?), then how wonderful would that be?
I admit my interest in ZFS FAR ou
Re: (Score:1)
ZFS != backups, just like RAID != backups. It provides no protection against ransomware or someone doing an "rm -rf /".
[snip]
You basically have two choices: An enterprise level backup utility, or cobble together a solution and pray it works. If budget prevents a proven solution, I'll go with the amateur night stuff... but I'll make sure to have at least two different ways of backing up machines, so when (not if) one fails, I still am OK... and I also run test restores on a monthly basis just to check for hidden corruption of stuff.
I wholeheartedly agree with your statement that ZFS != backups (and even more so with RAID, in my bitter experience!)
However, I worry (probably too much) about bit-rot, and the problem of overwriting a good Backup with a bit-rotted Original. I was hoping that ZFS would at least help prevent the bit-rot from spoiling the Original.
I also agree with you that it's NOT the "Backup" that's critical, it's the RESTORE. That's why I won't mess with ANY Backup software that creates Proprietary backup "packages".
rsync? (Score:2)
rsync and a few scripts. Perl scripts.
Re: (Score:1)
Re: (Score:1)
BackupPC is exceptional for Linux, but for Windows and the infamous PST files, it's better to use the client/server architecture of the BURP software.
Run BackupPC and BURP on the same server, and save the whole backup filesystem to tape every 6 months.
You can use bacula for the last step if you so wish, we use straight copy to lto tapes.
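The straight-copy tape step can be as plain as tar against the drive's device node. This is a sketch; the backup pool path and the tape device name are assumptions about the local layout:

```shell
#!/bin/sh
# Archive the whole backup filesystem to an LTO drive every 6 months
# (run from cron or by hand). /var/lib/backuppc and /dev/st0 are
# placeholders for the local backup pool and tape device.
tar -cf /dev/st0 /var/lib/backuppc
# mt -f /dev/st0 offline   # rewind and eject once the archive is written
```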
A non-open-source solution... (Score:2)
Look, I know that "open source solution" is in the title. The low hanging fruit is already camping out in the thread - Bacula, Clonezilla, and script/cron/rsync are the major solutions there.
If the business is okay with "free, even for commercial use", Veeam Endpoint Backup [veeam.com] is excellent. It will either back up to a Samba share or a Veeam B&R if you have one in the environment somewhere. It's legit freeware, and works very well.
Even if not for this particular case, it works well for laptops. It's the onl
It Depends... (Score:2)
As usual.
My company uses Bacula for all our server backups, and it works pretty well, once you beat the configuration into doing what you want.
Some things about Bacula that I've noticed:
1) Its scheduling is more than a little rigid. I'm not using it on desktop PCs for that reason (the PC pretty much needs to be there when Bacula wants it to be, or you miss that backup cycle. As near as I can tell, anyway).
2) Trying to configure the retention times for Bacula is NOT for the faint of heart. Get someone to help
Powershell (Score:2)
I'll just throw this out there: I was tasked with the same requirement of backing up people's desktops 6-7 years ago and the solution that I went with was some home-grown Powershell scripts and using the built-in VSS service on the workstations.
I grab the bare necessary files to rebuild a workstation and then dump the backups to the user's home directory on the server (which is then automatically backed up). Take a look at the scripts I wrote [thebutlerfamily.org]
Bacula (Score:1)
UrBackup can do both files backup and image backup (Score:1)
How big is "big"? (Score:2)
The first thing I have to say to everyone who asks me to design a backup solution is "what's your recovery solution? what are your recovery needs?". Then design your backups around that. Don't back up anything that won't be restored. You have to protect against both disaster recovery (loss of total system) and operational recovery (file deletion, corruption, historical trails). DR for a PC is usually from a stock image; OR for a PC can be managed through much better methods than PC backup.
The only thin
Re: (Score:2)
I think it is about time they 'automatized' your job since it doesn't sound like you know what the fuck you are doing....
Why is this down-modded? It's exactly what I was going to say. If I ran a "big company" I wouldn't be expecting my IT people to be asking for advice on a public internet forum.
And "automatized" really isn't a word.
Re: (Score:1)
Down-modded probably because it makes too many assumptions. The question may not even be asked by "IT people". Abusive too, so fair game for -1 IMHO
Re: (Score:3)
Seconding this. In the companies I work with, this is the solution we've put into place. Windows PCs use mapped network drives for personal folders and shared folders to a server. The server runs ZFS as its file system. Simple cron scripts on the server itself automate the process of creating snapshots and doing send/receive with other servers both inside and outside the building. These additional machines also store a certain number of snapshots, so we can recover previous versions of files easily. ZFS + S
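A bare-bones sketch of such a cron script; the dataset and host names ("tank/home", "offsite") are placeholders, and a real script also needs to track the previous snapshot for incremental sends and prune old ones:

```shell
#!/bin/sh
# snapshot.sh -- take a timestamped ZFS snapshot and replicate it off-site.
# "tank/home" and the "offsite" host are placeholder assumptions.
DATASET="tank/home"
NOW=$(date +%Y%m%d-%H%M)

# Guarded so the sketch degrades to a dry run where ZFS isn't available.
run() { if command -v zfs >/dev/null 2>&1; then "$@"; else echo "DRY RUN: $*"; fi; }

run zfs snapshot "${DATASET}@${NOW}"
# Incremental send against the previous snapshot (PREV bookkeeping and
# snapshot pruning are omitted here for brevity):
# run zfs send -i "${DATASET}@${PREV}" "${DATASET}@${NOW}" | ssh offsite zfs receive -F backup/home
```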