Laptop/Server Data Synchronization? 305

gbr writes "I've been trying to automatically synchronize data between a laptop and a server. When the laptop is connected to the network, I want all writes to automatically propagate across to the server. When the laptop is disconnected I want the laptop user to continue working with the local data. When the laptop is reconnected, I want the data to automatically re-sync. The issue is, the data on the server may have changed as well, which needs to propagate back to the laptop. The data doesn't contain anything too special, no database tables etc. It does contain binary data such as executables and word processing documents. I've looked at ChironFS, Unison file sync, and drbd. ChironFS needs a manual rebuild if a connection fails, and the user needs to know which machine contains the correct data. Unison requires the user to initiate the synchronization process manually every time, and drbd is just not meant for the job at hand. How do you automatically, and invisibly to the user (except in the case of conflicts), synchronize between a laptop and a server?"
This discussion has been archived. No new comments can be posted.

  • rsync (Score:5, Informative)

    by jshriverWVU ( 810740 ) on Monday August 27, 2007 @11:48PM (#20380083)
    I do this often and rsync is wonderful for such a task.
    • Re:rsync (Score:5, Informative)

      by pixel.jonah ( 182967 ) on Tuesday August 28, 2007 @12:02AM (#20380177)
      I'd second rsync.

      I'd also take a look at Microsoft's SyncToy [] if you're on win***s.
    • by wpanderson ( 67273 ) on Tuesday August 28, 2007 @12:07AM (#20380203)

      My Briefcase in Windows 95. It even has a cute ickle briefcase icon.

      Somewhat seriously, Offline Files in Win2K/XP is something I've yet to see done well on any other OS.

      • by arivanov ( 12034 ) on Tuesday August 28, 2007 @02:07AM (#20380891) Homepage
        Offline Files most likely derives from Coda, which was designed to work with 100MB at most. It seems to inherit all of Coda's problems when data volumes get big. I have had to support an environment where people casually offlined 3-4GB documentation trees, and it was falling over on a regular basis.

        Further to this, Offline Files has a number of fairly fundamental bugs in the actual implementation. It records both the IP and the name of the server somewhere when doing the offlining. As a result, if the name (but not the drive) or the IP changes, your entire offline tree goes south and stays offline. You can neither delete it nor reconnect it; the only ways of dealing with this are network surgery (aliasing IP addresses) until you reconnect, or rebuilding the affected laptops from scratch.
    • Re: (Score:2, Informative)

      I agree, subversion & scripts sounds like a good solution. Dare I suggest... Lotus Notes/Domino? Seriously, the Notes databases can be set up to do what you're looking for. It worked pretty well at my last job, as long as everyone played nice.
      • Re:rsync (Score:5, Insightful)

        by cduffy ( 652 ) <> on Tuesday August 28, 2007 @01:40AM (#20380789)
        subversion is intended for a case where you have a single data store. A modern distributed SCM -- designed for disconnected use -- would make more sense.

        Personally, I play with bzr most frequently; it has a nifty Python interface (and an extensive plugin architecture) which makes it quite conducive to local scriptage. (As an example -- I have a local, filesystem-backed set of CA scripts which use bzr for transactional semantics; if a method is called which throws an exception, all filesystem changes are automatically rolled back; if a method succeeds, a commit is done to record the operation and [effectively] set a rollback point). The separation between working tree and repository is optional (by default, all working trees are also repositories -- much like BitKeeper in that respect), which makes it very handy for situations like this where you don't necessarily *have* a separate, central location which all nodes can always communicate with, and where the different trees are allowed and expected to temporarily diverge.
        • I agree - subversion isn't really the right tool for syncing data in two places. Having said that, if you want to keep revision history and sync it, then the latest Subversion might be right up your alley. Included is an svnsync tool that is intended to sync a "live" repo (one that you use regularly for commits) with a backup repo (which exists solely as a copy).

          As far as automating backup, assuming you're talking about linux, I would say that the best approach is to simply write a short script that pings f

    • Re: (Score:2, Informative)

      by eos_buddy ( 1148457 )
      If the laptop and the server you use are Linux/Unix, rsync is definitely the answer: it's robust, and after the first rsync the process should be quick enough (assuming you log on often). I wrote a small script recently to sync my Firefox bookmarks. Don't know whether it might be helpful to you, but here is the link: []
    • Re:rsync (Score:5, Informative)

      by Roarkk ( 303058 ) * on Tuesday August 28, 2007 @12:19AM (#20380311) Homepage Journal
      rsync is part of the answer. If you're looking for a way to have multiple, incremental backups of laptops with unpredictable patterns of connecting to the network, BackupPC [] is the way to go.

      BackupPC is a high-performance, enterprise-grade system for backing up Linux and WinXX PCs and laptops to a server's disk. BackupPC is highly configurable and easy to install and maintain. Given the ever decreasing cost of disks and raid systems, it is now practical and cost effective to backup a large number of machines onto a server's local disk or network storage. This is what BackupPC does. For some sites, this might be the complete backup solution. For other sites, additional permanent archives could be created by periodically backing up the server to tape. A variety of Open Source systems are available for doing backup to tape. BackupPC is written in Perl and extracts backup data via SMB using Samba, tar over ssh/rsh/nfs, or rsync. It is robust, reliable, well documented and freely available as Open Source on SourceForge.

      By using pooling and compression, one client of mine is using BackupPC to backup over 1TB of data distributed among over 100 laptops to a 200GB filesystem on a central server. The network is polled every hour, and any system that hasn't been backed up in the last 24 hours is queued. Beautiful system.
      • by dbIII ( 701233 )
        It looks good, but it will fall prey to the utterly stupid Windows file locking problems that even NTbackup occasionally falls foul of. In other words - good for most things, but if email is stored locally instead of elsewhere, it is almost a certainty that you will never back it up. If the user keeps their email client open all the time, as you would expect, then the file is locked. One of the files that is very important to the user, the registry, is not going to be backed up. For this reason you really need
        • Re: (Score:2, Informative)

          by EvanED ( 569694 )
          The information on how to use this hack is not publicly available.

          Um... what?

          You mean besides this diagram [] of the steps you should follow when making a backup (and a similar one for restore), and the MSDN documentation [] for the VSS.
          • by dbIII ( 701233 )
            Looks like I was wrong. Does anyone know of an open source application that uses this? Cross platform solutions like bacula and amanda don't as yet.
        • Re: (Score:3, Interesting)

          by arivanov ( 12034 )
          There is no point implementing laptop backups before implementing a no-quota IMAP mail server. Exchange in its native mode does not count due to a number of corruption bugs which hit you once your inbox exceeds 2G (it should be OK as an IMAP server, bugs are mostly in Outlook).
          As far as the user is concerned his primary concern for laptop data loss is email. So you have to back it up as a part of any backup solution. If you are storing email locally on the laptop and backing it up the backup will nearly alw
        • by JayAEU ( 33022 )
          That's right, but at least for email stuck in an Outlook .pst file, you can install pfbackup.exe, which regularly copies the real thing into a backup copy. This offline copy can then easily be backed up by BackupPC.

          The same goes for many other files that are locked by running services. Just pop a pre- and post-backup script into BackupPC that handles stopping and starting said services, or have the services (i.e. an RDBMS) do a nightly dump to a file.

          The beauty of BackupPC is that it will even only trans
      • It says BackupPC is *nix only. Can it be used with Cygwin on Windows?
        • by JayAEU ( 33022 )
          It probably can, but I wouldn't recommend it. Run BackupPC on Linux, which can back up Windows and Mac clients just fine using SMB/CIFS and/or rsync.
    • Re: (Score:3, Funny)

      by Anonymous Coward
      I keep trying to install the n'sync CD but it just asks me whether I want to play the music or not.
    • Re:rsync (Score:5, Informative)

      by frenetic3 ( 166950 ) <houston@al u m . m i> on Tuesday August 28, 2007 @01:02AM (#20380597) Homepage Journal
      Take a look at Dropbox ( []; screencast at []) if you want something that's rsync-like but integrated into Windows and OS X. It's in beta (and full disclosure: I co-founded the company) but was designed precisely because there's nothing out there that does this well and is easy to use.
      • Re:rsync (Score:5, Interesting)

        by frenetic3 ( 166950 ) <houston@al u m . m i> on Tuesday August 28, 2007 @03:27AM (#20381279) Homepage Journal
        Apologies if it's in bad taste to reply to my own post, especially because it's about the product I'm working on, but here are some of Dropbox's differences/improvements over what people typically hack together themselves:

        - syncs continuously/watches the FS for file changes (no cron jobs needed -- things usually sync as quickly as they can be sent)
        - does binary diffing and only sends deltas (compressed & over SSL)
        - transparently archives past versions of all files (i.e. undelete/infinite undo)
        - syncs across any number of machines
        - lets you get to your files from the web
        - some more info @ -list/ []

        We made it after hacking together our own rsync-based abominations and getting really annoyed that no one had solved this genre of problems in a way that normal people could use.

        Okay, I can stop shilling now. I was just excited that other people run into these problems.
      • Re: (Score:3, Interesting)

        This looks quite similar to Novell's iFolder but with you running the server yourselves instead of having your users set up an iFolder server. Last I used iFolder was in the 1.x or 2.x days and it frankly wasn't anywhere near the polished product you have here. Now it seems that iFolder 3.x [] is open source and looks a lot more polished.

        Still, I think you surely have a great service market here even though the polished front-end app seems to now be done open source. Best of luck to you on your new ventur

    • Re:rsync (Score:4, Informative)

      by syphax ( 189065 ) on Tuesday August 28, 2007 @01:23AM (#20380725) Journal

      Unison [] is 2-way rsync. But as the poster noted, unison/rsync doesn't easily support automatic syncing (that I know of) - you have to kick it off and then deal with any conflicts, etc., manually. I think the poster is looking for ideas for at least automating Unison/rsync (BTW, does rsync support 2-way updating, as the poster explicitly mentions?).

      As someone who relies on running unison manually (too lazy to figure out how to automate on my Windows box), I'd be interested in relevant solutions.
      • Unison, Rsync & NTP (Score:4, Informative)

        by Colin Smith ( 2679 ) on Tuesday August 28, 2007 @04:22AM (#20381497)
        Unison can be scripted, added to a login script. As can rsync on windows. Alternatively you can add a polling batch file which wakes up every so often and checks to see if the server lives. (Yes, even on Windows)

        Rsync can sync in both directions, but you decide that one of the sides is the master and sync that one first; in the case of conflicts, the master rules. It isn't possible to choose on a file-by-file basis at sync time as you can with Unison.

        Oh, and NTP is absolutely vital when doing any synchronisation.

        Basically. Either you do it manually and manage conflicts at sync time, or you do it automatically and define one of the sides as a master in the case of conflict. There's really no way round this, software just isn't sophisticated enough to decide what you're thinking.

        The truth is that filesystem syncing isn't ideal for a very dynamically updated file system. It is best used on fairly static filesystems or one way syncing. Documentation, backups and the like.
    • Re: (Score:3, Informative)

      by dusty123 ( 538507 )
      I don't see how rsync solves this problem:

      AFAIK, rsync is only one-way, meaning that it overwrites and eventually deletes files. Have a try:

      mkdir d1 d2 # Create two directories (e.g. one on server, one on laptop)
      touch d1/foo.txt # Create an empty file
      rsync -r d1/ d2/ # Sync the directories
      echo "123" > d2/foo.txt # Now modify the file on d2 (e.g. laptop)
      rsync -r d1/ d2/ # Sync again
      cat d2/foo.txt # Ooops - foo.txt is empty!

      One possible way I experimented with is the following:

      - Integrate a rsync server -
      • Re: (Score:3, Informative)

        by Znork ( 31774 )
        "AFAIK, rsync is only one-way, meaning that it overwrites and eventually deletes files."

        rsync has a whole bunch of options that let you decide behaviour. --update will make it skip files that have newer modify times on the destination, or you could use --backup to have it make a copy of files instead of overwriting them, etc. Mix and match and run two one-way syncs after each other and you can get close in behaviour to a real two-way sync.

        "I thought about Coda but it seems to be far too complicated and unreliable and I don't
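For instance, re-running the six-line demo from the grandparent comment with -u (--update) added, plus -a so mtimes are preserved, keeps the newer "laptop" copy (scratch directory, so nothing real is touched):

```shell
#!/bin/sh
# The earlier demo again, but with -a and -u; the old mtime on the
# "server" copy is set explicitly so the later edit is clearly newer.
cd "$(mktemp -d)"
mkdir d1 d2
touch -d '2020-01-01' d1/foo.txt    # empty "server" file, old mtime
rsync -a d1/ d2/                    # first sync (-a preserves mtimes)
echo "123" > d2/foo.txt             # later edit on the "laptop"
rsync -au d1/ d2/                   # -u skips files newer on the receiver
cat d2/foo.txt                      # prints 123; the edit survives
```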
  • Subversion (Score:2, Informative)

    by hahafaha ( 844574 ) *
    How about Subversion? You'll have to write some small scripts to make it autoconnect and so on, but it could work if you set up SSH keys.
    • Re:Subversion (Score:5, Interesting)

      by Iron Condor ( 964856 ) on Tuesday August 28, 2007 @12:08AM (#20380219)

      At the risk of saying something stupid or blasphemous: why offer something that requires "writing some scripts"?

      If the OP wanted to "write some scripts" s/h/it could have done all the necessary work with a couple of foreach...cp...end loops. Or, hey, rsync.

      I suspect the OP is wondering whether there isn't something out there that "just kinda works" and only needs intervention in case of a conflict.

      Knowing well that this will definitely be considered blasphemy: I've been using Windows' "briefcase" system since Win98. It does "kinda work". Most of the time. And requires work when there's a conflict. Which appears to be what the OP is looking for. Given that the OP doesn't seem to want to just go that route, the question appears pertinent: what is s/h/it looking for that Mr. Gates' briefcases can't/won't do...

  • iFolder? (Score:4, Informative)

    by belly69 ( 1114799 ) on Monday August 27, 2007 @11:53PM (#20380113) Homepage
    That sounds exactly like what Novell's iFolder is made for: []
    • Re: (Score:2, Informative)

      by Anonymous Coward
      Agreed. I work for a little company with a few hundred on-the-road consultants who keep their lives (gigabytes of data) on their laptops and on desktops at work. The workstations are on when they are there and sometimes even when they're gone, and the laptops are on sporadically. When logged in, everything synchronizes once per minute. If changes happen on the laptop they go to the server and, if the workstation is on, they also go to the workstation. The synchronization is bidirectional and happens with the new
    • Re: (Score:2, Informative)

      by freckledp ( 1126451 )
      I used to work for Novell, traveling two or three times a month. My personal office had several computers and I had multiple laptops that traveled with me. Using either Windows or Linux clients of iFolder, I was able to easily sync files from one machine to another, always ensuring that I had the latest copy of whatever file I needed. It sounds like exactly what you need.

      One possible problem: you have to store the information in a folder (which you specify). Only the data in that folder is synced.


      • Re: (Score:3, Informative)

        by hendersj ( 720767 )
        iFolder3 lets you specify whatever folder you want - I sync about 5 or 6 folders and share them with different people in my department.

        (And if you're who I believe you are (CC), hey! Drop me a line...)
    • by Myself ( 57572 )
      If iFolder were mature, and if the server ran on anything sane or common, I'd agree with you. But for the moment, setting up an OpenSUSE VM on my XP box just to run one piece of software isn't what I call fun. It's a project worth watching, certainly.
  • Unison (Score:2, Informative)

    by graphicsguy ( 710710 )
    I use unison. Why would you need to run it manually every time? It can be run in batch mode. I am mostly using it for live backups these days rather than true bidirectional synchronization, so I could really use rsync and some scripts, but I've gotten pretty comfortable with unison.
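A batch run can be driven from a Unison profile, so nothing has to be typed each time. A sketch, assuming Unison is installed; the profile name, paths, and hostname are all hypothetical:

```shell
#!/bin/sh
# Hypothetical ~/.unison/work.prf: with batch = true, unison
# propagates all non-conflicting changes without prompting and
# leaves real conflicts alone (noted in the log) for later review.
mkdir -p ~/.unison
cat > ~/.unison/work.prf <<'EOF'
root = /home/me/work
root = ssh://me@server//srv/work
batch = true
log = true
EOF

# Then a cron job or network-up hook can simply run:
#   unison work
```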
  • my take (Score:5, Insightful)

    by TheRealMindChild ( 743925 ) on Monday August 27, 2007 @11:58PM (#20380155) Homepage Journal
    Man... You are late to the party. People have been struggling with this since the beginning of time (or so it seems). Especially database apps, where they need to work in "detached mode".

    I can't give you a flat-out solution, because all situations are different. But I can pass on a bit of wisdom. The most important thing for you to do is create business rules for your synchronization. If the data on the server has changed and you made changes offline, who gets priority? A file will fall into one of three categories: client changes get priority, server changes get priority, or merge the files. I would stay away from the last one. If you want to keep things simple, I'd go for the "server changes get priority" approach. In short, if you took an "online" file "offline" and came back, and the server copy has changed since, your offline edits are abandoned. This way, heavily edited files have a shorter "check out window" (even if you don't use a checkout system), and the person taking a file offline is forced to coordinate with everyone else who may edit it.
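One way to sketch the "server changes get priority" rule is with a baseline snapshot taken at the last sync. Plain cp between local directories here; the directory names are hypothetical, the trees are assumed flat with no spaces in filenames, and files that are new on the laptop would need a second pass (omitted):

```shell
#!/bin/sh
# server_wins_sync BASE SERVER LOCAL
# BASE holds a snapshot from the last sync. If the server copy has
# changed since then, the server wins and offline edits are
# abandoned; if only the laptop copy changed, it is pushed up.
server_wins_sync() {
    BASE=$1; SERVER=$2; LOCAL=$3
    for f in $(cd "$SERVER" && find . -type f); do
        if ! cmp -s "$SERVER/$f" "$BASE/$f"; then
            cp "$SERVER/$f" "$LOCAL/$f"     # server changed: server wins
        elif ! cmp -s "$LOCAL/$f" "$BASE/$f"; then
            cp "$LOCAL/$f" "$SERVER/$f"     # only laptop changed: push
        fi
    done
    rm -rf "$BASE" && cp -r "$SERVER" "$BASE"   # refresh the baseline
}
```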
  • common problem (Score:2, Insightful)

    by ILuvRamen ( 1026668 )

    When the laptop is reconnected, I want the data to automatically re-sync. The issue is, the data on the server may have changed as well, which needs to propagate back to the laptop.

    And there you have the problem with synchronization. There's no mind reader program (yet) so sorry, but you're going to have to make up your mind about how to handle it when the server version has changed too. Either find a way to merge the files or start making decisions about when they can get modified (i.e. a checkout system) or i

    • by Fex303 ( 557896 )

      There's no mind reader program (yet) so sorry but you're going to have to make up your mind about how to handle it when the server version changed too.

      Since the original questions said that the sync should work "invisibly to the user (except in the case of conflicts)", I'd guess they have given that some consideration. Besides, I can easily imagine a piece of software that is configurable to do any of the various options you suggested.

      Find some freeware or open source one that does just what you need and

    • by LS ( 57954 )
      Thanks, Captain Obvious! If you read the entire submission, he mentions in parentheses "(except in the case of conflicts)". He is aware of the problem. And thanks for the tip on using open source or free software over commercial software. That's a new one over here at Slashdot.

      Man, the mods are definitely asleep today.

    • That's like using Norton as your antivirus. Find some freeware or open source one that does just what you need and isn't overly complicated

      This is a common refrain, and I must say I find it puzzling. What about Photoshop for photo editing? Or Avid for video editing? Or Quicken for accounting? Where are the freeware versions of those applications? MS office applications happen to have free alternatives in OpenOffice and NeoOffice and so on, but only as a result of monumental interest in staging a pushb

      • Re: (Score:3, Informative)

        by oatworm ( 969674 )
        Not that it matters, but since you asked...

        Photoshop -> GIMP []
        Avid -> LIVES [] - Note: I am not a video editor and have no idea if this program is any good.
        Quicken -> GNUCash [], among others.

        I guess what I'm saying is that, based on your definition of "silly", there's quite a bit of silliness going on in the world today. *grin*
  • by aero2600-5 ( 797736 ) on Monday August 27, 2007 @11:59PM (#20380167)
    From the summary:

    "The issue is, the data on the server may have changed as well, which needs to propagate back to the laptop."

    So let me get this straight.. You have the old version of the file, somewhere. The new laptop version of file, somewhere. And the new server version of the file, somewhere. And you want the software to decide which to use and copy it to both the server and the laptop?

    There are even more issues here, but it kinda sounds like you want some artificial intelligence that you can download.

    • Unison (Score:2, Informative)

      by sgyver ( 1055348 )
      Maybe a two-way rsync tool made just for this purpose?
      You might have to do A-B, A-C, A-B type syncs for more than 2 paths, unless you stick to a hub/spoke or cascading distribution model.
      Not all conflicts are automatically resolved, by default. []

      Good luck.
    • by syphax ( 189065 )

      It's not that complicated. You have a directory structure on machine A with files and folders. On machine B, you have - wait for it - the same structure. So syncing is a matter of comparing file lists, hashes, last-modified timestamps, and last-synced timestamps. It only gets sticky when both files have changed since the last sync, which is an exception that usually merits human review to sort out which one to keep. If you have reasonable practices that minimize collisions, no big deal. Of course, i
      • No, there's another scenario that causes trouble too (and even more so) if you're just comparing file systems: a file that exists in one place but not the other does so either because it's new, in which case it should be replicated, or because it's been deleted on one of the two, in which case it should be deleted.

        The usual way of handling this is by comparing the timestamps on the directory too, and the one with the newer timestamp wins. This helps prevent new files from being deleted, but it isn't safe. Cons
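The ambiguity largely goes away if the syncer also keeps the file list from the last successful sync: a file seen on only one side is new if it's absent from that list, and deleted if it's present. A sketch (the list location and directory names are hypothetical; a real syncer would store one list per replica pair):

```shell
#!/bin/sh
# classify FILENAME DIR_A DIR_B: decide whether a one-sided file is
# new or deleted, using the saved file list from the last sync.
last_sync_list=$(mktemp)

classify() {
    in_a=false; in_b=false; in_list=false
    [ -e "$2/$1" ] && in_a=true
    [ -e "$3/$1" ] && in_b=true
    grep -qx "$1" "$last_sync_list" && in_list=true
    if $in_a && ! $in_b; then
        $in_list && echo "deleted on B" || echo "new on A"
    elif $in_b && ! $in_a; then
        $in_list && echo "deleted on A" || echo "new on B"
    else
        echo "present on both (compare contents)"
    fi
}
```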
        • by arivanov ( 12034 )

          The entire sync/remote fs approach is fundamentally wrong for any place that has more than 2 people working on the same project/material.

          What if you DO NOT WANT to resolve the conflict right away? What if the document that has been changed by two people is 200 pages long and the changes are substantial and you have 10 minutes to get the data and run? Even if you have the time there are plenty of other reasons why you may want to postpone the merge and review.

          The only real solution is to keep documents i

  • You'd have to do a diff between the sever and laptop versions of particular files, and the user would have to choose what edits to save or erase.

    Or, you could use a versioning system, making a new point-series branch whenever there was a difference between laptop and server versions. But you would still be left with the problem of choosing which edits to save and which to dismiss.

    In the end, you need humans to discuss the changes made and assimilate them into a new master document whenever there is a differ
  • If you want it to ask only on conflicts, you can write a wrapper script that runs in batch mode, greps the output for conflicts, and pops up the graphical one if there are any.
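That wrapper can be tiny. A sketch with a generic sync command; with Unison the two commands might be `unison work -batch` for the silent run and the graphical `unison work` as the fallback (both hypothetical here):

```shell
#!/bin/sh
# Run a sync command in batch mode; if its output mentions a
# conflict, signal that an interactive resolver should be launched.
sync_or_prompt() {   # sync_or_prompt SYNC_CMD [ARGS...]
    out=$("$@" 2>&1)
    if printf '%s\n' "$out" | grep -qi conflict; then
        echo "conflicts found -- launch the graphical tool here"
        return 1
    fi
    return 0
}
```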
  • Coda (Score:5, Insightful)

    by norkakn ( 102380 ) on Tuesday August 28, 2007 @12:08AM (#20380211)
    • Re: (Score:3, Interesting)

      by gouldtj ( 21635 )

      I've always wanted Coda to work for this, but I haven't ever gotten it working. My current thinking is that I'd need to set up a server on my laptop, and then have the client talk to the local server. Then the two servers could sync.

      Does anyone have any information or case studies on how to make this work for a small network? Easy conversion tools? It seems like the ideal solution to me, but getting it to work seems difficult.


      • Re: (Score:2, Interesting)

        by EvanED ( 569694 )
        My current thinking is that I'd need to set up a server on my laptop, and then have the client talk to the local server.

        I haven't actually used Coda, though I'm planning on it for a small network myself, because this is exactly what it was designed to do. But why would you need the server on the laptop? All you should need is a client. Have you tried it that way and it didn't work or something?

        I do hear it's a pain to set up though.
  • by goofy183 ( 451746 ) on Tuesday August 28, 2007 @12:10AM (#20380231)
    I'll likely get buried but here it goes:

    In Windows you can mark a folder on a network share as "Available Offline". Windows will copy all of the files to the local HD and if the server isn't available just work with the local copies. When the server is detected Windows will automatically sync the files and pop-up asking the user about conflicts (keep local / keep remote). When connected writes automatically go to both the local copy and the server.

    This is one of the few things Windows got right, and I haven't found a Linux or OS X solution that is nearly as nice.
    • by Techman83 ( 949264 ) on Tuesday August 28, 2007 @12:35AM (#20380403)
      A great solution till it breaks... believe me, it does break, and when it does, be prepared for heartache. There are ways to recover, but I think it assumes too much, and the potential to screw up is a big risk. Several users at my place of employment found this out the hard way, and we now ban its usage. It's not so much about finding the best tool as managing the overall process, and how to do that... well, we are still in the process of working that one out!
      • by weicco ( 645927 )

        Care to elaborate on how it broke and how one can recover from such a situation?

      • I'll go with you that offline files is not perfect.

        But for it to 'continually' break or have problems as you suggest, it would more likely be a situation where the infrastructure was not set up properly in the first place, with poor share design, trying to use Samba servers, etc...

        If properly set up, even in high-volume business environments we have dealt with clients with thousands of users all utilizing offline files and even offline profiles (including desktops, etc.), and the fail rate is virtually 0.

    • I did that with a Dell laptop running Win2K in a docking station, with the "My Documents" folder mapped to my Win2K server directory. Worked marvelously.
    • Re: (Score:2, Interesting)

      by knitterb ( 103829 )
      Works well for me too. Sure, every now and then you get a mishap or a mistaken overwrite, but that's what [client and server] backups are for, right?

      I use it with my wife for our financial data. She syncs, makes changes, then next time I connect I can choose to use the server version or my version. Since there is no version control, you have to communicate. Then again, with the way version control works, if you end up merging a lot, perhaps too many people are working on the same problem anyhow.

      In the e
  • Foldershare (Score:2, Informative)

    by Offtopic ( 103557 )
    I've been using Foldershare for several months now to synchronize several folders on three different machines. It has worked well so far and it is free. It's available at: []
    • by JayAEU ( 33022 )
      Even though FolderShare only works for Windows, it's the perfect tool if you're on that platform. The good thing about it is that it can handle more than 2 PCs syncing, so you can have up to a dozen PCs synced for a certain directory.

      The only drawback is that the maximum number of files allowed per synced share is 10k. While this sounds like a lot, it really is not.
  • I'm not exactly sure what Apple uses under the hood to accomplish it. I don't think it's rsync, because I've fooled with the rsync built into OS X and I get errors frequently, but their home syncing works great.

    When you have a mobile user account (i.e. a network account with a local copy of the home folder on the workstation), it will sync every so often (frequency and exactly what is synced/skipped can be configured on the server end, and the user can kick it off manually from the client end). To the best of my knowledge, the sync is bidirectional, so if you log into another machine with a mobile account and modify the server copy, the changes will be reflected on the mobile copy at next sync. It makes my life easier because if a laptop user's machine gets lost, stolen, damaged, or destroyed, we've automatically got a backup copy of the data on it up to the last time it was synced.

    In the event of conflicts, the user is presented with a dialog asking which version to keep, including file size and modification date.

    Note that I'm not suggesting you throw out your existing hardware and buy Macs to get this feature, but maybe look into exactly how it's done on the Mac and see if you can duplicate it on your systems.

  • by shamborfosi ( 902021 ) on Tuesday August 28, 2007 @12:22AM (#20380347)
    I have OSX laptops using portable home directories to do exactly what you are asking for.. a network home directory that is automagically sync'd to my laptop (thus making it portable). It works both ways, and I'm definitely happy with it. I'm not sure which OS you're using though. I wrote about how to do it in an article: Full Stack: Portable Home Directory over NFS on OSX authenticated via OpenLDAP on Debian Linux [] if you're interested. I also just got everything to work over AFP to an OSX server running open directory as well.. but haven't had time to write it up yet (btw, a lot fewer steps).
  • Try coda (Score:2, Interesting)

    by blymn ( 621998 )
    Have a look at []. This is a disconnectable file system. It could be what you are looking for. Certainly, that is what I use for doing the same thing.
  • Can Collanos be used for arbitrary files?
  • commercial products. (Score:4, Interesting)

    by deviator ( 92787 ) <bdp@am[ ] ['nes' in gap]> on Tuesday August 28, 2007 @12:44AM (#20380465) Homepage
    Offline Folders on a Windows client connected to a Windows server work reasonably well but sometimes get screwed up.

    Novell's iFolder is a very interesting alternative... runs on Linux/Apache/Java stack & only transmits changed blocks over an SSL connection.

    Other things worth looking into include Microsoft Groove, which lets you synchronize an entire workspace with yourself on other computers or with other people, and is relatively network- and environment-independent (though Windows only)

  • Synchronization Woes (Score:5, Interesting)

    by JWSmythe ( 446288 ) * <> on Tuesday August 28, 2007 @01:01AM (#20380593) Homepage Journal
    A few people hit this one pretty well. rsync (and probably rsyncd).

        The more complex problem has been thrown at me a few times. What if it's not just one person?

        Say you have a repository of data that a dozen people may be working in. When they're all network connected, they're all dealing with the same file pool. When they take their off-line copies with them (unplugged laptops on vacation), they all make changes to the same files. Maybe mine is a one line change. Maybe one guy copy&pasted the first 3 chapters from War And Peace into a comment somewhere in the middle. Maybe another developer did some very intellectual looking changes but hosed some major functionality.

        When you start putting machines back on the network, who is right? The 6 guys who did real work are obviously right(ish), but they all made different changes. The very last change will end up being someone's 3 year old kid who was pounding on the keyboard right before daddy shut down the laptop, saving the new changes. Probably the last is the most recent, and right by most methods.

        It's not a pretty picture, and requires some intelligence to sort out the mess.

        The only "good" resolution I've found is to give logical authority to the changes. Bob is in charge of development. Any changes going into the development or production tree must clear him. He should be able to recognize that the 6 guys made changes, and diff them to come up with the common changes. The 3 chapters of War and Peace go by the wayside. And the guy with the 3 year old "developer" gets reprimanded.

        In the end, a good revision system and good backups are needed too. Something will slip through the cracks, and you'll need to roll back to something you hope is good.

        I take control over whatever I'm working on, so if I know I'll be working offline, I'll scp the data to my laptop, work on it on the road, and scp my changes up to the server when I'm done. Anyone else who may have worked in my project space in the duration should have known better. :)

    • A few people hit this one pretty well. rsync (and probably rsyncd). The more complex problem has been thrown at me a few times. What if it's not just one person?
      How about subversion []? Multiple concurrent changes to arbitrary binary files, schmoovely merged into one glorious whole.
  • by RAMMS+EIN ( 578166 ) on Tuesday August 28, 2007 @01:02AM (#20380601) Homepage Journal
    There have been some efforts in the area of networked filesystems with disconnected operations. I remember checking out AFS [], Coda [], and InterMezzo [] years ago. At the time, I found something wrong with each of them, but they may have improved since then. Of the three, I think Coda is your best bet.
  • The Windows XP SyncToy is excellent. Otherwise, you need CVS to allow you to merge changes!
  • Not exactly what you're after but worth a mention anyway, I reckon:

    PathSync [] by Cockos (Justin Frankel of Winamp fame's new company).

    It has some automation features as well; haven't used them so can't vouch for them.
  • by coaxial ( 28297 ) on Tuesday August 28, 2007 @01:11AM (#20380653) Homepage
    Simply install unison or rsync or whatever and have the job kicked off by whereami [] for Linux (you'll have to find the main page yourself) or Marco Polo [] for Macs.

  • An iDisk set to sync does exactly this. Any work you do locally syncs in the background whenever there is a network connection.
  • SyncBackSE (Score:3, Informative)

    by __aalmrb3802 ( 533146 ) on Tuesday August 28, 2007 @02:18AM (#20380943)
    If you're running Windows, I would recommend SyncBackSE ( []), which I expect you should be able to set up to do exactly what you asked.
  • OpenAFS (Score:3, Informative)

    by sid77 ( 984944 ) on Tuesday August 28, 2007 @02:42AM (#20381041) Homepage
    As said before, there are many choices, each one with its own pros and cons, so I'll throw this one in: OpenAFS [].

    As read from the main page:

    What is AFS?

    AFS is a distributed filesystem product, pioneered at Carnegie Mellon University and supported and developed as a product by Transarc Corporation (now IBM Pittsburgh Labs). It offers a client-server architecture for federated file sharing and replicated read-only content distribution, providing location independence, scalability, security, and transparent migration capabilities. AFS is available for a broad range of heterogeneous systems including UNIX, Linux, MacOS X, and Microsoft Windows.

    Hope this helps, ciao
  • In great *nix tradition, you should solve the problem by combining many single-purpose tools.
    Use something like [] to launch a script that runs Unison file sync whenever you are connected to the network.
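    As a sketch of that glue, assuming a unison profile named "laptop" already exists and a host called "fileserver" (both made-up names), something like this could be dropped into cron or an if-up hook:

```shell
#!/bin/sh
# Run unison only when the server answers a ping. "fileserver" and
# the unison profile name "laptop" are placeholder names.

sync_if_connected() {
    host=$1
    if ping -c 1 -W 2 "$host" >/dev/null 2>&1; then
        unison -batch laptop   # -batch: prompt only on real conflicts
        return 0
    fi
    return 1                   # offline: just keep working locally
}

sync_if_connected fileserver || :   # a cron job would retry later
```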
  • If you run an OpenSolaris distribution on the laptop (the server can be any NFSv3 compliant server) then CacheFS will do exactly this for you.
    You can even prime the cache using cachefspack initially. It works in disconnected and connected mode and is automatic.

    An alternative tool on OpenSolaris distributions is filesync, it uses the same config syntax as cachefspack does but works by simply using
    cp/mv/rm/chmod/chown and doesn't include its own transport layer (so needs to use NFS or similar).
  • At least on more recent versions of Linux. I've been looking for a slightly different solution to filesystem synchronization (for webservers using iSCSI over ethernet to cache to local disk). Rsync would be a solution with a kernel that supported inotify, but we are using 4u5 systems so I'm left assessing expensive replication solutions or upgrading the whole system to RHEL5 with a tight deadline.
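    For what it's worth, the inotify-plus-rsync idea can be sketched with inotifywait from the inotify-tools package; the watch directory and server below are placeholders, and this pushes one file per write event rather than batching:

```shell
#!/bin/sh
# Sketch: push each file to the server as soon as it is written,
# using inotifywait (inotify-tools). WATCH and SERVER are placeholders.
WATCH=$HOME/work
SERVER=user@server.example.com

watch_and_push() {
    inotifywait -m -r -e close_write --format '%w%f' "$WATCH" |
    while read -r changed; do
        # mirror the path relative to $HOME on the server side
        rsync -az "$changed" "$SERVER:${changed#$HOME/}"
    done
}

# watch_and_push   # runs until interrupted; start it from your session
```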
  • A sync might take some considerable time (even rsync takes a few minutes with lots of files). What if the user needs to interrupt it?
    You probably should make this manual (at least via a confirmation dialog), so that if the user is only connecting for a few seconds, it does not try to sync. Also, the user can then control priority - he may want to get to the web first, rather than waiting while an Office service pack downloads...
  • There seems to be a project for Gnome called Conduit [] that sort of does what you want. I think. I haven't tried it.
    It does however look like a very ambitious project aimed at syncing both files and other data between both computers and other services. .haeger
  • I use two Vista PCs connected on a network; one's a laptop, the other a desktop. The laptop is set up to sync with the desktop through the Sync Center, so whenever I log into the network with it, it makes sure that the chosen directory contains the most recent files in both folders. Works great for me.
  • I've been trying to set up my Linux desktop with Windows-like roaming profiles for ages now.

    I have the kerberos single sign on for a multitude of services. I have the LDAP user metadata. I have NFSv4 file shares secured with kerberos, and autofs homes that enable me to use the same app settings on each machine. I have a groupware server (Kolab) serving all of my users (all two of them, ha!). But can I find any way of accessing my home drive (reliably) on my laptop when it's not plugged into the network?

  • by hubertf ( 124995 )
    I use CVS to synchronize between several working places (home desktop, work desktop, notebook). It's not 100% transparent, but since you'll always have to push a "sync up/down!" button anyway, a "cvs commit/up" is as close as you can get.
    I guess any other version management tool could be used too; I just tend to know and like CVS.
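    That "sync up/down" pass can be scripted; the sandbox path below is a made-up placeholder, and the dry-run update ("-n") is just there to skip commits when nothing changed locally:

```shell
#!/bin/sh
# Sketch of an automated "cvs up, then commit" sync pass. The sandbox
# path is a placeholder for your own checked-out working copy.

sync_sandbox() {
    cd "$1" || return 1
    cvs -q update -d                 # merge server-side changes in

    # A dry-run update lists local changes as "M"/"A"/"R" lines;
    # only commit when there is actually something to push.
    if cvs -n -q update 2>/dev/null | grep -q '^[MAR] '; then
        cvs -q commit -m "laptop sync"
    fi
}

# sync_sandbox "$HOME/src/myproject"
```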

      - Hubert
