
Subversion as Automatic Software Upgrade Service?

angel'o'sphere asks: "I'm working on a contract where the customer wants an automated, Internet-based check-for-updates, update, and install system. So far we've considered a Subversion-based solution. The numbers are: a typical upgrade is about 10MB in size. Usually it's about 30 to 50 new files (which have an average size of about 200kB) and 2 database files (which can be anywhere from 500MB to 2GB) that change regularly. Upgrades are released about every 3 months, and this will probably become more frequent as the system matures. The big files are the problem, as we estimate about 100-300 changes in each file. The total user base is currently 2000 users, creeping up to probably 5000 over the next year, and it might finally end up at some 30,000 users. Any suggestions from the crowd about setting up a meaningful test environment? How about calculating the estimated throughput of our server farm? Does anyone know of projects that have tried something similar using an RCS or a configuration management system?"
"We want to support as many concurrent users as possible (bandwith is not an issue). We use an Apache front end as a load balancer and as many Subversion servers as necessary on the backend. My largest worry, from my calculations, is disk access on the Subversion server. We could not run meaningful tests, because a typical PC kills itself if you try to run more than 4 or 5 parallel Subversion clients doing an upgrade (due to insanely high disk IO, and high seek times)."
  • by Fweeky ( 41046 ) on Friday September 16, 2005 @06:14PM (#13580773) Homepage
    This is the technique used by portsnap [daemonology.net]; basically you generate binary diffs from a known starting point, and the client keeps track of what new patches it needs to stay in sync (there's a rough sketch of both halves after this comment). Since you're just serving static files, scaling it should be as easy and cheap as it gets.

    rsync is highly general purpose; your servers will end up generating hashes for every block of every file for every client, which is a lot more heavyweight than just serving patches you generate once. Subversion may be more efficient, since it knows something about the files it checked out previously, but it's still going to end up dynamically generating diffs between whatever version each client has and the latest; this likely gets worse if your clients aren't tracking HEAD.

    Also note that a custom solution can likely get away with a single tag file detailing the latest patches; rsync and svn are going to be scanning their directory trees religiously. Both you and your users will probably appreciate a single GET to a small file on a webserver much more than a load of CPU use and disk thrashing.
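
To make the patch-serving idea concrete, here is a minimal sketch of the release-time half, in Python. It is not what portsnap actually does, just the same shape: binary diffs are generated once per release and then served as plain static files. The bsdiff/bspatch command-line tools are assumed to be installed, and all directory names are made up.

    #!/usr/bin/env python3
    # Release-time step: generate one binary patch per changed file, once,
    # so the web servers only ever serve static content afterwards.
    # Assumes the bsdiff CLI is installed; directory names are hypothetical.
    import hashlib
    import shutil
    import subprocess
    from pathlib import Path

    OLD_RELEASE = Path("releases/2005.2")    # previous release tree
    NEW_RELEASE = Path("releases/2005.3")    # new release tree
    PATCH_DIR   = Path("htdocs/patches/2005.2-to-2005.3")

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    PATCH_DIR.mkdir(parents=True, exist_ok=True)
    manifest_lines = []

    for new_file in sorted(NEW_RELEASE.rglob("*")):
        if not new_file.is_file():
            continue
        rel = new_file.relative_to(NEW_RELEASE)
        old_file = OLD_RELEASE / rel
        if old_file.exists() and sha256(old_file) == sha256(new_file):
            continue                                     # unchanged, nothing to ship
        name = str(rel).replace("/", "_")
        if old_file.exists():
            # bsdiff <old> <new> <patch>: a compact delta, even for the 2GB databases
            patch = PATCH_DIR / (name + ".bsdiff")
            subprocess.run(["bsdiff", str(old_file), str(new_file), str(patch)], check=True)
        else:
            patch = PATCH_DIR / (name + ".full")         # brand-new file, ship it whole
            shutil.copyfile(new_file, patch)
        manifest_lines.append(f"{rel}\t{patch.name}\t{sha256(new_file)}")

    # One small manifest; clients fetch it with a single GET (see the next sketch).
    (PATCH_DIR / "MANIFEST").write_text("\n".join(manifest_lines) + "\n")
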
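And a matching sketch of the client side, covering the "single GET to a small file" point: one request for a small manifest, then fetch only the patches for files that are actually out of date. The update URL, install path, and manifest format are assumptions carried over from the sketch above; a real updater would also want retries, verification of the patched result, and an atomic swap of the whole tree.

    #!/usr/bin/env python3
    # Client-side check against the manifest written by the previous sketch.
    # The URL and install path are hypothetical.
    import hashlib
    import shutil
    import subprocess
    import urllib.request
    from pathlib import Path

    BASE_URL = "http://updates.example.com/patches/2005.2-to-2005.3"   # assumed
    INSTALL  = Path("/opt/product/current")                            # assumed

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    manifest = urllib.request.urlopen(f"{BASE_URL}/MANIFEST").read().decode()

    for line in manifest.splitlines():
        rel, patch_name, expected_sha = line.split("\t")
        target = INSTALL / rel
        if target.exists() and sha256(target) == expected_sha:
            continue                                     # already up to date
        patch = Path("/tmp") / patch_name
        with urllib.request.urlopen(f"{BASE_URL}/{patch_name}") as resp:
            patch.write_bytes(resp.read())
        target.parent.mkdir(parents=True, exist_ok=True)
        if patch_name.endswith(".bsdiff"):
            # bspatch <old> <new> <patch>: patch into a temp file, then rename it
            # into place so an interrupted update never leaves a half-written file.
            tmp = target.parent / (target.name + ".new")
            subprocess.run(["bspatch", str(target), str(tmp), str(patch)], check=True)
            tmp.replace(target)
        else:
            shutil.copyfile(patch, target)               # brand-new file
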
  • by NuShrike ( 561140 ) on Sunday September 18, 2005 @01:39AM (#13588225)
    Here's a combination of available strategies:

    o DON'T use SVN (imo)
    o check out your latest rev to a staging 'folder'
    o rename your previous release 'folder' to backup name
    o rsync the data from your staging 'folder' to all your clients one by one.

    If you have issues with the release, just roll back to the previous release 'folder' (a rough sketch of the whole flow is at the end of this comment).

    The other thought is to rsync a .torrent file and use something like BitTornado to distribute from your 'staging' folder.

    All this should let you get by with a master file server with 1GB of RAM or less, and crappy I/O too.

    You'd still have to figure out a security scheme to wrap around this.
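
A rough sketch of that check-out / rename / rsync flow, again in Python. The SVN tag URL, folder layout, and client hostnames are invented, and passwordless SSH/rsync access to the clients is assumed.

    #!/usr/bin/env python3
    # Staging / rename / rsync rollout, following the steps in the comment above.
    # SVN URL, paths, and hostnames are hypothetical.
    import subprocess
    from datetime import date
    from pathlib import Path

    SVN_URL = "https://svn.example.com/product/tags/2005.3"     # assumed tag
    STAGING = Path("/srv/releases/staging")
    CURRENT = Path("/srv/releases/current")
    BACKUP  = Path(f"/srv/releases/backup-{date.today():%Y%m%d}")
    CLIENTS = ["client01.example.com", "client02.example.com"]  # assumed

    def run(*cmd: str) -> None:
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Export the latest revision into a fresh staging folder.
    run("svn", "export", "--force", SVN_URL, str(STAGING))

    # 2. Keep the previous release around: a rollback is then just renaming the
    #    backup folder to 'current' again and re-running the rsync loop below.
    if CURRENT.exists():
        CURRENT.rename(BACKUP)
    STAGING.rename(CURRENT)

    # 3. Push the new release to the clients, one by one.
    for host in CLIENTS:
        run("rsync", "-az", "--delete", f"{CURRENT}/", f"{host}:/opt/product/current/")

(Running rsync over ssh at least gives you an authenticated transport for free, which is a start on the security scheme mentioned above.)
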
