Syncing Options for Computer Lab Machines?
sirfunk asks: "I'm going to begin helping out maintaining the computer labs around my university campus. I was wondering what hints and tips the Slashdot community has for maintaining computer lab networks. We need a solution where we can keep a remote image on a server and have the computers update to it on bootup. We also need them to be able to update even if Windows is severely messed up (so if Windows dies, just reboot it). I know there are commercial solutions like Deep Freeze, but I was hoping someone knew of a creative Open Source alternative. I'd love it if we could run these as dumb terminals with *nix, however that won't be an option for the general public. One idea I had was to make the machines boot into a Linux partition that would rsync a FAT filesystem (the update) and then reboot to that FAT filesystem. Getting it to boot into Linux first and then Windows next might be tricky, though. I would love to hear everyone's ideas on this topic. If you have any ideas that would run cross-platform (Mac/Windows), that would be great, too."
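A minimal sketch of that rsync-then-reboot idea, assuming a small maintenance Linux install with the Windows FAT partition mounted and an rsync module exported by an image server (every hostname, path and boot-loader detail here is made up for illustration):

```python
#!/usr/bin/env python3
"""Maintenance-partition sketch: pull the master image down with rsync,
then point the boot loader back at the Windows partition."""
import subprocess

# Everything below is an assumption made up for illustration.
IMAGE_SOURCE = "rsync://imageserver.example.edu/labimage/"  # master image exported by the server
WINDOWS_MOUNT = "/mnt/windows"                              # the FAT partition, mounted read-write

def sync_windows_partition() -> None:
    # -rt rather than -a because FAT can't store Unix ownership or permissions;
    # --delete makes the local copy match the master exactly, wiping student cruft
    subprocess.run(
        ["rsync", "-rt", "--delete", IMAGE_SOURCE, WINDOWS_MOUNT + "/"],
        check=True,
    )

def reboot_into_windows() -> None:
    # Assumes a boot loader that can be told which entry to load next from a
    # script, e.g. GRUB's grub-set-default (LILO's `lilo -R` is similar).
    subprocess.run(["grub-set-default", "Windows"], check=True)
    subprocess.run(["reboot"])

if __name__ == "__main__":
    sync_windows_partition()
    reboot_into_windows()
```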
You only need RDP terminals (Score:2, Interesting)
For a lab, you may even be able to get volume pricing.
easyeverything (Score:1, Interesting)
why image? (Score:3, Interesting)
In my labs we just deploy the machines and update the software remotely as needed. Sure, we should redeploy once or twice a year to clear out the cruft that builds up over a semester. But I think it beats re-imaging on every boot.
A good question is how much you're actually imaging; trimming that down could save some time.
Of course, that's just my opinion; I could be wrong.
What are these machines doing? (Score:2, Interesting)
Don't do this (Score:3, Interesting)
There were two different setups, and I can't tell you what software they used to achieve them, but I can tell you what happened from a user's perspective.
In the first setup (a small lab -- about 20 machines), the machines were set up to automatically replace their installation of Windows once a week at a "convenient time". The problem was, this time was convenient for the sys admins rather than the users. So, when working on a project outside scheduled lab times, I would often have to wait about 30 minutes to start work while the machine got a fresh copy of Windows. This was even worse if more than one person was trying to use a machine, as the network would slow down.
The obvious solution to the above problem is to change the time to something like 3am. However, in these days of devastating Windows worms, I don't think it's an option to install a new image once a week. Also, many university computer facilities are open 24/7; you often get students who like to work antisocial hours, so choosing a convenient time is pretty difficult.
The second setup was a more campus-wide solution. I'm not sure how they achieved it, but it seemed that each machine maintained a log of which files were changed while a particular user was logged on. When they logged off, the machine simply returned the disk to the state it had been in before.
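I don't know what product did this, but as a very rough sketch of the idea: fingerprint the protected tree at logon, then copy back anything that changed from a pristine copy at logoff. The paths here are hypothetical, and a real product would do this at the filesystem-driver level rather than by copying files around:

```python
"""Rough sketch of the 'undo the session' approach described above."""
import hashlib, os, shutil

PROTECTED = r"C:\Windows"           # tree to protect (hypothetical)
PRISTINE = r"D:\pristine\Windows"   # read-only reference copy (hypothetical)

def digest(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def snapshot(root):
    """Record a checksum for every file so changes can be detected at logoff."""
    state = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                state[path] = digest(path)
            except OSError:
                pass  # locked or unreadable; skip
    return state

def rollback(before, root, pristine_root):
    """Copy back every file whose contents changed during the session."""
    for path, old in before.items():
        try:
            unchanged = digest(path) == old
        except OSError:
            unchanged = False  # deleted or now unreadable
        if not unchanged:
            source = os.path.join(pristine_root, os.path.relpath(path, root))
            if os.path.exists(source):
                shutil.copy2(source, path)

if __name__ == "__main__":
    state = snapshot(PROTECTED)               # at logon
    # ... user session runs here ...
    rollback(state, PROTECTED, PRISTINE)      # at logoff
```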
There are many problems with doing what you suggest:
+ User ignorance: naive users are used to saving their stuff to C:. If you then overwrite the disk, they will complain about your policy eating their homework.
+ If you have one 'master' disk image, how do you manage the different drivers required for different hardware? It's impossible to maintain a large number of systems with exactly the same hardware (when you consider component failures etc).
I would suggest the following: use the permissions and management facilities of the OS to prevent users from installing their own software or writing to the C: drive; really lock them down. Give each user networked disk space which only they can write to. Make sure you have an automated way to roll out patches, and keep on top of things. Make sure your virus protection is tip-top. Try to reduce the possibility of students infecting systems via removable media (I'd outlaw floppy disks, but students still use them!).
Further, for each group that needs to work together (e.g. small groups of final-year students working on a particular project), provide a "transfer" area which they can all read and write. For users who need to install their own software (e.g. computer science researchers), establish a small team of sys admins at their location and let them do their own thing; just make sure they are safely behind a firewall so they can't easily shoot themselves in the foot, and your managed main network is safe from any of their screw-ups.
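As a sketch, setting up one such group "transfer" area on a Unix file server might look like this (the group name and path are invented):

```python
"""Create a shared 'transfer' directory writable by one project group."""
import grp, os, subprocess

GROUP = "proj42"                 # hypothetical project group
PATH = "/srv/transfer/proj42"    # hypothetical shared directory

subprocess.run(["groupadd", GROUP], check=False)   # no-op if the group already exists
os.makedirs(PATH, exist_ok=True)
os.chown(PATH, 0, grp.getgrnam(GROUP).gr_gid)      # owned by root, group proj42
os.chmod(PATH, 0o2770)  # rwx for owner and group, setgid so new files inherit the group
```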
Re:Not Windows, but Linux... (Partimage(d)) (Score:2, Interesting)
Complete picture:
+ PC boots and loads Linux from the network (PXE boot)
+ Linux runs fdisk, starts partimage and restores the original image
Overnight reinstalls are necessary because we want to give students total freedom on the machine.
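As a rough sketch, the nightly restore step run from the PXE-booted Linux environment might look like this (device names, the saved partition layout and the image path are assumptions; check the partimage documentation for the exact options):

```python
#!/usr/bin/env python3
"""Recreate the partition table, restore the Windows partition, reboot."""
import subprocess

DISK = "/dev/hda"
WINDOWS_PART = "/dev/hda1"
LAYOUT = "/restore/hda.sfdisk"           # partition layout saved earlier with `sfdisk -d`
IMAGE = "/restore/winxp.partimg.gz.000"  # image fetched from the server beforehand

# Recreate the partition table from the saved dump
with open(LAYOUT, "rb") as layout:
    subprocess.run(["sfdisk", DISK], stdin=layout, check=True)

# Restore the Windows partition in batch mode, then reboot into it
subprocess.run(["partimage", "-b", "restore", WINDOWS_PART, IMAGE], check=True)
subprocess.run(["reboot"])
```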
Two problems:
1) I have found no way to boot Windows from a running Linux so far. The temporary solution is a reboot, with the DHCP server changing its PXE options OR altering the file that decides whether a Linux network boot or a local boot from disk should happen (see the sketch after this list).
2) Partimaged, the server-side program that offers the images over the network, is pretty much crippled by its limitation to 10 (15 now?) simultaneous clients. This makes it impossible to update all PCs in a single room (up to 30) at once, even though the server capacity is up to it. I tried running multiple instances of partimaged on different listening ports, but this crashes partimaged...
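For the first problem, the "alter the file" workaround could look roughly like this on the boot server: rewrite the client's per-MAC pxelinux config so it either chains to the local disk or loads the restore kernel. The TFTP path and kernel/initrd names are assumptions:

```python
#!/usr/bin/env python3
"""Flip a client's PXE config between 'reinstall' and 'boot Windows from disk'.
pxelinux looks up per-host files named 01-<mac-address> under pxelinux.cfg/."""

TFTP_DIR = "/tftpboot/pxelinux.cfg"  # assumed TFTP root layout

LOCALBOOT = "DEFAULT local\nLABEL local\n  LOCALBOOT 0\n"
REINSTALL = (
    "DEFAULT reinstall\n"
    "LABEL reinstall\n"
    "  KERNEL vmlinuz-restore\n"
    "  APPEND initrd=restore.img\n"
)

def set_next_boot(mac: str, reinstall: bool) -> None:
    name = "01-" + mac.lower().replace(":", "-")
    with open(f"{TFTP_DIR}/{name}", "w") as f:
        f.write(REINSTALL if reinstall else LOCALBOOT)

if __name__ == "__main__":
    set_next_boot("00:11:22:33:44:55", reinstall=True)   # queue a rebuild tonight
    set_next_boot("00:11:22:33:44:55", reinstall=False)  # back to booting from disk
```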
Improvement would be a good thing here, so I'll be watching this thread closely.
OS X Server (Score:4, Interesting)
This probably won't apply to you, but it's worth knowing: Mac OS X Server can do this out of the box (to Mac clients). Apple calls it "NetBoot", and it's been available since at least 2000; I believe the tech came from NeXT originally.
Under OS 9 and 10.3 it even allows diskless clients, since they get their entire OS from the server over the wire (10.1,
What I've seen... (Score:2, Interesting)
Rebuilding the PC meant downloading an image from a central server and re-imaging C:.
If, however, the menu system noticed the time was after 1:00am and the PC hadn't been rebuilt for 24 hours, it would force a rebuild, cleaning up any leftover problems.
The system was enforced by removing the Logoff option from Windows, requiring users to reboot after a session.
The only problem, as mentioned above, was that if you're working an all-nighter on your project and forget to save, then when Windows crashes at 5am the rebuild will begin and wipe out your crucial temp files.
I suppose the solutions to this are A) put $TEMP on D:, a non-imaged partition for general junk, or B) only re-image if the PC has been idle for a set amount of time (e.g. hold at the menu system for an hour, then re-image).
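A rough sketch of that decision logic, combining the 1:00am/24-hour rule above with option B (all thresholds are illustrative):

```python
from datetime import datetime, timedelta

REBUILD_WINDOW_START = 1               # forced rebuilds only after 1:00 am
MAX_IMAGE_AGE = timedelta(hours=24)    # rebuild if the machine hasn't been rebuilt in a day
IDLE_GRACE = timedelta(hours=1)        # option B: wait until the menu has sat idle this long

def should_rebuild(now: datetime, last_rebuild: datetime, idle_since: datetime) -> bool:
    """Decide at the boot menu whether to force a re-image."""
    stale = now - last_rebuild > MAX_IMAGE_AGE
    after_hours = now.hour >= REBUILD_WINDOW_START
    idle_long_enough = now - idle_since > IDLE_GRACE
    return stale and after_hours and idle_long_enough
```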