Thin Clients in a Computer Lab Environment? 377
chachi8 asks: "I work as a lab administrator in a university, and I currently look after about 500 Windows-based PCs spread out over 20 locations. The IT administration at my school has recently (and quite suddenly) decided that thin clients are a direction we should be pursuing, and I've been doing some research over the past few weeks. We've recently been visited by representatives of Citrix who basically showed us some really impressive software that is far from cheap. Because we're a university facing budget cuts, cost is a major issue for us, so what I'm interested in knowing is whether anyone has implemented a thin-client solution in a computer lab environment, and whether it turned out to be cost effective over a 3-5 year timeframe. Clearly, the idea of being able to add an extra few years to the lives of our lab PCs is very attractive, as is the thought of being able to centrally administer the software in all of our labs, but I'm as yet unclear as to whether the costs of servers and licensing (and everything else) will really result in long-term savings."
PCs are cheap, software isn't (Score:4, Interesting)
For ease of administration, use Ghost to create disk images for each PC configuration. Something goes wrong? Wipe the PC and restore the image.
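A minimal sketch of how the per-configuration imaging might be scripted; the image names and PCI IDs here are invented for illustration, not from the post:

```shell
#!/bin/sh
# Hypothetical sketch: choose which Ghost-style image to restore based
# on the machine's NIC, so one restore script covers every video/NIC
# combination in the labs. IDs and image names are made up.
pick_image() {
  case "$1" in
    10ec:8139) echo lab-realtek.img ;;  # Realtek 8139 build
    8086:1229) echo lab-intel.img ;;    # Intel EtherExpress build
    *)         echo lab-generic.img ;;  # fallback image
  esac
}
# e.g. pass the NIC's PCI vendor:device ID as reported by lspci -n:
pick_image "${1:-10ec:8139}"
```

The point is that the mapping lives in one place on the server, so a tech at any of the 20 locations restores the right image without thinking about it.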
The thing is that hardware is getting cheaper by the day, software isn't.
-josh
CIS at Ohio State (Score:4, Interesting)
I still am in awe of what our CIS department [ohio-state.edu] has done at Ohio State. They handle something like 200 thin clients, plus all the remote sessions.
Basically, you sit down at an old, stripped down HP-UX machine or a thin client that allows you to log into one of their servers. NT and Solaris are the typical flavors--I can't remember what the other option was. Plus, if you log into the Solaris box, you can open a Citrix client and use that to be logged into an NT server. This is really nice for writing code in UNIX land, but using MS Office for the documentation.
I would just love it if the EE department could get a clue and do something similar. It really would give us the best of all worlds. Oh, and you can read more about the CIS setup here. [ohio-state.edu]
Cluster for Term Server (Score:2, Interesting)
Re:Get reference sites (Score:2, Interesting)
Check out Largo, FL (Score:3, Interesting)
It's been detailed by all the Linux rags (including Slashdot). Last I heard Citrix and Microsoft paid them a (friendly) visit, but they're still running Linux.
Re:Citrix... (Score:2, Interesting)
From a purely technical standpoint, you are correct.
One caveat, though: I believe Terminal Services is only licensed for access from Windows-based clients. While this isn't a factor for your average Slashdot reader playing around in his parents' basement, license compliance is a major issue for the corporate/business/educational worlds... Like it or not, violating the license in that way is no different from simply pirating the software to begin with.
Please don't flame me; I'm not sticking up for them, and I don't like it any more than you do. I'm simply pointing out the cold, hard business details.
Re:My experiences (Score:3, Interesting)
The question of server software of course depends on what applications your users will need. Many school labs use PCs as glorified typewriters and web browsers, which is quite wasteful in my opinion. There are several server packages that may be of interest; many have already been mentioned:
Citrix (for your Windows apps), RDP (if you just love MS), Citrix CDS (free, missing some functionality like load balancing and client drive mapping, but largely functional), Tarantella, and of course XDM for *nix terminals.
Many thin clients (particularly *nix-based ones) are capable of connecting to all of these types of servers and more. I use thin clients daily for all sorts of "lab-like" activities: email, word processing, and web browsing.
Because thin clients depend on the network and servers for running applications, a fast network is quite desirable. This also makes them inappropriate for some tasks, basically anything that is graphics intensive, unless you are running that app locally (i.e., your thin client comes with a web browser).
If you are really cost conscious, try turning those existing PCs into thin-client devices by running LTSP, or simply install Linux and limit the applications available to the users. Unfortunately, the latter does not give you all of the management advantages of a true thin client.
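For the LTSP route, per-terminal behavior is driven by a single config file on the server. A hedged sketch of what that might look like (the address is a placeholder; option names follow LTSP's documentation of the era, so check yours):

```
# lts.conf sketch -- served to every repurposed PC at boot
[Default]
        SERVER           = 192.168.0.254   # application/boot server
        XSERVER          = auto            # probe the video card
        X_MOUSE_PROTOCOL = "PS/2"
        SCREEN_01        = startx          # X session on console 1
```

Since every terminal reads this one file, changing it changes the whole lab, which is exactly the central-administration win the original poster is after.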
Another option would be to buy thin clients (definitely recycle your existing monitors) and use your PCs as nodes in a server farm. These can be running Citrix or just doing simple XDM load balancing.
In general, I think that the combination of thin clients and well-maintained servers is perfect for 80% of the computer labs found in schools. If you like the idea of thin clients, there are really many ways to proceed. The best starting point is simply to define which labs need access to which applications. Then decide which of those you want to run on your servers (some, like web browsers, are more CPU intensive than others). Then find a client (or roll your own) that has just enough CPU to run its local apps and connect to your servers.
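The "simple XDM load balancing" mentioned above can be as crude as spreading clients across a small farm of terminal servers. A hypothetical sketch (host names are invented, and a real setup would more likely use XDMCP's indirect/chooser mechanism):

```shell
#!/bin/sh
# Hypothetical sketch: pick a terminal server round-robin from a small
# farm, keyed on something unique-ish per client (e.g. last IP octet).
# Host names are placeholders.
SERVERS="ts1.example.edu ts2.example.edu ts3.example.edu"
pick_server() {
  i=$1
  set -- $SERVERS          # load the farm into positional parameters
  shift $(( i % $# ))      # rotate by client number modulo farm size
  echo "$1"
}
# The client's X server would then be started against the chosen host:
#   X :0 -query "$(pick_server 42)"
pick_server 42
```

This keeps sessions roughly balanced without any extra software, at the cost of ignoring actual server load.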
Good luck.
Anybody maintaining more than a handful should! (Score:2, Interesting)
We've been running thin and ultrathin clients for scientists, administrators, and computer labs for a few years now. Though bumpy at times, it's been a great ride.
The thin clients are computers that are rugged, relatively tamper- and theft-proof, and inexpensive: iMacs. Starting in early 1999, we've netbooted a little over a hundred of them using a few G3 servers, each with five 100Mbit interfaces serving an equal number of (relatively) small subnets. In addition to netbooting, the servers also manage the personalization of the user environment, restoring user preferences upon login and storing them upon logout. User files live on file servers, which are mounted automatically at login. Finally, there are transparently available (Unix) compute servers, and, for the rare cases in which there's no other way, application hosting for legacy Windows applications using HOBlink, Citrix, rdesktop, VNC (or whatever).
There's zero software maintenance per desktop. Zero. Additionally, reliability is high and reproducibility is absolute: there can be no difference between desktops, period. So system administration can focus on important things instead of wasting time on per-desktop troubleshooting, restoration, maintenance procedures, and so on. Software installation is centralized, and a rigorous quality-testing strategy can be enforced.
For those traditionally stuck in X11, it's provided by ultrathin Sun Ray clients. Absolutely silent, zero maintenance, excellent performance, and absolute reliability. Of course, you need to devote Quality Time[tm] to creating a Unix environment that's complete and of recognizably higher quality than users get by performing well-meant tinkering on a local file system. If you do, that maintenance nightmare is gone too.
It pays. With very little manpower, you can not only keep things running smoothly, but spend the remaining time improving and staying abreast of developments.
Boxes are cheap. Maintenance is expensive. Troubleshooting is expensive. Lack of consistent quality is expensive. The time users waste trying to create and maintain a working, fully functional environment for themselves is an expensive loss. Invest a little, gain a lot!
That being said, anyone's free to silently (and not very visibly) spend huge amounts of resources on per-desktop maintenance, whether by admins or by users themselves. I know I wouldn't, ever again.
Re:Take a look at Tarantella (Score:3, Interesting)
Tarantella is a really great product! Not only can it serve up Windows applications, X applications, TN3270/5250 access, telnet, and custom things (e.g. we did SSH), but it also can be used to conserve network bandwidth. It also supports browser-based X.509 authentication (to trusted web server) for single-signon like access, in addition to SecureID.
Java applets can be accessed via the web instead of native clients as well, which should make the job of "upgrading" 500 PCs (a la Citrix) a thing of the past. This works nicely in a true thin-hardware environment, e.g. Sun Ray.
Tarantella also integrates well in a portal environment.
If you're serious about taking control of large numbers of deployed PCs/workstations, applications, and network bandwidth, I'd check out what these folks have to offer!
MS-Terminal Server is the problem (Score:2, Interesting)
Here's a reference (Score:3, Interesting)
The server holds an image for each hardware configuration, since we only have a total of 4 video/nic combinations. The server is a PII-300 with 128MB of RAM, and a 9GB SCSI HD. We had this box and another identical box laying around. I have them set up so that one can take the other's place if it were to go down (it never has). We have 40 clients using thin clients in this way.
This was an interim measure because we didn't have the money to purchase new hardware last year.
Recently we started replacing the old desktops with Wyse thin clients, which run a proprietary operating system in ROM, and come with a USB keyboard and mouse (without wheel) for $300. I set up an FTP server for them to retrieve firmware updates from.
But back to the Linux thin clients. They boot very quickly compared to Windows, and present a Windows 2000 logon screen. Ctrl-Alt-Del and the Windows key work exactly as expected. The only drawback (the way that _I_ put it together) is that the thin clients don't have any unique per-machine configuration, such as screen resolution. There are ways to get around that, but I haven't needed to.
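A Linux box presenting a Windows 2000 logon screen like this is typically just looping an RDP client full-screen. A hedged sketch of what the client-side boot script might look like (the server name is a placeholder, not from the post):

```shell
#!/bin/sh
# Hypothetical sketch of a Linux thin client's session script: build
# the rdesktop command line (rdesktop -f runs full-screen), and loop
# it so a closed session drops straight back to the logon screen.
SERVER="${1:-w2kterm.example.edu}"   # placeholder terminal server
session_cmd() {
  echo "rdesktop -f $SERVER"
}
session_cmd
# In the real boot script, the final lines would be something like:
#   while true; do rdesktop -f "$SERVER"; done
```

Because no state survives the loop, every seat looks identical after a user walks away, which is the "no unique configuration" behavior described above.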
Gawd. When I get time, I will document this project on my site. If you'd like more info, email me at robin d0t daugherty a+ ovf d0t com.
Citrix thin clients can work... (Score:1, Interesting)