Testing Pre-Production Servers Accurately?
An anonymous reader asks: "Having been granted a 90-day demo enclosure of new blade servers from a major vendor, the question I find myself asking is: on a limited budget, how does one simulate 1000+ attached clients and the activity of those clients? We're a K-12 school district, and our current servers don't keep up with all the roaming-profile abuse from our Windows workstations. Are there tools or tricks available to simulate load on NetWare/Linux servers? The user groups around here usually answer this question with 'Get some workstations for a test lab!' — there's got to be a less expensive option, right? Can we leverage our existing client populace to achieve our goal without substantially interrupting or degrading the quality of service at the desktop?"
Don't even try (Score:1)
options but still expensive (Score:4, Informative)
Re:options but still expensive (Score:3, Interesting)
Re:options but still expensive (Score:1)
Absolutely!!! (Score:2, Funny)
Login Scripts (Score:5, Interesting)
Then when everyone logs in in the morning, it just overloads the thing...
And simulates the load spikes you see. I don't know a program or script that would do this, but I'm sure someone out there does.
JC
Re:Login Scripts (Score:2)
Re:Login Scripts (Score:1)
JC
Nights and weekends and holidays (Score:5, Informative)
You're gonna hate the answer, but this will give you a better test than anything else. Plug in your test system and get a bunch of the kids to help you out on a weekend. Have them do logins, logouts, play games, surf, write and save papers, etc. on throwaway accounts that go to the test server.
Write out a test plan -- how many clients, how many local, how many remote, how many do you start with, what is the step size (e.g., start with 5 clients, then 10, then 15, then 20, then 30, etc.). Profile your existing systems so that you know what's really creating the load on them. Is it really the roaming profiles or is it web site caching or is it something else? Good luck with it.
--Paul
This is probably a stupid question, but... (Score:2)
Re:This is probably a stupid question, but... (Score:3, Informative)
Re:This is probably a stupid question, but... (Score:2)
Thanks for the explanation!
yea, but there is hope for the wingeek (Score:1)
Re:This is probably a stupid question, but... (Score:5, Interesting)
Re:This is probably a stupid question, but... (Score:5, Informative)
Because most admins, > 60%, don't RTFM and therefore are unaware that the My Documents folder can be redirected through the use of Group Policy. Even more admins, > 80%, don't know that the IE cache directory should also be redirected to prevent multi-megabyte profiles due to browser cache.
Of course, even those admins who go to the trouble of doing these things and many more to minimize the profile size are torpedoed by users who save their downloads and documents to their desktops. I even saw an admin with a Windows 2000 service pack (>120 MB) saved to his desktop while using roaming profiles.
The fact is that the roaming profile implementation is crap! Nothing should be copied over the network at login. Everything should be stored on a network drive and accessed directly from there. This is how Unix does it with remotely mounted home directories. That way files don't cross the network if you don't access them. If you want to implement caching for slow links or disconnected access as Windows does, that's fine, but copying everything over the wire every time someone logs in is just stupid.
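For reference, the remotely mounted home directory approach described above looks like a single line on a typical Linux client. The server name and export path here are placeholders, not anything from the original question:

```
# /etc/fstab on a lab client: homes live on an NFS server and are
# mounted in place rather than copied at login.
# "fileserver" and "/export/home" are hypothetical names.
fileserver:/export/home   /home   nfs   rw,hard   0   0
```

With a mount like this, a login only touches the files the user actually opens; nothing crosses the wire wholesale the way a roaming profile copy does.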
Re:This is probably a stupid question, but... (Score:2)
Linux is pretty easy in that regard. Anything beginning with a dot in home is PROBABLY a config.
Still, the smartest way is to disable roaming profiles and map the home to your desktop (as a link, mind you) on login. Seriously, wouldn't you complain if Linux mapped your home on another computer and then proceeded to copy every file over? Course, this is where consis
Re:This is probably a stupid question, but... (Score:2)
Re:This is probably a stupid question, but... (Score:2)
And Windows doesn't - it only copies *changes* (although, obviously, if it's the first time a user logs in the whole profile will be copied).
Re:This is probably a stupid question, but... (Score:2)
Re:This is probably a stupid question, but... (Score:2)
it is nice to be able to log on anywhere and have everything be exactly the same.
On the other hand, I do think the way in which Microsoft handles this is a tad inefficient. I think Novell offers a better solution to this, but I honestly don't have a clue.
Re:This is probably a stupid question, but... (Score:1)
Re:This is probably a stupid question, but... (Score:1)
Re:This is probably a stupid question, but... (Score:1)
In the registry, create a value named DeleteRoamingCache of type REG_DWORD and set it to 1 under the HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\System subkey.
This doesn't solve the delays in copying the profiles around, but it does prevent them from filling up the hard drive. I particularly like this because it eli
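For anyone applying that value to more than one machine, the same setting expressed as a .reg file (this just mirrors the key and value named above; it can also be set through Group Policy):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\System]
"DeleteRoamingCache"=dword:00000001
```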
Re:This is probably a stupid question, but... (Score:2)
It doesn't copy the whole profile unless the user is logging into that machine for the first time, it only copies changes.
Set it up on the network (Score:1)
If you have 90 days, get the system set up to do everything on the network as fast as you can (you only have 90 days). Set it up as if you were going to replace your current computers, and then do just that: put it where it's supposed to go if you were to buy one. See the load it gets and log everything.
You're not part of a multi-billion-dollar company; teachers and students can make it without the internet (i
Re:Set it up on the network (Score:2)
Kids these days.
Re:Set it up on the network (Score:1)
Why? (Score:2)
AP Computer Science students (Score:2)
If nothing else, put it on the network and invite them to use it as much as possible.
Re:AP Computer Science students (Score:2)
It's fascinating that they managed to keep file sharing working and let us access our folders while completely removing the TCP/IP stack from Windows.
This is also the reason they won't upgrade past Win98 -- you can't remove the IP stacks from 2000/XP.
As you can imagine, my school treats its students like criminals and is quite backward in many of its policies... they're scared to death that one of us is going to hij
Re:AP Computer Science students (Score:2)
It still uses IPX, not IP, unless configured otherwise.
Re:AP Computer Science students (Score:2)
Obvious solution? (Score:3, Funny)
-paul
Simple answer: don't use roaming profiles... (Score:1, Insightful)
Some thoughts on this kind of testing (Score:4, Interesting)
1. Come up with a (hopefully short) list of items that need to be observed under load. For example: authenticating users and allowing logins, or serving up files.
2. For each testing item identified, find a repeatable, scriptable method of testing it. Perl will be your best friend here. For questions, consult the great folks at perlmonks.org. You can find very nice modules that will allow you to create scripts that authenticate to your test server and report back a time. If you dump all your responses into CSV format, it will make the reporting of the data even easier.
3. Write a separate script to track system performance during the testing period. I couldn't actually tell from your question whether the servers you are testing are Windows or *nix, but whatever the OS, you'll probably want a snapshot of overall CPU utilization, run queue size, swap use, disk access times, and memory consumption.
4. Be sure you are testing what you think you're testing! Your test results will be invalid once you hit a limiting factor. So if your network link gets overloaded by your test script, then your file transfer test is not a good one.
5. Design your testing scenarios carefully. It's probably best to have as short a test list as possible, and then make the tests as iterative as possible. For example, the first test has one user log on and transfer a file. The second test has 10 simultaneous users log on and transfer a file. The third test has 100 users involved, etc.
The key to all of this stuff is that you want tests that are repeatable, valid, and indicative of production use. The nice thing about developing these test scripts is that you can use them to test hardware from several different vendors. If it seems daunting, just remember that it's better to spend a week of late nights developing the scripts than to spend the next several months wondering if you're implementing hardware that will hold up to your production workload!
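The stepped, CSV-producing harness the parent describes could be sketched like this. The parent suggests Perl; this is just a Python version of the same structure, and the placeholder operation is an assumption — in a real run it would be replaced with code that authenticates to the test server and/or transfers a file:

```python
# Stepped load-test sketch: run an operation with 1, 10, 100... concurrent
# workers, time each run, and emit the timings as CSV for later reporting.
import csv
import io
import time
from concurrent.futures import ThreadPoolExecutor

def placeholder_operation():
    """Stand-in for 'log in and transfer a file'; replace for real tests."""
    time.sleep(0.01)

def timed_run(operation):
    """Run one operation and return its elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    operation()
    return time.perf_counter() - start

def run_steps(operation, steps=(1, 10, 100)):
    """Return CSV text with one row per (step size, worker, elapsed seconds)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["clients", "worker", "elapsed_s"])
    for n in steps:
        # One pool per step: n workers all run the operation at once.
        with ThreadPoolExecutor(max_workers=n) as pool:
            results = list(pool.map(lambda i: timed_run(operation), range(n)))
        for i, elapsed in enumerate(results):
            writer.writerow([n, i, f"{elapsed:.4f}"])
    return buf.getvalue()

if __name__ == "__main__":
    print(run_steps(placeholder_operation))
```

The CSV output keeps the per-worker timings, so you can see not just averages but how the slowest client fares as the step size grows.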
Linux? 1000 users? (Score:1, Troll)
You buy blades to serve 1000 users on Linux?
For most purposes, like OpenLDAP logins, simple database queries, PAM logins, mail serving, a simple firewall, etc., you can get away with a simple Athlon64 server for up to 10,000 users. Roaming profiles in Windows are REAAALY heavy; think of the delay when you're logging in on someone else's workstation... Linux's authentication isn't that heavy at all.
Just get an xSeries 206 ($500) and be done with it.
oooorrrrrrrr (Score:1)
this is a simple file serving problem... (Score:1, Interesting)
Do you have enough network bandwidth? 1000 computers will easily saturate fast ethernet connections. You need gigabit ethernet or better.
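To put a rough number on that claim, here is a back-of-envelope sketch. The 10 MB average profile size and the idea of a single shared uplink are assumptions for illustration, not figures from the question:

```python
# Back-of-envelope: how long does a morning login wave take if 1000
# clients each pull a 10 MB roaming profile over one shared link?
# Ignores all protocol overhead, so these are best-case numbers.
clients = 1000
profile_mb = 10                       # assumed average profile size
total_bits = clients * profile_mb * 8 * 1_000_000

for name, link_bps in [("Fast Ethernet", 100e6), ("Gigabit", 1e9)]:
    seconds = total_bits / link_bps   # ideal wire speed
    print(f"{name}: {seconds / 60:.1f} minutes to move all profiles")
```

On these assumptions, Fast Ethernet needs over 13 minutes of pure wire time for the wave, versus well under 2 minutes on gigabit, before any protocol overhead or contention is counted.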
And frankly, a bunch of blade servers isn't a good way to solve a file serving problem. You want either a bunch of NAS boxes (network attach storag