Networking

Testing Pre-Production Servers Accurately?

An anonymous reader asks: "Having been granted a 90-day demo enclosure of new blade servers from a major vendor, the question I find myself asking is: on a limited budget, how does one simulate 1000+ attached clients and the activity of those clients? We're a K-12 school district, and our current servers don't keep up with all the roaming-profile abuse from our Windows workstations. Are there tools or tricks available to simulate load on NetWare/Linux servers? The user groups around here usually answer this question with 'Get some workstations for a test lab!', but there's got to be a less expensive option, right? Can we leverage our existing client populous to achieve our goal, without interrupting or changing the quality of service at the desktop, substantially?"
  • It's extremely difficult to create useful benchmarks of high loads. Leave it to professionals: find or buy an evaluation of the type of system you're interested in. End-user reviews of systems they have purchased would be great, but too many manufacturers try to quash negative comments or inject positive ones, so it's hard to find unbiased advice.
  • by martin ( 1336 ) <maxsec.gmail@com> on Friday May 13, 2005 @12:36PM (#12520914) Journal
    Talk to the Mercury people (the LoadRunner folks) and others about load generation. They'll sell you some nice programs for unit-testing and load-testing systems...
  • by Anonymous Coward
    Store some MP3s and some porn, especially MPEGs, on it and then connect it to the network. You don't even need to announce it; within hours every workstation on your network will be hitting the servers, and if it fails or is removed at a later date, no one will have the nerve to complain.
  • Login Scripts (Score:5, Interesting)

    by dJCL ( 183345 ) on Friday May 13, 2005 @12:44PM (#12521026) Homepage
    Just add a program to the login scripts that runs some tests against the server in the background.

    Then when everyone logs in in the morning, it just overloads the thing...

    And simulates the load spikes you see. I don't know of a program or script that would do this off the shelf, but I'm sure someone out there does -- a rough sketch of the idea follows.

    JC
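
    A minimal sketch of such a probe, in Python for illustration -- the \\testserver share path and the payload size are placeholder assumptions, not anything from the thread:

      # Login-time load probe: write a small blob to the test server, read it
      # back, and report the elapsed time. Paths and sizes are placeholders.
      import os
      import time
      import uuid

      TEST_DIR = r"\\testserver\loadtest"   # hypothetical share on the demo blades
      PAYLOAD = os.urandom(256 * 1024)      # 256 KB of junk per login; tune to taste

      start = time.time()
      path = os.path.join(TEST_DIR, f"probe-{uuid.uuid4().hex}.bin")
      with open(path, "wb") as f:           # write...
          f.write(PAYLOAD)
      with open(path, "rb") as f:           # ...read back...
          f.read()
      os.remove(path)                       # ...and clean up after ourselves
      print(f"login probe took {time.time() - start:.2f}s")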
  • by plsuh ( 129598 ) <plsuh&goodeast,com> on Friday May 13, 2005 @12:49PM (#12521109) Homepage
    We're a K-12 school district ... Can we leverage our existing client populous [sic] to achieve our goal, without interrupting or changing the quality of service at the desktop, substantially?"

    You're gonna hate the answer, but this will give you a better test than anything else. Plug in your test system and get a bunch of the kids to help you out on a weekend. Have them do logins, logouts, play games, surf, write and save papers, etc. on throwaway accounts that go to the test server.

    Write out a test plan -- how many clients, how many local, how many remote, how many you start with, and what the step size is (e.g., start with 5 clients, then 10, then 15, then 20, then 30, etc.); a skeleton of such a ramp is sketched below. Profile your existing systems so that you know what's really creating the load on them. Is it really the roaming profiles, or is it web-site caching, or something else? Good luck with it.

    --Paul
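
    Something like the following skeleton could drive that ramp. The step sizes are the example numbers above, and client_session is a placeholder for real scripted work (log in, copy a profile-sized blob, log out):

      # Stepped load ramp: run N placeholder client sessions concurrently,
      # timing each step, for increasing N.
      import time
      from concurrent.futures import ThreadPoolExecutor

      STEPS = [5, 10, 15, 20, 30]

      def client_session(i):
          time.sleep(0.5)  # stand-in for a real login/transfer workload

      for n in STEPS:
          t0 = time.time()
          with ThreadPoolExecutor(max_workers=n) as pool:
              list(pool.map(client_session, range(n)))
          print(f"{n:>3} concurrent clients: {time.time() - t0:.1f}s")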
  • What's "roaming-profile abuse?"
    • Perhaps it's a reference to the fact that Windows *copies* your profile to each machine when you log in, so if users have hundreds of megs in their documents, etc., then the network quickly becomes busy.
      • Once again I thank my lucky stars that I was born a Unix geek and not a Windows geek... :'}

        Thanks for the explanation!
      • by JonathanX ( 469653 ) on Friday May 13, 2005 @01:28PM (#12521565)
        So why not just point each user's "My Documents" folder to a network share? This will keep the profile much smaller. I guess I just took it for granted that this was a standard practice. I've done it for years.
        • by Anonymous Coward on Friday May 13, 2005 @02:16PM (#12522151)
          So why not just point each user's "My Documents" folder to a network share?

          Because most admins, more than 60%, don't RTFM and are therefore unaware that the My Documents folder can be redirected through the use of Group Policy. Even more admins, more than 80%, don't know that the IE cache directory should also be redirected, to prevent multi-megabyte profiles due to browser cache.

          Of course, even those admins who go to the trouble of doing these things, and many more, to minimize the profile size are torpedoed by users who save their downloads and documents to their desktops. I even saw an admin with a Windows 2000 service pack (>120 MB) saved to his desktop while using roaming profiles.

          The fact is that the roaming-profile implementation is crap! Nothing should be copied over the network at login; everything should be stored on a network drive and accessed directly from there. This is how Unix does it, with remotely mounted home directories: files don't cross the network if you don't access them. If you want to implement caching for slow links or disconnected access, as Windows does, that's fine, but copying everything over the wire every time someone logs in is just stupid.
          • And the really smart admins, of course, put a quota on the profile space to stop people abusing it by dropping zillions of files on the desktop.
          • If you want to implement caching for slow links or disconnected access, as Windows does, that's fine, but copying everything over the wire every time someone logs in is just stupid.

            And Windows doesn't - it only copies *changes* (although, obviously, if it's the first time a user logs in the whole profile will be copied).

          • Do you have a URL for some documentation describing how to do that without using a Windows 2000 server? Meaning on a network with a Linux Samba server and Win2k/XP workstations as clients.
        • Because in WinXP, copying the whole profile gives you the user's application settings, cookies, desktop, etc....

          It is nice to be able to log on anywhere and have everything be exactly the same.

          On the other hand, I do think the way in which Microsoft handles this is a tad inefficient. I think Novell offers a better solution to this, but I honestly don't have a clue.
        • Speaking as an IT person at a K-12 school district, I can tell you one thing that you can't do: you can't always use group policies the way you want to. In our case, our infrastructure is pretty good (fiber from our NOC to every school and admin building, gigabit Ethernet to the MDFs, 100 Mbps to the desktops, and a mix of Win2K and Win2K3 servers (not my choice...)). But our clients, about half of them, bite balls: Pentium 1 machines running Windows 98 and NT, with little or no money to actually replace them...
      • I really hate the way Windows handles roaming profiles. One thing that you can do to partially alleviate the problem is to prevent the local machines from caching profiles.

        In the registry create a value named DeleteRoamingCache of type REG_DWORD and set it to 1 under the HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\System registry subkey.

        This doesn't solve the delays in copying the profiles around, but it does prevent them from filling up the hard drive; a scripted version of the edit follows. I particularly like this because it eliminates...
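
        For illustration, the same registry edit scripted with Python's standard-library winreg module (Windows-only, and it needs administrative rights):

          # Set the DeleteRoamingCache policy value described above.
          import winreg

          key = winreg.CreateKeyEx(
              winreg.HKEY_LOCAL_MACHINE,
              r"Software\Policies\Microsoft\Windows\System",
              0,
              winreg.KEY_SET_VALUE,
          )
          winreg.SetValueEx(key, "DeleteRoamingCache", 0, winreg.REG_DWORD, 1)
          winreg.CloseKey(key)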
      • Perhaps it's a reference to the fact that Windows *copies* your profile to each machine when you log in, so if users have hundreds of megs in their documents, etc., then the network quickly becomes busy.

        It doesn't copy the whole profile unless the user is logging into that machine for the first time; it only copies changes.

  • Well, just to start I'll tell a bit about myself: I'm a grade 12 student at Bishop O'Byrne High School.

    If you have 90 days, get the system set up to do everything on the network as fast as you can (you only have 90 days). Set it up as if you were going to replace your current servers, and then do just that: put it where it is supposed to go if you were to buy one. See the load it gets and log everything.

    You're not part of a multi-billion-dollar company; teachers and students can make it without the Internet (i...
  • I can nearly guarantee that whatever company is providing you the blade servers has already done a ton of quantitative testing [or had someone else do it for them] for the sort of thing you're looking for. Just ask them for the data.
  • Talk to the AP computer science teachers about getting their students to develop a load generator solution. It should make an interesting extra credit assignment for them.

    If nothing else, put it on the network and invite them to use it as much as possible.

    • This is exactly why the computers in our CompSci lab don't have internet access.

      It's fascinating that they managed to keep file sharing working and let us access our folders while still completely removing the TCP/IP stack from Windows.

      This is also the reason they won't upgrade past Win98 -- you can't remove the IP stacks from 2000/XP.

      As you can imagine, my school treats its students like criminals and is quite backward in many of its policies... they're scared to death that one of us is going to hijack...
  • by crimethinker ( 721591 ) on Friday May 13, 2005 @02:14PM (#12522123)
    I'm surprised that no one has mentioned the obvious solution: set up a webserver and post the link to Slashdot. References to Natalie Portman, hot grits, and torrents of a leaked telecine rip of Episode III should help.

    -paul

  • by Anonymous Coward
    I work for a high school district, and roaming profiles are nothing but trouble. I've found that it works far better to just not use roaming profiles at all. Reasons for this are:
    1. Roaming profiles become enormous, often growing to over 20 MB. A single file doesn't take long to copy, but roaming profiles consist of many smaller files, oftentimes over 5,000 of them, and 20 MB divided up into 5,000 files takes a significant amount of time to copy (a back-of-envelope calculation follows this comment). In a school setting where someone could log
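
    To make that arithmetic concrete, a quick sketch -- the per-file overhead and link speed are assumed round numbers, not measurements:

      # Why 5,000 small files hurt more than one 20 MB file (assumed numbers).
      files = 5000
      size_mb = 20
      per_file_overhead_s = 0.02    # assume ~20 ms of protocol round-trips per file
      bandwidth_mbps = 100          # assume Fast Ethernet

      bulk_s = size_mb * 8 / bandwidth_mbps       # ~1.6 s as one big file
      overhead_s = files * per_file_overhead_s    # ~100 s of pure per-file overhead
      print(f"one file: ~{bulk_s:.1f}s; {files} files: ~{bulk_s + overhead_s:.0f}s")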
  • by machinecraig ( 657304 ) on Friday May 13, 2005 @03:33PM (#12523056)
    This is very doable, but there are a few gotchas. Your plight made me think about how I would do it if I were in your shoes. Here's what I would do:

    1. Come up with a (hopefully short) list of items that need to be observed under load -- for example, authenticating users and allowing logins, and serving up files.
    2. For each testing item identified, find a repeatable, scriptable method of testing it. Perl will be your best friend here; for questions, consult the great folks at perlmonks.com. You can find very nice modules that will allow you to create scripts that authenticate to your test server and report back a time. If you dump all your responses into a CSV format, it will make reporting the data even easier. (A sketch of this kind of harness follows this comment.)
    3. Write a separate script to track system performance during the testing period. I couldn't actually tell from your question whether the servers you are testing are Windows or *nix, but whatever the OS, you'll probably want a snapshot of overall CPU utilization, run-queue size, swap use, disk access times, and memory consumption.
    4. Be sure you are testing what you think you're testing! Your test results will be invalid once you hit a limiting factor: if your network link gets overloaded by your test script, then your file-transfer test is not a good one.
    5. Design your testing scenarios carefully. It's probably best to have as short a test list as possible, and then make the tests iterative. For example, the first test has 1 user log on and transfer a file; the second has 10 simultaneous users log on and transfer a file; the third has 100 users involved; etc.

    The key to all of this stuff is that you want tests that are repeatable, valid, and indicative of production use. The nice thing about developing these test scripts is that you can use them to test hardware from several different vendors. If it seems daunting, just remember that it's better to spend a week of late nights developing the scripts than to spend the next several months wondering whether the hardware you're implementing will hold up to your production workload!
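
    A rough Python sketch of steps 2-4 (the comment suggests Perl; Python is used here only for illustration). The server name and port are placeholders, and a bare TCP connect stands in for a real login test -- swap in real authentication or file-transfer work:

      # Time a repeatable operation at increasing step sizes, snapshot the
      # 1-minute load average (Unix only), and append everything to a CSV.
      import csv
      import os
      import socket
      import time

      SERVER = "testserver.example"   # hypothetical test box
      PORT = 445                      # e.g., the SMB port

      def connect_and_login():
          """Placeholder test: a bare TCP connect; replace with a real login."""
          with socket.create_connection((SERVER, PORT), timeout=10):
              pass

      with open("results.csv", "w", newline="") as f:
          writer = csv.writer(f)
          writer.writerow(["step", "iteration", "seconds", "success", "loadavg1"])
          for step in (1, 10, 100):   # the iterative sizes from point 5
              # Sequential here for simplicity; real "simultaneous users"
              # would use threads, as in the ramp skeleton earlier.
              for i in range(step):
                  t0 = time.time()
                  ok = True
                  try:
                      connect_and_login()
                  except OSError:
                      ok = False
                  elapsed = time.time() - t0
                  load1 = os.getloadavg()[0] if hasattr(os, "getloadavg") else ""
                  writer.writerow([step, i, f"{elapsed:.3f}", ok, load1])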
  • Come on.

    You buy blades to serve 1000 users on Linux?

    For most purposes -- OpenLDAP logins, simple database queries, PAM logins, mail serving, a simple firewall, etc. -- you can get away with a simple Athlon64 server for up to 10,000 users. Roaming profiles in Windows are REALLY heavy; think of the delay when you're logging in on someone else's workstation. Linux's authentication isn't that heavy at all.

    Just get an xSeries 206 ($500) and be done with it.
  • by Anonymous Coward
    There is lots of test data out there. You could use the ServerBench test suite from PC Magazine -- it's free. It uses a bunch of client PCs to generate huge amounts of network file-serving load (read/write, big files/small files).

    Do you have enough network bandwidth? 1000 computers will easily saturate Fast Ethernet connections: even a modest 100 kbps average per client adds up to 100 Mbps. You need Gigabit Ethernet or better.

    And frankly, a bunch of blade servers isn't a good way to solve a file-serving problem. You want either a bunch of NAS boxes (network attached storage)...
