

Website Load Testing Tools?
bLanark asks: "I'm in the process of converting one of my clients from IIS/CGI to Apache/mod_perl. I need a (free) web site stress/load test tool to prove that performance will be increased.
Using this page as my starting point, I can see that there are quite a few tools to investigate. Has anyone used any of these (or any others), and what are they like? I need HTTP GETs, form POSTing, and clever stuff like simulation of image caching would be useful too, I guess." The previous story didn't get much of a response, but that was about a year ago; this time the submitter has shared a fairly impressive list with us, and impressions of any piece of software on it would be greatly appreciated.
Easy (Score:5, Funny)
New unit of site capacity (Score:4, Funny)
Re:New unit of site capacity (Score:1)
Re:New unit of site capacity (Score:2)
Watch for the browser distribution to shoot from 0.01% Mozilla to about 95% Mozilla!
But seriously... a Slashdotting usually takes the server from light load to an extreme load in a matter of seconds... remaining at max load for about 4 hours (~2 million unique visitors an hour) and then dropping back to normal over the next 12 hours.
Re:New unit of site capacity (Score:1)
Re:New unit of site capacity (Score:3, Funny)
That's 83,000 (rounded) views per hour, taking the ~2 million uniques above as a daily figure, if you have a single page with no pictures. I suppose we can use that as a basis, and people can calculate out from there if they want to. So:
1 Slashdot (Sd) = 83,000 views (per hour).
1 CentiSlash (cSd) = 830 views.
1 MilliSlash (mSd) = 83 views.
1 MegaSlash (MSd) = 83 billion views.
The average high-volume news site (CNN, ABC, the Register) can probably withstand 2-3 Sd, while a potato-powered or C64 server could only withstand 10-20 cSd. Of course, the lower-powered servers will have less to download, so a single Sd may be manageable.
Again, this gets into how many pages/images Slashdot is linked to.
Further study is advised.
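For anyone who does want to calculate out from there, a throwaway Python snippet does the unit math (the 83,000 base figure is from the parent post):

    # Toy converter for the proposed unit; 83,000 views/hour is the parent's figure.
    BASE = 83_000  # views per hour in one Slashdot (Sd)

    PREFIXES = {"MSd": 1e6, "Sd": 1.0, "cSd": 1e-2, "mSd": 1e-3}

    def views_per_hour(amount, unit):
        """Convert an amount in some Slashdot unit to views per hour."""
        return amount * PREFIXES[unit] * BASE

    print(views_per_hour(2.5, "Sd"))   # a high-volume news site: 207,500 views/hour
    print(views_per_hour(15, "cSd"))   # a potato-powered server: 12,450 views/hour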
Re:New unit of site capacity (Score:1)
This includes the activity which gets bopped around the Internet and doesn't get to your server.
Re:New unit of site capacity (Score:1)
Perhaps this should be adjusted, IMHO: one Slashdot could be the number of simultaneous HTTP requests needed to take down a machine, on a varying scale. This is variable, though; some machines (remember our gamer-table friend) can't withstand much activity before melting, while others can take insane loads, run SETI, run a Quake server, recompile their code, process multiple MySQL requests, and execute endless loops in less than one second without so much as blinking. (Hey, I want that last box!)
Great, thanks to Slashdot, I'm now on the search for a definable unit =/
Designing the tool (Score:3, Insightful)
I would also think that it would have to be distributed (at least across several computers) to ensure that the bandwidth and CPU/memory of the testing machine are not the limiting factors.
How hard can it be to write something like that? The hardest part would probably be parsing the log files. I would not be surprised if there are several testing tools that would do an adequate job.
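For what it's worth, the core of such a tool really is small; here's a minimal sketch in Python, where the URL list, worker count, and request count are all placeholders. Run copies of it on several machines to keep the client side from being the bottleneck:

    # Minimal sketch of the tool described above: hit a list of URLs from many
    # threads and report latency. URLs and counts below are placeholders.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URLS = ["http://localhost/"]   # pages to hammer (placeholder)
    WORKERS = 20                   # simulated concurrent users
    REQUESTS = 50                  # requests per worker

    def worker():
        timings = []
        for _ in range(REQUESTS):
            for url in URLS:
                start = time.monotonic()
                with urllib.request.urlopen(url) as resp:
                    resp.read()    # drain the body, as a browser would
                timings.append(time.monotonic() - start)
        return timings

    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        futures = [pool.submit(worker) for _ in range(WORKERS)]
        timings = sorted(t for f in futures for t in f.result())

    print("%d requests, median %.3fs, worst %.3fs"
          % (len(timings), timings[len(timings) // 2], timings[-1]))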
wast (Score:1, Informative)
Apache comes with one (Score:3, Informative)
Be sure to simulate "real life" (Score:4, Insightful)
In a previous job we needed to do some benchmarking and testing. The first pass was to script some stuff using wget. Once everything looked fine (we could really slam the server) we turned it loose on the public.
Oops. The site collapsed nearly instantly. The problem was that people on slow modem connections kept connections active for a couple of orders of magnitude longer than happened on our internal network, and the server ran out of resources.
Microsoft's free tool can simulate a mix of connection speeds, and I believe you can find similar functionality in many of the web test tools that turn up in a freshmeat.net search.
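You can also approximate that slow-modem behaviour yourself. A rough Python sketch: drain each response at a capped byte rate, so every request holds a server connection open for a realistic length of time. The 4800 bytes/sec figure is a guessed stand-in for a 56k modem:

    # Hedged sketch: pace response reads to simulate a slow client connection.
    import time
    import urllib.request

    def slow_fetch(url, bytes_per_sec=4800, chunk=512):
        with urllib.request.urlopen(url) as resp:
            while True:
                data = resp.read(chunk)
                if not data:
                    break
                time.sleep(len(data) / bytes_per_sec)  # pace like a modem

    slow_fetch("http://localhost/")  # placeholder URL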
MS's free tool. You mean Homer? (Score:2)
What it doesn't let you do is make a hop from HTTP to HTTPS, nor does it simulate caching of images, nor many of the other things an actual user would do. JavaScript is right out. For small, simple apps it worked OK, but once things get complex, Homer just can't handle it. D'OH!
It isn't free, but... (Score:1)
-j
Simulating "real life" is tough. (Score:2, Insightful)
Re:Simulating "real life" is tough. (Score:1, Informative)
Oh, another way to detect bottlenecks that 'ps' won't show you is to wrap all your CGI scripts with start/stop logging commands. When you turn on a setting, all your scripts log their start time, end time, CPU time, etc. That helps you find where to study further.
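Sketched as a Python decorator, the idea looks like this (the original scripts were presumably Perl; the environment variable and log path here are made up, and the resource module is Unix-only):

    # Sketch of the start/stop logging idea: time wall clock and CPU per call.
    import os
    import time
    import resource

    PROFILE = os.environ.get("CGI_PROFILE") == "1"   # the on/off "setting"

    def timed(fn):
        def wrapper(*args, **kwargs):
            if not PROFILE:
                return fn(*args, **kwargs)
            t0 = time.time()
            r0 = resource.getrusage(resource.RUSAGE_SELF)
            try:
                return fn(*args, **kwargs)
            finally:
                r1 = resource.getrusage(resource.RUSAGE_SELF)
                with open("/tmp/cgi-profile.log", "a") as log:  # placeholder path
                    log.write("%s start=%.3f wall=%.3fs cpu=%.3fs\n" % (
                        fn.__name__, t0, time.time() - t0,
                        (r1.ru_utime + r1.ru_stime) - (r0.ru_utime + r0.ru_stime)))
        return wrapper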
Siege (Score:3, Informative)
I really like another poster's idea of having a load tester read actual log files from Apache and then simulate real user activity. The only problem I can see with this method is that if you changed the layout of your site, all the program would get is a bunch of 404s. However, if one were so motivated, one could hack up such a thing relatively easily, I think; analog [analog.cx] can parse Apache/httpd log files, so it couldn't be all that hard. Siege works well for me, though, so I'll stick with it.
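The replay idea really is only a few lines. This Python sketch pulls GET requests out of a common-log-format file, skips entries that already returned 404, and reports URLs that fail now (log path and host are placeholders):

    # Hedged sketch of log replay: re-issue GETs from an Apache access log.
    import re
    import urllib.request

    LOG_LINE = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3})')

    def replay(logfile, base="http://localhost"):
        for line in open(logfile):
            m = LOG_LINE.search(line)
            if not m or m.group(2) == "404":
                continue                       # don't replay known dead links
            try:
                urllib.request.urlopen(base + m.group(1)).read()
            except Exception as e:
                print("now failing:", m.group(1), e)  # layout change, perhaps

    replay("/var/log/apache/access.log")       # placeholder path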
Re:Siege (Score:1)
Using scout and siege, with even a simple shell script, you can blast the bejezus out of a web site. (Oops, may not want to say that on slashdot...)
Re:Siege (Score:1)
JMeter is cool... (Score:1)
I've also used Mercury's LoadRunner, a fairly expensive and complex beast; it comes with a built-in proxy that lets you capture a browsing session as the basis for a test scenario, which is a particularly useful feature. The scenarios can be scripted, and there's a bunch of random timers you can use to represent "real-world" behaviour.
I'd suggest creating two types of test scenario. The first type represents your best guess at "real-world" usage, i.e. users browsing, spending time reading a page, clicking all over the shop, etc. This should give you a good idea of how your application is going to perform.
The second type of test should provide you with metrics you can use when making improvements once you spot bottlenecks. They don't necessarily represent "real-world" activity, but they let you measure whether code changes have made a difference. For example, you might have a test where all users log into the application simultaneously: in version 0.1 you can support 10 concurrent users, in version 0.5 100, and in version 1.0 250. These numbers don't represent "real" users (how likely are they all to log in at the same time?), but it's a lot easier to manage your performance improvement work using metrics like these.
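A sketch of that second kind of test in Python: N users log in at exactly the same moment (released by a barrier) and we count successes, which gives a repeatable number to track across versions. The login URL and form fields are assumptions:

    # Hedged sketch: simultaneous-login metric test. URL/fields are made up.
    import threading
    import urllib.parse
    import urllib.request

    N = 100
    barrier = threading.Barrier(N)
    ok = 0
    lock = threading.Lock()

    def login(i):
        global ok
        data = urllib.parse.urlencode(
            {"user": "test%d" % i, "pass": "secret"}).encode()
        barrier.wait()                     # everyone fires at the same instant
        try:
            urllib.request.urlopen("http://localhost/login", data).read()
            with lock:
                ok += 1
        except Exception:
            pass                           # a failed login counts against us

    threads = [threading.Thread(target=login, args=(i,)) for i in range(N)]
    for t in threads: t.start()
    for t in threads: t.join()
    print("%d/%d simultaneous logins succeeded" % (ok, N))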
Re:JMeter is cool... (Score:1)
FYI, it also does the HTTP proxy recording thing now.
Check out The Grinder (Score:2, Informative)
The Grinder (http://grinder.sourceforge.net [sourceforge.net]) is an open-source, Java-based load-generating tool. It is multi-threaded, distributed, and scriptable, and has a central statistics-reporting mechanism (the console). It was used extensively by the authors of J2EE Performance Testing with BEA WebLogic Server, and that book has excellent instructions and recommendations for its use.
I suggest you check it out.
JMeter, Siege, Mercury Interactive (Score:1)
Good luck!
The Grinder, again (Score:3, Informative)
Among (many) other things, The Grinder has a built-in proxy that allows it to record a browsing session and play it back later. Basically, you start the proxy, set your browser to use it, browse your website, and get back a log of your actions complete with timings and POST values. One other cool feature is that it lets you define your own data sources, so you can fill POSTs and GETs with custom data. Last but not least, the author of the tool will personally answer your questions, albeit slowly.
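For flavour, Grinder test scripts are written in Jython; a skeleton along the lines of the project's documented example looks like the following, though the module and class names here are from memory, so treat them as assumptions and check the current docs:

    # Skeleton Grinder (Jython) script; names may differ by Grinder version.
    from net.grinder.script import Test
    from net.grinder.script.Grinder import grinder
    from net.grinder.plugin.http import HTTPRequest

    test = Test(1, "Front page")
    request = test.wrap(HTTPRequest())     # timings get reported to the console

    class TestRunner:
        def __call__(self):
            request.GET("http://localhost:8080/")  # placeholder URL
            grinder.sleep(1000)            # think time, in milliseconds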
Burn Karma Burn... (Score:2)
Check out WAST. It's free and, IMHO, better and easier to use than ab (ApacheBench)... although that's comparing tangerines to oranges.
http://webtool.rte.microsoft.com/ [microsoft.com]
S
The tool to use (Score:2)