
What Gives The Best Embedded Perl Performance?

ChetPan asks: "I'm doing some development for a couple of student-run sites at NC State University. We want to make dynamic portal-like customizable pages, but aren't really sure what to use to output the actual HTML, since the main concern here is performance. We have most of our code in Perl modules and we're wondering about the main (heavy traffic) pages. I don't know much about embperl or eperl and how they compare to regular mod_perl under Apache. Are there other embedded Perl solutions? Would we get a performance boost by using PHP? Unfortunately I haven't seen much neutral comparison of the different embedded types on the Web. I would especially appreciate seeing some benchmarks and/or statistics on the topic."
This discussion has been archived. No new comments can be posted.

  • ...it's the overhead. If you can make do without the "generic" Perl/HTML pieces like Embperl, you are probably going to notice a difference mainly in memory. You probably won't notice a speed difference between a purely custom interface written directly for mod_perl versus a more generic one from CPAN. But what you will notice is that the memory requirements of the generic one will be higher.

    But since a mod_perl-enabled Apache process is already vastly larger than a non-mod_perl process, the little bit of overhead for Embperl or whatever won't make much of a difference. The REAL trick is to make sure that ONLY dynamic content is served from mod_perl processes and everything else is served from slim Apache processes. mod_rewrite, mod_proxy, and mod_include will be very helpful for this.

    --Mike
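    The two-tier setup described above might look roughly like this in Apache 1.3-era configuration. This is a minimal sketch, not a complete httpd.conf; the port number and URL prefix are illustrative:

    ```apache
    # --- Front-end httpd.conf: slim Apache, no mod_perl compiled in ---
    # Serves all static files itself and forwards only dynamic URLs
    # to the heavyweight back-end.
    Listen 80
    ProxyPass        /perl/ http://localhost:8080/perl/
    ProxyPassReverse /perl/ http://localhost:8080/perl/

    # --- Back-end httpd.conf: mod_perl-enabled Apache ---
    Listen 8080
    <Location /perl>
        SetHandler perl-script
        PerlHandler Apache::Registry
        Options +ExecCGI
    </Location>
    ```

    The point of the split is that each fat mod_perl process spends its time only on requests that actually need Perl, while images and plain HTML never tie one up.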
  • It won't win any "Hello, World" benchmark contests (and, therefore, is probably unsuited for your described needs), but Mason [masonhq.com] is another "embedded perl" candidate, which is well-suited for robust application development.
  • by schon ( 31600 )
    The latest version of Roxen webserver [roxen.com] supports embedded Perl (among many other languages)..

    from the Roxen Perl Support Module page:
    With the "Script output" option of the perl support module set to HTTP, it is possible to run old perl CGI scripts more or less as they are, only with better performance, provided they don't rely on their environment being reset upon each run of the script. (This performance boost comes in part from the script already being loaded and compiled, thus staying resident in-between requests and in part from the fact that there is no need to fork off new processes for the script.)


    It also supports the mod_perl API, for compatibility.
  • by crisco ( 4669 ) on Wednesday December 27, 2000 @07:39AM (#540127) Homepage
    He mentions customizable, which I interpret as dynamically generated for at least the visitors that take the time to customize the page.

    HTML::Template might be a place to look, while it doesn't give you embedded code in a page it forces you to separate content from presentation, which is often a bigger advantage down the road.

    There have been some unofficial benchmarks of the various technologies (here is one, but it is biased [caucho.com]) but most I've seen are nearly irrelevant because the best thing they test is the looping ability and startup time of the various technologies. And none of these covered the very thing you ask, the page embedded solutions for Perl.

    From my point of view, most of these technologies are on par with each other, there aren't going to be orders of magnitude differences by switching.

    The mod_perl performance pages [apache.org] offer quite a bit of information on tuning your application; I wouldn't be surprised to find that your biggest performance obstacles are in the code itself. You mention "we": how about dividing the different embedded solutions among the team, picking a subset of your capabilities to implement in each, and benchmarking them? Then you can tell us which one is fastest. Better yet, you can give us a clue as to which one is better to develop with, any hidden obstacles to a good design, and so on.
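The content/presentation split that HTML::Template enforces looks like the following minimal sketch. The template text is inlined via `scalarref` purely to keep the example self-contained; in practice it would live in its own file, editable without touching the code. The page title and item labels are made-up values:

```perl
#!/usr/bin/perl
# HTML::Template sketch: the template carries the HTML, the code only
# supplies values. Template text is inlined here (scalarref) so the
# example is self-contained; normally it lives in a separate file.
use strict;
use warnings;
use HTML::Template;

my $tmpl_text = <<'HTML';
<h1><TMPL_VAR NAME=TITLE></h1>
<ul>
<TMPL_LOOP NAME=ITEMS><li><TMPL_VAR NAME=LABEL></li>
</TMPL_LOOP></ul>
HTML

my $tmpl = HTML::Template->new(scalarref => \$tmpl_text);

# The code's only job: fill in the named slots.
$tmpl->param(TITLE => 'Campus News');
$tmpl->param(ITEMS => [ { LABEL => 'Headlines' }, { LABEL => 'Sports' } ]);

print $tmpl->output;
```

Because the designers only ever see TMPL_VAR and TMPL_LOOP tags, they can restyle pages without being able to break the Perl.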

  • by khkramer ( 31219 ) on Wednesday December 27, 2000 @09:29PM (#540128)

    I would agree with the posters who have recommended HTML::Mason (http://www.masonhq.com). Here's why:

    You can't have everything. For a given amount of money spent on web-server hardware, you can balance some combination of 1) fast, 2) dynamic, and 3) maintainable/extensible. For most software development work, the third consideration is by far the most important. And even for situations where efficiency is critical and labor is cheap, the third consideration is still the most important <laugh>.

    HTML::Mason is the best of the present possible worlds. Its "component"-based system lets you build extremely, granularly dynamic web sites. Code reuse is so simple and natural that you might even find yourself practicing it. And you can do anything from Mason that you can from Perl, so there's no lack of power or flexibility.

    And Mason has a big advantage over ([most/all]?) other embedded perl solutions: pretty good built-in cache control. Even if you don't care about maintainability, there's still an inherent trade-off between serving pages quickly and serving them dynamically. The only practical way to turn the knob on this trade-off is to do some kind of caching -- preferably in a highly configurable fashion. Mason does per-page and per-component caching with a few (relatively) simple calls.

    Benchmarks are only useful if they approximate your usage patterns pretty exactly, and there are almost certainly no such benchmarks out there that will satisfy you. The consensus on the Mason list is that Mason pages will serve about 2-3 times slower than similar pages coded in straight mod_perl. That speed difference is way, way, way more than made up for in efficiency of development (which includes things like the edit/test cycle, minimized debugging, code reuse, etc).

    And other than caching, there's very little Mason-specific performance tuning. Most everything you do to make Mason run well is really an effort at making mod_perl run well. Once you have a nice, two-tier mod_perl system set up, and are caching as many Mason components (or pages) as is practical, you are, in my experience, much more likely to be running short of memory than running out of CPU. We serve about 200,000 Mason page-views per day from each of our dual PIII-750 boxen with 1 GB of RAM. We cache almost all of our pages at the whole-page level for at least 30 seconds at a time, but we do serve a fair number of entirely-dynamic or heavily-customized pages. And we only bog down our processors when users do lots of archival searching, which unfortunately isn't something we can blame Mason for.

    Yours,
    Kwindla
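    The speed-versus-dynamism knob described above boils down to time-based caching. This is a generic pure-Perl sketch of the idea, not Mason's actual API; the function and key names are made up for illustration:

```perl
#!/usr/bin/perl
# Illustrative time-based page cache (NOT Mason's API): serve a cached
# body while it is fresh, regenerate it only after the TTL expires.
use strict;
use warnings;

my %cache;   # key => { body => ..., expires => epoch seconds }

# render_cached($key, $ttl, $builder): return the cached body if still
# fresh, otherwise call $builder->() and cache the result for $ttl secs.
sub render_cached {
    my ($key, $ttl, $builder) = @_;
    my $entry = $cache{$key};
    if ($entry && $entry->{expires} > time()) {
        return $entry->{body};           # cache hit: skip the expensive work
    }
    my $body = $builder->();             # cache miss: regenerate
    $cache{$key} = { body => $body, expires => time() + $ttl };
    return $body;
}

my $calls = 0;
my $page  = sub { $calls++; "<html>expensive page</html>" };

render_cached('front', 30, $page);
render_cached('front', 30, $page);   # second call is served from cache
print "builder ran $calls time(s)\n";
```

    Turning the TTL up trades freshness for throughput; per-component caching is the same trick applied to the little boxes on a portal page instead of the whole page.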

  • by po_boy ( 69692 ) on Wednesday December 27, 2000 @07:56AM (#540129)
    I would recommend that you take a look at Mason [masonhq.com].

    I'm not sure if the performance is as good as some other solutions, but I've written some pretty big, fast sites on it and I know others have. It runs under mod_perl, so that makes it a bit faster, but does limit you to Apache, I guess, in case that matters.

    It's a pretty good way to make component-based web applications and lets you separate the HTML stuff from the Perl a little more than some solutions do. This means that the artsy people can hack some components and the coders can hack others.

    The component-based aspects of it make it my tool of choice when I'm writing a set of pages that have a lot of little separate boxes, like a portal or something. It makes it a lot easier to keep track of which pieces of the pages are coming from which pieces of code.

    Hope it helps.

  • by fosh ( 106184 ) on Tuesday December 26, 2000 @10:59PM (#540130) Journal
    From your description, it sounds like this "portal page" is going to change fairly often, but is not going to be dynamically generated for each visitor. If this is the case, you might get the best performance by running the Perl script from a cron job every 30 minutes or so and having it output a static HTML page. If you were looking to create a more modular site, then perhaps you should implement something like this modularly. That is, have each module run every n minutes and generate a static "pagelet" for itself. Then the main page just concatenates a bunch of these together. I would guess that this sort of system would be by far the fastest.

    Good Luck
    --Alex the Fishman
    (sorry for the horrible grammar, it's late...)
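    The static-pagelet scheme described above can be sketched in a few lines of Perl. The directory and fragment names are made up for illustration; step 1 would be what each module's cron job runs, and step 2 is the cheap assembly:

```perl
#!/usr/bin/perl
# Sketch of the "static pagelet" idea: each module periodically writes
# its own HTML fragment, and the front page is just the fragments
# concatenated. Directory and file names here are illustrative.
use strict;
use warnings;

my $dir = "/tmp/pagelets";   # assumed fragment directory
mkdir $dir unless -d $dir;

# Step 1 (run from cron every n minutes, once per module):
# regenerate a single pagelet file.
sub write_pagelet {
    my ($name, $html) = @_;
    open my $fh, '>', "$dir/$name.html" or die "can't write $name: $!";
    print {$fh} $html;
    close $fh;
}

# Step 2 (cheap enough to run per request, or also from cron):
# concatenate every pagelet into the full page.
sub assemble_page {
    my @parts;
    for my $file (sort glob "$dir/*.html") {
        open my $fh, '<', $file or die "can't read $file: $!";
        local $/;                # slurp the whole fragment
        push @parts, <$fh>;
    }
    return join "\n", @parts;
}

write_pagelet(news    => "<div>headlines</div>");
write_pagelet(weather => "<div>forecast</div>");
print assemble_page(), "\n";
```

    Since the expensive work happens off the request path, the web server only ever hands out flat files, which is about as fast as serving gets.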
