


Slashdot Asks: Why Are Browsers So Slow? (766 comments)

Designer Ilya Birman writes: I understand why rendering a complicated layout may be slow. Or why executing a complicated script may be slow. Actually, browsers are rather fast at doing these things. If you studied programming and have a rough idea of how many computations are made to render a page, it is surprising that browsers can do it all that fast. But I am not talking about rendering and scripts. I am talking about everything else. Safari may take a second or two just to open a new blank tab on a 2014 iMac. And with ten or fifteen open tabs it eventually becomes sluggish as hell. Chrome is better, but not by much. What are they doing? The tabs are already open. Everything has been rendered. Why does it take more than, say, a thousandth of a second to switch between tabs or create a new one? Opening a 20-megapixel photo from disk doesn't take any noticeable amount of time; it renders instantaneously. Browsers store their stuff in memory. Why can't they just show the pixels immediately when I ask for them? [...] Unfortunately, modern browsers are so stupid that they reload all the tabs when you restart them. Which takes ages if you have a hundred tabs. Opera was sane: it did not reload a tab unless you asked for it. It just reopened everything from cache. Which took a couple of seconds. Modern browsers boast about their rendering and script execution performance, but that's not what matters to me as a user. I just don't understand why programmers spend any time optimising for that while Chrome is laughably slow even by ten-year-old standards.

Do you agree with Birman? If yes, why do you think browsers are generally slow today?
This discussion has been archived. No new comments can be posted.

  • Why are they slow? (Score:5, Insightful)

    by nospam007 ( 722110 ) * on Wednesday December 21, 2016 @12:40PM (#53530431)

    Because we need a canvas-blocker add-on and a flash-blocker add-on and a cookie warning remover add-on and add-ons to remove all the ads on the page and we need add-ons to remove the dozens of trackers to protect our privacy and also add-ons that remove all the social buttons (twitter, FB, etc) which share our behavior even if we are not a member and add-ons that remove all the javascripts that load popups, do unwanted refreshes and Greasemonkey to make some pages readable and remove the ads that are inside the articles and we need an add-on to circumvent the anti-adblockers and .....

    • by nucrash ( 549705 ) on Wednesday December 21, 2016 @12:47PM (#53530499)

      Pretty much this.

      I feel that so many websites have gone to insane levels on ad loading that the web has become almost unusable by some standards.

      Yes, I say that with, at minimum, 12 tabs open, but most problems I have are around specific ad-revenue-driven, ad-block-loathing sites.

      • by JaredOfEuropa ( 526365 ) on Wednesday December 21, 2016 @12:52PM (#53530555) Journal
        It's not just ads; a lot of websites pull in JS helper scripts from other sources (instead of hosting local copies of their own). And those sources do not always have the best performance.
        • by SQLGuru ( 980662 ) on Wednesday December 21, 2016 @12:57PM (#53530627) Journal

          The idea of not hosting your own local copy is that if everyone uses a common source for the library, then the browser should already have that resource cached even if it's the first time you hit the page. For libraries popular enough that the Google CDN (or another CDN) hosts them, it's worth it.


          Now, for obscure libraries, you're better off hosting your own so that you can control the version and not worry about the referenced script changing/moving/breaking stuff.
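          A sketch of the pattern being described, as it was commonly done at the time (URLs and library version are illustrative): request the shared CDN copy first so a cross-site cache hit is possible, and fall back to a self-hosted copy if the CDN fails:

          ```html
          <!-- Shared CDN first, so the browser may already have it cached. -->
          <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
          <script>
            // window.jQuery is undefined if the CDN load failed; fall back to a local copy.
            window.jQuery || document.write('<script src="/js/jquery-1.12.4.min.js"><\/script>');
          </script>
          ```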

          • by holophrastic ( 221104 ) on Wednesday December 21, 2016 @01:20PM (#53530933)

            You're wrong. I've been doing this for three decades now. If you're serving a page to a client, sending them to a third party to get the library always sucks. Even if they've already got the library disk-cached, it's actually slower to access the disk cache, check the cache age, and verify that there isn't a newer library version (did you know the browser often goes round-trip just to check?) than it is to simply serve the library in-line.

            Serving the library yourself can still run in parallel, and it often reuses the same primary connection, so it's about as fast as your server can handle.

            Now you're going to mention the browser's memory cache, instead of disk cache. First, nowadays, with each-tab-in-a-separate-sandboxed-process, those memory caches ain't as fast as they once were. But even when they are, you just ain't a'gonna beat in-line scripts.

            Benchmark it yourself. Serve 100KB of javascript in-line, in the middle of your html file. Compare that to a separate src= js file. Mid-stream, in an HTML file of another 100KB, the javascript runs at full download speed, with full text transmission compression. Those 100KB easily compress down to 50KB, more often 25KB, mid-transmission, and at any modern residential bandwidth, you're talking about the tiniest of fractions of a second. No disk access, no file handles, no separate rendering processes, no sandboxes, and, most importantly, no virus scanners, no swapping.

            But it does eat up your server's bandwidth costs.

            If you're huge, amazon style, then you want to off-load cycles and bits anywhere you can. If you're not huge, then the added 10% bandwidth costs mean nothing to you.
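            The two variants being benchmarked, roughly (file names are illustrative):

            ```html
            <!-- Variant A: external file. A separate request, a cache lookup, and
                 often a revalidation round-trip (If-Modified-Since / ETag). -->
            <script src="/js/app.js"></script>

            <!-- Variant B: the same code inlined. No extra request; it compresses
                 together with the surrounding HTML in one gzipped response. -->
            <script>
              /* ...the same ~100KB of application JavaScript here... */
            </script>
            ```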

            • by lgw ( 121541 ) on Wednesday December 21, 2016 @01:33PM (#53531107) Journal

              You're talking about fast, as seen by the user. They don't care about that at all - they're optimizing for fast as seen by the server: less bandwidth. Anything they can offload is a win, no matter how much the user experience sucks.

            • You're wrong. I've been doing this for three decades now. If you're serving a page to a client, sending them to a third-party to get the library always sucks.

              I agree, partly for the same reasons and partly for different reasons.

              Extra lookups to remote sites suck- more time, more misdirection, etc.
              You run the risk that the remote site has been compromised and is now serving up something malicious.
              Similarly, you run the risk of MITM stuff taking place.
              If they upgrade some file and it introduces an incompatibility it may very well break something on your site.
              If the remote site is down, guess what? So is your site.

              I always host my own library files, period. GeoIP t

            • by Anonymous Coward on Wednesday December 21, 2016 @04:07PM (#53532655)

              I really don't mean to attack you personally (really, I don't) but your post is a great example of a huge and growing problem in society in general: people who are very very sure of themselves, write well, sound authoritative, but are WRONG.

              Most, if not all, of your argument is based on fast network. Far too many programmers these days just assume huge network bandwidth. They think nothing (MS) of starting a "background" update process which can try to download GIGABYTES of unwanted crap.

              News flash: not everyone has 10GB Internet, Core i7 or better CPU, 16 GB RAM, 1 TB SSD, etc.

              And stop the arrogant attitude of telling everyone they're using dinosaur computers, to "upgrade" everything, etc. By expecting everyone to be on fast Internet, etc., you're shutting out a chunk of your market. I've stopped using many major news sites because a redesign not only looks terrible (if it's readable at all) or takes so long to load it's a waste of my time. The advertisers need to know this, right?

              My poor mom bought a "smart" phone (Android) 18 months ago. Several well-meaning (over-confident, sure they know everything tech) "helped" her by installing many apps. She did not buy a huge data plan, and in 1 month was dozens of times over her data limit- just in the apps constantly 1) phoning home, and 2) updating themselves. The provider was kind enough to forgive her and she gave me the phone.

              And while I'm at it, how about an international law stating that NO website will ever be allowed to download, stream, or otherwise hog network bandwidth without the user's 100% knowledge AND permission. Example: I'm trying to read a news article, and an unwanted window starts playing a video without my request or permission.

              How about ALWAYS, with NO exception, asking the user how much bandwidth they are OK giving up? How about testing network bandwidth BEFORE doing huge downloads? And don't forget latency while you're linking nested .js from around the world.

              About 60-70% of my browsing is done on Old Opera (11.x) because you would be stunned at how much faster most websites are with javascript OFF. I know, all too well, how many websites barely work, or not at all, with javascript turned off. I've studied the code and I see they render in javascript to force you to turn it on because their trackers and ad popups need javascript.

        • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday December 21, 2016 @01:15PM (#53530885) Journal
          This would involve admitting the (arguably ugly; but undeniable) truth that web pages have become a somewhat dysfunctional software development platform; but it'd be nice to see browser caching mechanisms that don't just treat giant JS libraries as a special case of cacheable assets generally and hope for the best; and instead offered something more in the vein of what Linux package managers do.

          If you host your own; the user ends up downloading a copy even if it is identical to the one they pulled from somebody else for a different site that uses the same library. If you avail yourself of Google's...generous...hosted offer, the user is more likely to cache; but then every load of your page also involves a chat with Team Mountain View.

          Ideally, you'd specify that you use JSLibXYZ, with a cryptographic hash and/or signature specified, and at least one source for a client who doesn't have it to get a copy; but the user's browser could make use of JSLibXYZ from any source, so long as the hash and signature match, since that provides assurance that it's the same thing. Caching mechanisms based on domain of origin are sensible enough for assets that are likely to be site-specific (pictures, stylesheets, etc.); but, like it or not, giant javascript libraries, many shared across a large number of otherwise unrelated sites, have a lot more in common with DLLs than with images; and it would be handy if they were treated as a different case. With hashes and/or signatures, you could ensure integrity even in cross-site sharing situations, allow caching to work regardless of how the site operator has set up script hosting, and end the choice between "serve it yourself and lose out on caching" and "let Google do it, get caching, but add a 3rd party to everything you do".
          • by bmo ( 77928 ) on Wednesday December 21, 2016 @01:32PM (#53531071)

            >somewhat dysfunctional

            DOOM was 16MB. A whole game in 3D with full on-the-fly animation and sound effects and music. That was amazing for the time when Myst was a game that served up mostly static images.

            A web page mostly serves up text, static images, and sometimes video.

            Yet this needs 16MB to be downloaded to be functional, somehow.

            >somewhat dysfunctional

            Try completely dysfunctional

            Oh, and to the assholes who put in autoplay video that you can't turn off/pause: die. in. a. fire. Your pages actually last 5 seconds in my browser tab, and I'm sure this is also the case with people who are either on mobile or simply want things to be quiet.


            • by bmo ( 77928 ) on Wednesday December 21, 2016 @01:33PM (#53531105)

              That should have a "less than" sign in front of the "5 seconds"


            • by npslider ( 4555045 ) on Wednesday December 21, 2016 @01:44PM (#53531207)

              Just imagine every wall in that doom game filled with dynamic ads for health insurance (before you are fragged of course), places to comment on which weapon you like better (to be posted to social media)... oh and the whole game is just a conglomeration of Adobe flash videos.

            • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday December 21, 2016 @02:30PM (#53531707) Journal
              No disagreement here on the quality of craftsmanship that goes into the average webpage. On the other hand; can you imagine how...glorious...the web would be if everyone who didn't have John Carmack on staff still needed to hammer out a bunch of clever tricks in C to get a site up and running?

              It isn't at all elegant; but unless someone discovers a way to churn out good programmers with unprecedented efficiency, I can't say too many mean things about tools that let terrible programmers get merely bad results, since we appear to demand more software than we have talent to supply.
        • by mysidia ( 191772 ) on Wednesday December 21, 2016 @01:19PM (#53530911)

          It's not just ads; a lot of websites pull in JS helper scripts from other sources (instead of hosting local copies of their own). And those sources do not always have the best performance.

          What we should do is get the web standards updated to add a Rule:

          Remotely-loaded scripts (Loading from a different server) must specify a SHA256 sum in the Script tag such as
            <script src="" integrity="sha256-7d774a8ff0e73f2791c3a12dfc3ef1f9a1a640d470584b9b9222d395e8519fc5">

          If the hashcode is not specified, then the script will not be run.

          If the hashcode IS specified, then it can be cached by any 3rd party.
          The URL becomes "advisory", and popular hashcodes can be distributed by Amazon, Google, or your local ISP.

          There should be a way to have a DNS record on a domain suffix to specify local object-caching servers that can be queried by code.

          Caching is permanent. An outage of the original source server has no effect.
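          For what it's worth, this is close to the Subresource Integrity mechanism the W3C was standardizing around this time: the integrity attribute carries a base64-encoded digest (rather than hex) and pairs with crossorigin for cross-origin loads. A sketch, where the URL and digest are placeholders:

          ```html
          <!-- Subresource Integrity: "sha256-"/"sha384-"/"sha512-" plus the
               base64 digest of the exact file. URL and digest are placeholders. -->
          <script src="https://cdn.example.com/lib.min.js"
                  integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC"
                  crossorigin="anonymous"></script>
          ```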

      • by Anonymous Coward on Wednesday December 21, 2016 @01:14PM (#53530875)

        The web is unusable. When loading any website containing content intended to be read (news article, blog post, &c.), I poise my mouse over the "reader mode" button and stab it the very instant it appears.

        I will not even try to read content on pages with moving shit in the side-bars, dropping-down menu bars, random crap that unpredictably overrides scrolling, popups sliding up and over and all around the fucking place, sometimes minutes after the page finished loading because they know that's how they can irritate you into looking at it.

        If the reader mode button doesn't show up, I don't even bother trying to use the page. I just close it.

      • by Impy the Impiuos Imp ( 442658 ) on Wednesday December 21, 2016 @01:35PM (#53531117) Journal

        Chatty scripting requiring lots of back and forth. Serving up the ads first before content, trying to make sure the ads are seen. Interacting with ad customization services to eke out an additional fraction of a cent. Scripting that downloads higher quality video in a sidebar ad than the real site. Flash or other crap that runs at CPU max load rather than 99% of it, because the browser makers want that fastest script award and to hell with lockup.

    • by MouseR ( 3264 )

      Safari runs a web page as its own process for security reasons.

      Every new window starts up a process. Open the process viewer, filter on Safari, then just create empty tabs. You'll see the process list grow.

    • The amount of crap most commercial sites put into their pages is amazing. It's not only slow, but a security risk.

      A non-profit organization can set up a "K.I.S.S." standard and create a minimalist browser. If a site works fine in the KISS browser, they can place a logo on their site to advertise they are KISS-compliant (or have a KISS-compliant alternative site/page, which the KISS browser would automatically redirect to.)

      It's kind of like the concept behind Underwriters Laboratories. []

      KISS wouldn't allow JS

    • by AmiMoJo ( 196126 )

      Those add-ons are supposed to make the browser faster. I'm running Chrome on a fairly unremarkable machine with many add-ons, and new tabs open instantly, in a fraction of a second, as fast as I can hit Ctrl-T and start typing an address/search query.

      Add-ons include:
      Google Docs
      PDF Viewer
      Privacy Badger
      uBlock Origin
      + many more

      I think the questioner's machine is either crap or bogged down with malware.

    • This.

      I am shielded from within the browser so much that I have to grant temporary permission to take one step forward.

      Looking at the stuff that's trying to load, the browser is doing some heavy lifting.

      In the old days (think Netscape) the browser loaded and just sat there with no home page, even.


      Please allow me to step outside the lane here and poise to ponder how much faster and cleaner DOS would be if we had modern technology running a character-based system with no ads. In fact, Netscape [] had that featu

    • by npslider ( 4555045 ) on Wednesday December 21, 2016 @01:13PM (#53530859)

      It's like watching a 7 minute television program in the space of an hour, divided up every 5 seconds by commercial breaks... and trying to record the whole thing sans commercials on an old-school VHS deck.

    • by holophrastic ( 221104 ) on Wednesday December 21, 2016 @01:22PM (#53530949)

      Stop instructing your browser to block things. You don't need an add-on to define network connections. Start telling your network stack where not to find them:

      HOSTS-level blocking []

      And if that's not enough, look into PAC files. You won't be disappointed.
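      A minimal PAC sketch, with hypothetical ad hostnames: the browser calls FindProxyForURL for every request, so pointing unwanted hosts at a dead proxy blocks them without any add-on (the HOSTS approach does the same thing by mapping the name to 0.0.0.0):

      ```javascript
      // Minimal proxy auto-config (PAC) file. Hostnames are illustrative.
      var blocked = ["ads.example.com", "tracker.example.net"];

      function FindProxyForURL(url, host) {
        for (var i = 0; i < blocked.length; i++) {
          // exact match, or a subdomain like cdn.ads.example.com
          if (host === blocked[i] ||
              host.slice(-blocked[i].length - 1) === "." + blocked[i]) {
            return "PROXY 127.0.0.1:1"; // nothing listens here, so the request fails fast
          }
        }
        return "DIRECT";
      }
      ```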

    • by Bartles ( 1198017 ) on Wednesday December 21, 2016 @01:27PM (#53531025)

      What's the addon that removes all the socials buttons? I need that one.

  • Opera is NOT sane. (Score:5, Insightful)

    by snarfies ( 115214 ) on Wednesday December 21, 2016 @12:43PM (#53530457) Homepage

    >Opera was sane: it did not reload a tab unless you asked for it. It just reopened everything from cache

    No. That is NOT sane, normal, or desired.

    Webpages are live. If I want to look at a cached version, I'll save the webpage locally. If I'm pointing a browser at an address, I expect a current webpage. If it takes all of three seconds to give me that, I think that's okay.

    • Re: (Score:3, Informative)

      by DigiShaman ( 671371 )

      Now that they're owned by the Chinese, fuck Opera.

      Try Brave. It's very very fast, and blocks all sorts of crap. It's based on the Chromium browser []

      • Forgot to mention; Brave is far faster on iOS than Safari. None of that annoying page hijacking that mousetraps you too.

      • > It's based on the Chromium browser

        fuck that then; the only good engine was Presto back in the day. Anything else is crap when it comes to heavy workloads.

    • by tepples ( 727027 )

      Webpages are live. If I want to look at a cached version, I'll save the webpage locally.

      Which extension do you use to save locally all pages that are open in a dozen or so tabs, so that they can be reloaded back into tabs once you are ready to read them? Or do you need to save each page individually, navigating through your file system to find a folder in which to save each?

      • Which extension do you use to save locally all pages that are open in a dozen or so tabs, so that they can be reloaded back into tabs once you are ready to read them?

        Scrapbook Plus []

    • I do development where I need to look at the source code of a page, to make sure that certain elements are present. The attitude that caching the page for even a fraction of a second is "bad" has made it very difficult to do this anymore - If I ask for the source, the browser claims the page is "expired", and demands to reload it. That changes what I'm looking for, making the request invalid.

      Even trying to save the page locally requires that it be reloaded. It's become nearly impossible to check a page's ac

    • by MasseKid ( 1294554 ) on Wednesday December 21, 2016 @02:35PM (#53531765)
    I disagree; when reopening the entire browser it is a perfectly legitimate use case to go back exactly where you were. Opera has long had a feature to tell the browser to automatically reload a page every X seconds, minutes, or hours, so if your use case is that you constantly need updates on the current thing being hosted, Opera already does what you want. Now, perhaps there is a misunderstanding of what the summary is talking about in terms of reopening a tab. This only occurs when you reopen a group of saved tabs (generally from the last time you closed the browser). It does not occur if you simply type an address, click a bookmark, or use any other "normal" method of opening a link.
  • by Parker Lewis ( 999165 ) on Wednesday December 21, 2016 @12:47PM (#53530505)

    I agree with the summary. Basically, we have faster engines for rendering for HTML and JS, but the UI is really slow. IE family is the worst in this aspect.

    On Firefox (my default browser), a nice boost in tabs rendering is made turning off the tab animation: changing "browser.tabs.animate" to false in about:config.

    • by Chris Katko ( 2923353 ) on Wednesday December 21, 2016 @01:32PM (#53531087)

      The UI is not slow.

      What's slow is the fact you're loading MEGABYTE APPLICATIONS full of commands through the web that have to be processed before display.

      Go to a 90's era website with HTML3.2. Watch how magically fast everything is displayed.

      Check out the old website, "Internet Explorer is Evil": []

      Watch how magically fast the animated GIF backgrounds display. How magically instantaneous the text renders. Why? NO CSS. NO JAVASCRIPT. No 5 MB app being downloaded.

      And even then CSS and JS don't have to be slow. But people designing these websites simply don't give a shit about speed. Have you ever tried to load a Google Document on a Netbook with an Atom processor? Apparently a dual-core CPU with 2 GB of RAM isn't enough TO DISPLAY TEXT and if you try and multi-task it'll bring your computer to a complete halt. I've literally had to panic close tabs in Linux with my netbook with Google Docs open with other tabs, because if it starts getting laggy, it'll quickly get to the point it freezes the whole computer (>15 seconds between mouse moving) and it's actually faster to reboot the entire thing. (SSD ftw.)

  • by Yvan256 ( 722131 ) on Wednesday December 21, 2016 @12:48PM (#53530509) Homepage Journal

    It's web pages that are filled with useless javascript libraries that people think they still need to use for cross-browser support even though Internet Explorer is long dead.

    It's web pages that are filled with useless ads that run their own scripts, sometimes with their own libraries too, fetched from multiple servers.

    It's web pages that are filled with useless tracking scripts, sometimes with their own libraries too, fetched from multiple servers.

    It's web pages that are filled with huge animated GIFs that should be in video form instead.

    It's web pages that are filled with auto-loading, auto-playing videos, jamming our connection to download something we don't even want to see instead of downloading the web page we're trying to read.

    Disable plug-ins. Disable javascript. You'll see how fast browsers really are.

    • by Calydor ( 739835 ) on Wednesday December 21, 2016 @01:19PM (#53530921)

      I am trying to figure out what part of what you wrote (which is all true, mind you) explains why opening a new tab in Chrome or Firefox has to take as long as it does.

  • by Anonymous Coward on Wednesday December 21, 2016 @12:48PM (#53530513)

    People obviously don't care about painfully sluggish programs. Just look at the number of people who find Windows acceptable despite it being horrifically slow. If you can get away with inefficient, poor code, why spend the extra money to hire decent programmers to make something that isn't slow? There's no reason why Windows should take tens of minutes after boot before it becomes usable, or why clicking on a window should take more than a full second, but because people keep giving Microsoft piles of cash for giving us crap, they have no reason to fix it.

  • They're not (Score:5, Insightful)

    by kwerle ( 39371 ) on Wednesday December 21, 2016 @12:48PM (#53530515) Homepage Journal

    * You have too many slow addons enabled
    * You are out of RAM
    * The page is slow (big/complex)
    * Your network connection is slow/saturated
    * Their network connection is slow/saturated
    * You are out of CPU (unlikely)

    • by bbsguru ( 586178 )

      Given what we ask them to do, browsers (even crappy ones, -cough-IE-cough) are remarkably fast.

      If you are seeing 'slow' performance, try analyzing it objectively. Obviously, another machine with a clean browser environment would be a good starting point for comparison.

      As far as "hundreds of tabs", well... maybe the first step should be to

  • Browsers are fine (Score:5, Interesting)

    by sjbe ( 173966 ) on Wednesday December 21, 2016 @12:49PM (#53530525)

    Do you agree with Birman? If yes, why do you think browsers are generally slow today?

    Not really, no. Tabs open up basically instantly on the computer I'm typing this with. Doesn't matter much which browser I'm using either. I'm almost always limited either by the bandwidth of my internet connection or the slowness of the database on the other end of the line. I hear people bitching about this browser or that one being "slow" and I frankly have no idea what they are talking about. If you have an even vaguely recently built computer with reasonable hardware then it is a non-problem. I also see comparisons between the browsers which claim this one or that one is faster and I simply don't see any meaningful difference. The only difference between them to me is the user interface and what bugs I run into. If there are speed differences, they are simply too small for me to care.

    Now he did say "hundreds of tabs" which is probably the root of the problem. My guess is that he's overtaxing his computer and running out of RAM. If you have hundreds of tabs open then you are Doing It Wrong.

    I have a 2012 vintage Mac Mini and it runs just fine. Safari, Firefox, Chrome, doesn't matter. I'm typing this on a PC with an Intel i5 chip and an adequate amount of RAM and it's fine no matter what browser I use. I mostly use Firefox but I'm pretty agnostic about which browser to use as long as it works and I can block ads and trackers adequately. So no I don't think there is a problem with browser speed as a general proposition. I do think people can do stupid things that slow down browsers significantly.

    • If you have hundreds of tabs open then you are Doing It Wrong.

      Why is that "Wrong"? Because your brain doesn't work that way? That's how I've always browsed and only recently did it become terrible.

      • by Allicorn ( 175921 ) on Wednesday December 21, 2016 @01:16PM (#53530889) Homepage

        "Wrong" is harsh, I agree, but I do wonder about folks that have so many tabs open. Isn't that what bookmarks are for? You're obviously not flipping between 100+ pages in the course of a single /task/. How often do you look at the 99th least frequent of those tabs? Is that frequency worth the resource load of having the page loaded vs loading the page on demand? Is it genuinely faster to find a tiny little tab in a - presumably - rather squashed cluster of 100+ than it is to find a bookmark on a menu and have it load or am I envisioning what 100+ tabs looks like completely, heh, well, "wrong", because I don't do it myself?

        Again, not saying "wrong" at all - just saying I'm one of those users that closes every tab after looking at the page. My average tab count over the course of a day's work is probably three - so having 100+ tabs just seems unfamiliar to me.

        Also, you say it recently became difficult. That's in a particular browser? Any idea what changed?

        • by npslider ( 4555045 ) on Wednesday December 21, 2016 @01:38PM (#53531145)

          I'd say it's a very individual thing.

          Some people think in a straight line, others more like a tree branch. The 'straight liners' only need to keep a few tabs open at a time to follow the single mental thread they are following. Those who quickly branch out and multitask will inevitably have numerous tabs open.

          It's like the case of the spouse who is talking on the phone, feeding the baby, cooking dinner, and folding laundry compared to the other spouse, having a hard time focusing on the football game while reaching for another beer...

          • by Whorhay ( 1319089 ) on Wednesday December 21, 2016 @03:18PM (#53532227)

            I'm pretty sure it's more about mental focus and self discipline. Our minds don't actually let us multitask, at best we switch tasks very rapidly. But I suppose that's being pedantic.

            My wife usually has dozens of tabs open when I look at her computer. She opens articles and such that she wants to read later in a new tab and moves on. I do the same thing when perusing my news feed. The difference is that I usually go and read those articles within a couple hours and close them out. She will stay occupied with other things and end up with tabs that she opened weeks ago still hanging there. When I have an article that I just don't have time for I'll bookmark it and close the tab. Youtube actually has a feature that I love and use the hell out of, and that is the 'watch later' button.

            Neither of us focuses particularly well. I just exercise some self discipline for up to 30 seconds per evening and don't end up with a computer bogged down by dozens of extra browser processes.

        • Re:Browsers are fine (Score:4, Interesting)

          by EETech1 ( 1179269 ) on Wednesday December 21, 2016 @04:20PM (#53532743)

          When I'm verifying electrical drawings, I often have over 100 tabs open.
          Imagine a machine with 20 different sensors. First you have the data sheets for the sensors open to find the pinout. Then you have one for the cable, perhaps two to find the wire color, then one for the mating plug that connects the cable to the junction box, then one for the junction box itself, then one for the cable that connects the junction box to the PLC card terminal block, then one for the PLC card that plugs into that terminal block. You can easily get six to eight tabs open for each sensor. Granted there may be more than one sensor to each junction box, but I often will keep duplicates of those tabs open so I do not have to find the other tab when working with a specific sensor to save time, double check my part numbers, and avoid looking at the wrong one.

          I will often have multiple browsers open, one for each subsystem, with hundreds of tabs open in each.

          Admittedly I may be odd, but it works better for me than printing them all out.


      • Why is that "Wrong"? Because your brain doesn't work that way? That's how I've always browsed and only recently did it become terrible.

        It's "Wrong" in the sense that just because you can do something, it doesn't mean it is a good idea. Your brain doesn't work that way either, even if you think it does. People are demonstrably terrible at multitasking. That's why we can't talk on the phone and drive at the same time safely. You cannot possibly convince me that you are doing anything productive with most of those tabs. It's merely a form of hoarding. Opening that many tabs chews up a ton of memory, and then you complain that things get slow.

    • I run hundreds of tabs in Firefox and it runs fine. Things really only take a dump if I load tons of videos at the same time. But that is inane, so I don't expect that to work well.

      But I don't see most of the problems described either. My tabs aren't all auto-loaded when I start up Firefox; they only refresh if I tell them to, and they only load at all if I return to them. I do shut down my machine each day, though, so it's not as if they're all loaded. There's no delay when I create a new tab either; it's basically instant.

  • by sciengin ( 4278027 ) on Wednesday December 21, 2016 @12:50PM (#53530531)

    Or as this article puts it more eloquently: []

    With open source software like Firefox it is more a failure of having the right people (engineers) at the right positions (the decision making ones). Instead they are left chasing the latest widget "feature" that no one ever asked for.
    For a long time I thought that the "standard" Firefox was already bad, until I switched to my tablet 80% of the time: Firefox for Android is just plain torture. Multiple crashes every day, most often when clicking on a text field or the address bar (FF apparently hates those); other times it's idling and crashes for no reason. Not to mention its hunger for memory. Unfortunately it is the only browser there that has all the features I need via addons (mostly Adblock and NoScript). Opera is much better; unfortunately its ad blocking is faulty and it does not recognize my hardware keyboard at all.

    I remember my wikiwalks back in 2005-2007: I used to have 50-130 tabs open and nothing bad happened. That was on a Laptop from 2004 with a single core CPU. So really, it has gone downhill by orders of magnitude.

  • by jcr ( 53032 ) < .ta. .rcj.> on Wednesday December 21, 2016 @12:51PM (#53530545) Journal

    Check out the WebKit project, build it yourself, and run it under Xcode's profiling tools.


  • Look, a browser has to do a lot. A WHOLE LOT.

    1. Load page layouts, scripts, graphics, videos, & sound so that when you intentionally trigger something, your requested action happens *mostly* quickly.
    2. Sites want to earn money. That means that non-mission-related stuff (advertising) must get loaded as well. This is often worse than the simple text that most of us are actually seeking out.
    3. Weed out malicious crap. Given the sheer amount of malicious crap out there, we're asking our browsers to do a huge amount of filtering on top of everything else.
  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday December 21, 2016 @12:59PM (#53530649) Journal
    There may well be some missed opportunities that I'm in no way qualified to comment on; but at least part of the "browser performance goes to hell when you have a zillion tabs open"/"it takes ages to restore a zillion tabs on restart" trouble seems likely to be connected to how web pages have, somewhat awkwardly, evolved. They used to be pretty much pure markup with a small sprinkling of javascript, which had the advantage, both for inexpensive background storage and quick restoration, of being nearly stateless (aside from any form fields that had been filled in) and could be stored or re-rendered from cached assets at your convenience. Now they are pretty much full applications, with lots of state all over the place, and often a variety of assumptions on the server side about what the client side is doing and vice versa. You can put such a page on ice, or dump it and re-render it from cached assets; but if you do, don't be surprised if you get dumped to a login screen, or if the page ignores most of what you've saved on its behalf and starts xmlhttp-request-ing anything that might have changed while it was away.

    If you feed them retro websites, or sites designed by people with classic taste, contemporary browsers are faster than a bat out of hell with rocket boosters. That just doesn't help you against modern websites, which are both bloated (typically with slow-as-hell 3rd-party ad code, just to add injury to insult) and still somewhat awkwardly trying to move from a caching model mostly designed to keep pictures from being unnecessarily retransmitted to something suitable for letting full applications pick up where they left off.
  • It used to be that you just had to be constantly mindful of the total data footprint of your page. These days pages are much more complex and developers don't seem to give a shit about optimising. You load a page and it comes with megabytes upon megabytes of images and flash adverts; it has embedded ads that delay the loading of the page because some ad server isn't responding; you get certificate warnings because somebody at Ads-r-us forgot to renew a certificate; and pages are backed by business logic running on servers you can't see.
  • Deep stacks (Score:5, Insightful)

    by Bryan Ischo ( 893 ) * on Wednesday December 21, 2016 @01:05PM (#53530733) Homepage

    I don't know the specific code in question, but I have seen enough code to have a theory.

    Long-lived software projects that implement programs with complex features (and web browsers have an astounding array of features), especially those that interact directly with users, who put programs into a position of continually having to respond to extremely complex sets of inputs (just think of how many valid inputs there are to a web browser at any one moment), tend towards a style of implementation that can best be described as "layers upon layers upon layers of framework".

    One aspect of programming that many engineers are, in my experience, not very good at, is deciding when to simplify frameworks versus making them more complex. I think it comes from a fear that many people have of painting themselves into a corner -- we've all seen code so inflexible that it makes extending it at a later time difficult, and I think that many people respond by going too far in the opposite direction - to avoid painting themselves into a corner, they put 100 doors in every room so that there is always a way out.

    The end result is that every aspect of the complex program is designed to be extensible well beyond anything that will ever likely occur in practice. And the interactions between these complex layers of framework become so complex themselves that new layers have to be invented just to try to simplify things and allow any hope of rationally moving forward.

    So what you end up with is incredibly deep stacks of function calls for almost any action, as various extensibility layers are passed through, along with the layers that consolidate previously implemented extensibility layers into simplified layers that more directly match the actual requirements of the program.

    I have occasionally seen stack traces from programs like Firefox, and I expect Chrome is no different, and the depth of the stack at the point of a crash is always somewhat breathtaking. You may end up going 30 - 40 layers deep before you actually get to a piece of code that has a tangible effect on the state of the program.

    Now imagine that a particular user input requires running through a function that has to call out to several parts of another framework layer, and you're going to be paying the deep stack penalty multiple times.

    What you end up with is a large code base that does everything necessary, but in a way that is embellished to a nearly pathological degree, where every action takes 10 - 20 times longer than it would had that action been encoded much more directly and with far fewer framework layers.

    The advantage of such a large code base is that it has enough flexibility to rarely, if ever, require a complete redesign as new feature requirements come up.

    The disadvantage is that it will never operate as efficiently as a more directly coded program, and you get user interfaces that require executing literally billions of machine instructions just to effect fairly simple changes to its internal state machine.

    That's your web browser.
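
    The "layers upon layers" effect described above can be sketched as a toy example (all names here are invented for illustration, not taken from any real browser): the same state change made directly and through 40 forwarding layers produces an identical result, but the layered call drags dozens of extra stack frames along for the ride.

```javascript
// Toy sketch: one state change, made directly vs. through many
// "extensibility" layers. Everything here is illustrative.

// Direct version: one function that mutates state.
function setTitleDirect(state, title) {
  state.title = title;
}

// Layered version: each layer just forwards to the next, the way
// framework indirection accumulates in long-lived codebases.
function makeLayered(depth, core) {
  let fn = core;
  for (let i = 0; i < depth; i++) {
    const next = fn;
    fn = (state, title) => next(state, title); // one more stack frame
  }
  return fn;
}

const setTitleLayered = makeLayered(40, setTitleDirect);

const a = {};
const b = {};
setTitleDirect(a, "hello");
setTitleLayered(b, "hello");
console.log(a.title === b.title); // same result, ~40 extra frames deep
```

    A crash inside `setTitleDirect` in the layered version would show the 40-deep stack trace the comment describes, even though only the innermost frame does anything.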

  • I really don't know why they killed Presto, tbh ... it was faster (in real-time execution speed, which mostly means DOM performance, not useless JS benchmarks) than any other browser, and it took fewer resources. I mean, heck, my workflow (on a 256 MB machine, mind you) was opening roughly 50 to 100 forum threads at once, and it didn't slow down perceptibly. Try that on Chrome or Firefox and they die after the first 20 pages or so, even if you have multiple gigs of RAM.

  • Sorry.. but something is seriously wrong if a 2014 iMac takes even a second to open a blank tab.

    I'm sitting at a 2009 iMac that opens blank pages nearly instantly.

    I call PEBKAC

  • Firefox opens new tabs instantly, not after two seconds. If I hold down Ctrl-T it pretty much keeps up with the key repeat rate. Switching between tabs is pretty much instantaneous as well. And when you restart your browser, Firefox will only reload any given tab when you click on it, and not try to load all of them at once. In other words, isn't the problem more that the OP is using the wrong browser? Or, alternatively, is running that browser on hardware that is just too low on memory (something that no browser can fix)?

  • Stop trying to surf the web with your toaster.

    But seriously, I expect the answer to performance issues has something to do with pleasing the people who complain about browser memory usage. If you open 15 web pages with tons of graphics and videos and whatever they're going to use a lot of memory. If your computer can't handle it all, something is going to give, and a performance hit has to happen SOMEWHERE as a part of that.

    When you open that photo, you're ONLY opening the photo. Easy. When you open a web page, you're pulling in dozens of resources from servers all over the internet.

  • by Lumpy ( 12016 ) on Wednesday December 21, 2016 @01:12PM (#53530839) Homepage

    Why are most websites so badly designed and so bloated that they load slowly?

    Slashdot before 2.0 was screaming fast. Then they introduced all the JSON garbage that slowed it down. It's all about being pretty over functional.

  • by gestalt_n_pepper ( 991155 ) on Wednesday December 21, 2016 @01:22PM (#53530939)

    As long as web sites pack every possible space with bandwidth-sucking animations, and as long as refresh is used with wild abandon by incompetent web designers, browsers will appear slow.

  • by Billly Gates ( 198444 ) on Wednesday December 21, 2016 @01:33PM (#53531099) Journal

    Just saying ...

  • First Post! (Score:5, Funny)

    by PPH ( 736903 ) on Wednesday December 21, 2016 @01:51PM (#53531293)

    What do you mean, 'slow'?

  • by Miamicanes ( 730264 ) on Wednesday December 21, 2016 @02:17PM (#53531567)

    Much of the recent blame should be placed on the new universality of using SSL/TLS for everything.

    The problem is, negotiating a cold SSL/TLS handshake from scratch takes a certain amount of time... and there's very little you can do to speed it up with current standards. Adding certificate-revocation lookups compounds the problem and adds even more time. In the past, a site that "used SSL" might need to do one or two key exchanges for one or two different hosts. Now, a single page might require a dozen or more key exchanges... and some of those key exchanges might not even be known by the browser to be necessary until after it's done the first few (because some script delivered via https references some other asset via https on a different host).

    SSL/TLS itself is generally a good thing... but the WAY it's currently being used was NEVER seen as a realistic use case 20 years ago when it was first implemented. Simply put, SSL/TLS key exchange and CRL don't scale well in their current form.

    This is also part of the reason why it's so common to end up with "five bars of LTE signal strength, but no working data-connectivity" -- mobile apps are "chatty" to begin with, and many of them do a shit job of keeping https sessions alive between requests, so every single background https request requires yet another full handshake and revocation lookup. In areas where you have a cell site with limited backhaul connectivity and occasional surges in the number of users (say, a cell tower near a large state park serviced by only a T-1 line or two... on July 4th, when 60,000 people might descend on the park for a few hours), the endless key exchange traffic ITSELF can almost fully saturate the tower's backhaul capacity.
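
    The handshake arithmetic above can be sketched back-of-the-envelope style; every number here is an illustrative assumption, not a measurement:

```javascript
// Illustrative sketch: why a dozen cold TLS handshakes hurt on a slow
// link. All values are assumptions chosen for the example.
const rttMs = 100;    // assumed mobile round-trip time
const hosts = 12;     // distinct hosts each needing a key exchange
const tcpRtts = 1;    // TCP three-way handshake
const tlsRtts = 2;    // classic TLS 1.2 full handshake
const ocspRtts = 1;   // one revocation lookup per host (simplified)

const perHostMs = (tcpRtts + tlsRtts + ocspRtts) * rttMs;
const worstCaseMs = hosts * perHostMs; // if hosts are discovered serially,
                                       // as described above

console.log(perHostMs);   // 400 ms per host
console.log(worstCaseMs); // 4800 ms before any content moves
```

    Session resumption and keep-alive exist precisely to avoid paying this cost per request, which is why the comment singles out apps that fail to reuse their https sessions.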

  • by zifn4b ( 1040588 ) on Wednesday December 21, 2016 @02:54PM (#53532009)

    Ok, I am very astonished by this question, "Why are browsers so slow?", which, since you're most likely a millennial, you might as well re-phrase as "Why are browsers slow AF?"

    No offense, but you obviously have no idea what a slow browser really is. Try Netscape 6 or IE6 and let me know if you think Chrome is still slow. First of all, are you sure you can attribute all the slowness to the browser itself? Did you crack open the modern browser developer tools that we all have now (hint: Firebug and the Chrome developer tools didn't exist until a few years ago) and look at the network tab or equivalent, to make sure that the web server/REST service/whatever isn't taking a long time to serve back data? Better yet, try profiling your own JavaScript code with console.log and console.time/console.timeEnd. Since you can use an identifier, you can even time things asynchronously; how fancy is that? I seriously doubt you've done any of these exercises, because most modern browsers take JavaScript, compile it into native code, and cache it, which is about the fastest you're going to get JavaScript to run. Microsoft completely rewrote its rendering engine multiple times, the latest of which is Edge; Firefox, Chrome and Safari have all had similar efforts. I've been a web developer for a number of years and I can tell you, the stuff we have now is lightning fast.

    You try serving up a web page to Netscape Navigator using a cgi-bin perl script from an old version of Apache and let us know if you think modern tech is still slow.

    It just appalls me. No matter how much better we make technology, there is always a generation that comes along and tells us all of our efforts are crap because they think everything needs to be bigger, better, faster. If you want to complain, get engaged and make it what you think it ought to be. Until then you have no room to complain if you're just going to comment from the peanut gallery. Gah.
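
    The console.time/console.timeEnd profiling mentioned above looks roughly like this (the workload and label are invented for illustration):

```javascript
// Minimal sketch of label-based timing with console.time/timeEnd.
// The label ("render" here, an arbitrary name) lets overlapping
// or asynchronous measurements stay separate.
function work(n) {
  let s = 0;
  for (let i = 0; i < n; i++) s += i;
  return s;
}

console.time("render");     // start a named timer
const total = work(1e6);    // the code being profiled
console.timeEnd("render");  // prints something like "render: 3.2ms"

console.log(total);
```

    Because each timer is keyed by its label, you can start several at once (e.g. one per in-flight request) and end each independently when its callback fires.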

  • by WaffleMonster ( 969671 ) on Wednesday December 21, 2016 @04:52PM (#53532969)

    One consistent performance problem I see across browsers is that large tables take forever to render, even with static column attributes. You can load an Excel spreadsheet with tens of thousands of rows instantaneously.

    The same data in a table pegs a core and consumes a gigabyte or more of RAM, with delays of minutes or more. Often, as the number of rows increases, there is a huge corresponding drop-off in performance, until it becomes practically unusable.
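
    Spreadsheets stay fast on huge data sets partly by drawing only the rows currently in view, rather than materializing every row. A hypothetical sketch of that row-windowing computation (the function name and numbers are invented for illustration, not what any browser does for a plain table):

```javascript
// Hypothetical row-virtualization sketch: compute which rows fall in
// the viewport so only those get DOM nodes, instead of all of them.
function visibleRows(totalRows, rowHeight, scrollTop, viewportHeight) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight));
  const count = Math.ceil(viewportHeight / rowHeight) + 1; // +1 partial row
  const last = Math.min(totalRows, first + count);
  return { first, last }; // render rows in [first, last) only
}

// 100,000 rows at 20px each, a 600px viewport, scrolled to 10,000px:
const w = visibleRows(100000, 20, 10000, 600);
console.log(w.first, w.last); // 500 531 (31 rows instead of 100,000)
```

    A plain HTML table gets no such treatment by default, which is one plausible reason the browser builds and lays out every row up front while a spreadsheet does not.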

    Otherwise I have no real issues with browser performance. Sure, browsers like IE sometimes do the darndest things you've ever seen, including getting so slow as to be unusable when displaying large ASCII text files. That's right: not complex HTML, but text rendered as TEXT.

    Browsers are like everything else. The EE's give us incredible hardware and we go out of our way to waste it with lazy sloppy coding simply because it's cheaper not to care.

    Browsers are the same way: all kinds of hard work on fancy algorithms and optimizing performance, obliterated by "developers" who think HTML/JavaScript/CSS are "too hard" and instead insist on piling on layer upon layer, framework upon framework, widget after widget, piecemeal XMLHttpRequest after XMLHttpRequest; who fill sites with social media bugs, cross-site trackers, and ads/malware; and who use third-party bugs to get web stats because they are too lazy to install a stats package to analyze their own access logs. Some sites just enjoy selling out their customers to stalking firms, who often cross-sell those customers to other stalking firms, leading to hilarious trees of connections when accessing a single page. Yet others are completely oblivious to what is going on behind the scenes.

    Web sites written by people who care about not wasting their users' time load instantaneously. It's amazing. I've never really had issues with browsers themselves not being responsive, divorced from the bullshit occurring within them. I'm not really sure how what is being described by TFA is even possible, given that so much runs in separate processes/memory spaces these days.

  • by j2.718ff ( 2441884 ) on Wednesday December 21, 2016 @04:56PM (#53532993)

    I think I found your problem:

    Which takes ages if you have a hundred of tabs.

    I can't imagine my browser being remotely functional if I had hundreds of tabs open.

  • by Kernel Kurtz ( 182424 ) on Wednesday December 21, 2016 @06:13PM (#53533479) Homepage

    Even with no-script (or especially with it as may be the case), I try to load a page. It partially works, but needs some scripts to work properly. I tell no-script to allow all this page, but then when it reloads it wants even more scripts from more places. Tell no-script OK again, and then when it reloads, it still wants more, or some of them change.

    Sometimes takes 3 or 4 tries allowing and reloading just to get pages to render by the time I've approved the 50 other sites they want to load content from.

    Generally speaking, I try to avoid websites that go overboard this way, but it is sadly getting to be way too common. Sometimes the no-script list is so long I have to scroll through it...
