



Slashdot Asks: Why Are Browsers So Slow? (ilyabirman.net) 766
Designer Ilya Birman writes: I understand why rendering a complicated layout may be slow. Or why executing a complicated script may be slow. Actually, browsers are rather fast at doing these things. If you have studied programming and have a rough idea of how many computations are made to render a page, it is surprising that browsers can do it all that fast. But I am not talking about rendering and scripts. I am talking about everything else. Safari may take a second or two just to open a new blank tab on a 2014 iMac. And with ten or fifteen open tabs it eventually becomes sluggish as hell. Chrome is better, but not by much. What are they doing? The tabs are already open. Everything has been rendered. Why does it take more than, say, a thousandth of a second to switch between tabs or create a new one? Opening a 20-megapixel photo from disk doesn't take any noticeable amount of time; it renders instantaneously. Browsers store their stuff in memory. Why can't they just show the pixels immediately when I ask for them? [...] Unfortunately, modern browsers are so stupid that they reload all the tabs when you restart them. Which takes ages if you have a hundred tabs. Opera was sane: it did not reload a tab unless you asked for it. It just reopened everything from cache. Which took a couple of seconds. Modern browsers boast about their rendering and script-execution performance, but that's not what matters to me as a user. I just don't understand why programmers spend any time optimising for that while Chrome is laughably slow even by ten-year-old standards. Do you agree with Birman? If yes, why do you think browsers are generally slow today?
Why they are slow? (Score:5, Insightful)
Because we need a canvas-blocker add-on and a flash-blocker add-on and a cookie-warning-remover add-on and add-ons to remove all the ads on the page, and we need add-ons to remove the dozens of trackers to protect our privacy, and also add-ons that remove all the social buttons (Twitter, FB, etc.) which share our behavior even if we are not a member, and add-ons that remove all the javascripts that load popups and do unwanted refreshes, and Greasemonkey to make some pages readable and remove the ads that are inside the articles, and we need an add-on to circumvent the anti-adblockers and .....
Re:Why they are slow? (Score:5, Interesting)
Pretty much this.
I feel that so many websites have gone to insane levels on ad loading that the web has become almost unusable by some standards.
Yes, I say that with, at minimum, 12 tabs open, but most problems I have are around specific ad-revenue-driven, ad-block-loathing sites.
Re:Why they are slow? (Score:5, Interesting)
The idea of not hosting your own local copy is that if everyone uses a common source for the library, then the browser should already have that resource cached even if it's the first time you hit the page. For large enough libraries that the Google CDN (or another CDN) includes it, it's worth it.
https://developers.google.com/... [google.com]
https://www.asp.net/ajax/cdn [asp.net]
https://www.keycdn.com/support... [keycdn.com]
Now, for obscure libraries, you're better off hosting your own so that you can control the version and not worry about the referenced script changing/moving/breaking stuff.
Re:Why they are slow? (Score:5, Insightful)
You're wrong. I've been doing this for three decades now. If you're serving a page to a client, sending them to a third-party to get the library always sucks. Even if they've already got the library disk-cached, it's actually slower to access the disk cache, and check the cache age, and verify that there isn't a newer library version (did you know the browser often goes round-trip just to check?) than it does to simply serve the library in-line.
Serving the library yourself can still run in parallel, and it often reuses the same primary connection, so it's about as fast as your server can handle.
Now you're going to mention the browser's memory cache, instead of disk cache. First, nowadays, with each-tab-in-a-separate-sandboxed-process, those memory caches ain't as fast as they once were. But even when they are, you just ain't a'gonna beat in-line scripts.
Benchmark it yourself. Serve 100KB of javascript in-line, in the middle of your html file. Compare that to a separate src= js file. Mid-stream, in an HTML file of another 100KB, the javascript runs at full download speed, with full text transmission compression. Those 100KB easily compress down to 50KB, more often 25KB, mid-transmission, and at any modern residential bandwidth, you're talking about the tiniest of fractions of a second. No disk access, no file handles, no separate rendering processes, no sandboxes, and, most importantly, no virus scanners, no swapping.
But it does eat up your server's bandwidth costs.
If you're huge, amazon style, then you want to off-load cycles and bits anywhere you can. If you're not huge, then the added 10% bandwidth costs mean nothing to you.
Re:Why they are slow? (Score:5, Interesting)
You're talking about fast, as seen by the user. They don't care about that at all - they're optimizing for fast as seen by the server: less bandwidth. Anything they can offload is a win, no matter how much the user experience sucks.
Re: (Score:3)
You're wrong. I've been doing this for three decades now. If you're serving a page to a client, sending them to a third-party to get the library always sucks.
I agree, partly for the same reasons and partly for different reasons.
Extra lookups to remote sites suck- more time, more misdirection, etc.
You run the risk that the remote site has been compromised and is now serving up something malicious.
Similarly, you run the risk of MITM stuff taking place.
If they upgrade some file and it introduces an incompatibility it may very well break something on your site.
If the remote site is down, guess what? So is your site.
I always host my own library files, period. GeoIP t
Re:Why they are slow? (Score:5, Insightful)
I really don't mean to attack you personally (really, I don't) but your post is a great example of a huge and growing problem in society in general: people who are very very sure of themselves, write well, sound authoritative, but are WRONG.
Most, if not all, of your argument is based on fast network. Far too many programmers these days just assume huge network bandwidth. They think nothing (MS) of starting a "background" update process which can try to download GIGABYTES of unwanted crap.
News flash: not everyone has 10-gigabit Internet, a Core i7 or better CPU, 16 GB of RAM, a 1 TB SSD, etc.
And stop the arrogant attitude of telling everyone they're using dinosaur computers, to "upgrade" everything, etc. By expecting everyone to be on fast Internet, etc., you're shutting out a chunk of your market. I've stopped using many major news sites because after a redesign they either look terrible (if they're readable at all) or take so long to load that it's a waste of my time. The advertisers need to know this, right?
My poor mom bought a "smart" phone (Android) 18 months ago. Several well-meaning (over-confident, sure-they-know-everything) techies "helped" her by installing many apps. She did not buy a huge data plan, and in one month was dozens of times over her data limit - just from the apps constantly 1) phoning home, and 2) updating themselves. The provider was kind enough to forgive her, and she gave me the phone.
And while I'm at it, how about an international law stating that NO website will ever be allowed to download, stream, or otherwise hog network bandwidth without the user's 100% knowledge AND permission. Example: I'm trying to read a news article, and an unwanted window starts playing a video without my request or permission.
How about ALWAYS, with NO exception, asking the user how much bandwidth they are OK giving up? How about testing network bandwidth BEFORE doing huge downloads? And don't forget latency while you're linking nested .js from around the world.
About 60-70% of my browsing is done on Old Opera (11.x) because you would be stunned at how much faster most websites are with javascript OFF. I know, all too well, how many websites barely work, or not at all, with javascript turned off. I've studied the code and I see they render in javascript to force you to turn it on because their trackers and ad popups need javascript.
Re:Why they are slow? (Score:5, Informative)
My first e-mail address had a comma in it, and token rings sucked for a lot of reasons. That's how long I've been developing web-sites.
But I didn't say I've been building web-sites for three decades. I said, quite obviously, that I've been serving pages to clients for three decades.
You may be way too young to understand the difference between real-world practical and academic history, but the world wide web was not the start of the internet, the internet was not the start of networking, and networking was not the start of serving pages to clients.
Caching's been around for a very long time. If you want to learn about the benefits of caching, and the pitfalls, you want to look at archive caches, not transmission caches. When access involves an elevator, or a truck, you quickly learn what does and does not make sense.
Here's a perspective for you. When accessing a cache from a warehouse two miles away involves a truck, you get to consider the effects of rush hour traffic. So when your third-party server caches your javascript file, you get to consider that it isn't geographically in the same place as your primary server, meaning that not only does your client need to hit "another" server, with another connection, another keep-alive, and another set of caches, but it also needs to get there, through an ISP channel full of traffic. You've just doubled the amount of traffic globally. You can't ever bet on it being fast. You can only hope.
So, of course, you have a basement cache of your warehouse cache -- i.e. the disk cache. Elevators don't have traffic; or so you say. But the warehouse is full -- always, because that's what a warehouse is. So you get to search stacks and shelves and indexes. You get to have organizational training, and inventory days. Welcome to the magic of disk thrashing. The file table is incredibly slow in terms of file handles. Welcome to WAD files. Defragging is the inventory day, by the way.
So now you've got your reliable cache, that's only reliable under minimal load. Perfect. But some things get accessed often. So you keep a copy upstairs in the filing cabinet. But the filing cabinet is small. So you shuttle different cabinets up and down per day. Today you need the green cabinet. Tomorrow, the red. And life is good. Welcome to swap files -- memory to disk and back.
Now you've got employees shuttling cabinets, with dollies, and elevators, and warehouses, and trucks, and traffic. And here's the kicker. You haven't eaten yet! You've got a dozen staff, countless duplicate copies of files, trucks, buildings, elevators, desks. But you haven't done a lick of revenue-generating work yet.
Re: (Score:3)
Google doesn't care how long it takes your page to load. It doesn't know how important the page is, so it could never judge two pages based on size alone. What google very-much does care about is how long it takes your page to render "something". Put your scripts at the end of the file. Output your logo very quickly, before you do any real server-side work. I promise the top one inch of your web-site can be the very same all the time.
Re:Why they are slow? (Score:5, Interesting)
Dell machines are usually designed to be quiet. That's why things are slow. But that's a different story.
You aren't working cradle-to-grave. Hitting the local disk isn't about the disk speed. Those 15 ms can be zero for this discussion. Similarly, the ping to the site of 125 ms is meaningless. Here's what happens, cradle-to-grave:
Scenario 1, live streaming:
connect to site: 125 ms
download 200KB of html and javascript, compressed to 100KB, 1s
Scenario 2, external file:
connect to site: 125ms
download 100KB of html, compressed to 50KB, 1s
download 100KB of javascript, compressed to 50KB, 1s
Scenario 3, external file cached:
connect to site: 125ms
download 100KB of html, compressed to 50KB, 1s
wait for disk to be available, access disk for 100KB of javascript, wait for disk to spin up, 15ms
wait for virus scan, read cache meta data, determine that our cache file isn't already too old, 1s
HEAD the web-site file, to compare meta data, check to see that our cache is good enough, 1s
-- look at all of the side-work involved, and we're still hitting the site much of the time to check the cache anyway!
Scenario 4, external site
connect to site: 125ms
download 100KB of html, compressed to 50KB, 1s
connect to second site: 125ms
download 100KB of javascript, compressed to 50KB, 1s
Scenario 5, external site, cached
connect to site: 125ms
download 100KB of html, compressed to 50KB, 1s
wait for disk to be available, access disk for 100KB of javascript, wait for disk to spin up, 15ms
wait for virus scan, read cache meta data, determine that our cache file isn't already too old, 1s
connect to second site: 125ms
HEAD the web-site file, to compare meta data, check to see that our cache is good enough, 1s
-- now we have side work, a second connection, traffic everywhere
And all of this gets even worse when you realize that the connection to your primary site is keep-alived for your entire visit -- minutes at a time -- but to your secondary site, you're lucky to get a full second. Add all of the other hardware that you're using, in terms of disk, cpu, virus scanners. Now add the entire local permission system, cooling system, throttling systems, battery savers, and you're now conflicting with everything else running in the background.
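Summing the figures in the scenarios as given (the post's illustrative numbers, ignoring keep-alive reuse), a quick sketch shows why the in-line case comes out ahead:

```python
# Rough totals for the five scenarios above, in seconds, using the
# figures exactly as quoted in the post; these are not measurements.
scenarios = {
    "1 inline streaming":    [0.125, 1.0],
    "2 external file":       [0.125, 1.0, 1.0],
    "3 external, cached":    [0.125, 1.0, 0.015, 1.0, 1.0],
    "4 external site":       [0.125, 1.0, 0.125, 1.0],
    "5 external site+cache": [0.125, 1.0, 0.015, 1.0, 0.125, 1.0],
}
for name, parts in scenarios.items():
    print(f"{name}: {sum(parts):.3f} s")
```

On these numbers the "cached" scenarios are the slowest of all, because the revalidation round trips cost more than re-downloading the bytes in-line.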
Re: (Score:3)
You also [have] no control over transparent caching proxy servers used by your ISP,
Yes, you do. It's called https.
Far-future Expires: header (Score:3)
What you are missing is that even if the browser has the library cached, it still must reach out to the server to see if it has the latest version
Not if the version number is in the URL. Then the CDN can serve the library with an Expires: header with a value years in the future.
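Since a versioned URL never changes content, the response can be cached effectively forever. A sketch of generating such headers (the path in the comment is hypothetical; Expires requires the RFC HTTP-date format):

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

# For a versioned URL like /js/mylib-1.12.4.min.js (hypothetical), emit a
# far-future Expires header plus the equivalent modern Cache-Control
# directive, so the browser never revalidates.
expires = datetime.now(timezone.utc) + timedelta(days=365)
print("Expires:", format_datetime(expires, usegmt=True))
print("Cache-Control: public, max-age=31536000, immutable")
```

When the library is updated, the version in the URL changes, so the stale cached copy is simply never requested again.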
Re:Why they are slow? (Score:5, Interesting)
If you host your own, the user ends up downloading a copy even if it is identical to the one they pulled from somebody else for a different site that uses the same library. If you avail yourself of Google's...generous...hosted offer, the user is more likely to hit the cache; but then every load of your page also involves a chat with Team Mountain View.
Ideally, you'd specify that you use JSLibXYZ, with a cryptographic hash and/or signature specified, and at least one source where a client who doesn't have it can get a copy; but the user's browser could make use of JSLibXYZ from any source, so long as the hash and signature match, since that provides assurance that it's the same thing. Caching mechanisms based on domain of origin are sensible enough for assets that are likely to be site-specific (pictures, stylesheets, etc.); but, like it or not, giant javascript libraries, many shared across a large number of otherwise unrelated sites, have a lot more in common with DLLs than with images, and it would be handy if they were treated as a different case. With hashes and/or signatures, you could ensure integrity even in cross-site sharing situations, allow caching to work regardless of how the site operator has set up script hosting, and end the choice between 'serve it yourself, lose out on caching' and 'let Google do it, get caching but add a 3rd party to everything you do'.
Re:Why they are slow? (Score:5, Insightful)
>somewhat dysfunctional
DOOM was 16MB. A whole game in 3D with full on-the-fly animation and sound effects and music. That was amazing for the time when Myst was a game that served up mostly static images.
A web page mostly serves up text, static images, and sometimes video.
Yet this needs 16MB to be downloaded to be functional, somehow.
>somewhat dysfunctional
Try completely dysfunctional
Oh, and to the assholes who put in autoplay video that you can't turn off/pause: die. in. a. fire. Your pages actually last 5 seconds in my browser tab, and I'm sure that's also the case with people who are either on mobile or simply want quiet.
--
BMO
Re:Why they are slow? (Score:4, Informative)
That should have a "less than" sign in front of the "5 seconds"
--
BMO
Re:Why they are slow? (Score:5, Funny)
Just imagine every wall in that doom game filled with dynamic ads for health insurance (before you are fragged of course), places to comment on which weapon you like better (to be posted to social media)... oh and the whole game is just a conglomeration of Adobe flash videos.
Re: (Score:3)
*runs away screaming*
--
BMO
Re:Why they are slow? (Score:4, Insightful)
It isn't at all elegant; but unless someone discovers a way to churn out good programmers with unprecedented efficiency, I can't say too many mean things about tools that let terrible programmers get merely bad results, since we appear to demand more software than we have talent to supply.
Re:Why they are slow? (Score:5, Interesting)
It's not just ads; a lot of websites pull in JS helper scripts from other sources (instead of hosting local copies of their own). And those sources do not always have the best performance.
What we should do is get the web standards updated to add a Rule:
Remotely-loaded scripts (loaded from a different server) must specify a SHA256 sum in the script tag, such as:
<script src="https://example.com/mycode.js" integrity="sha256-7d774a8ff0e73f2791c3a12dfc3ef1f9a1a640d470584b9b9222d395e8519fc5">
If the hashcode is not specified, then the script will not be run.
If the hashcode IS specified, then it can be cached by any 3rd party.
The URL becomes advisory, and popular hashcodes can be distributed by Amazon, Google, or your local ISP.
There should be a way to use a DNS record under the local DNS suffix to specify local object-caching servers that can be queried by code.
Caching is permanent. An outage of the original source server has no effect.
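This proposal is close to what browsers later standardized as Subresource Integrity, except that SRI's integrity attribute carries a base64-encoded digest rather than hex. Computing such a value is a few lines (the script body below is made up):

```python
import base64
import hashlib

# Compute an SRI-style integrity value for a script body. Real SRI uses
# the base64-encoded raw digest, prefixed with the algorithm name.
body = b"console.log('hello');"   # hypothetical contents of mycode.js
token = "sha256-" + base64.b64encode(hashlib.sha256(body).digest()).decode()
print(token)
```

Any cache, local or third-party, can verify the same token against the bytes it holds, which is exactly what makes the URL advisory.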
Re:Why they are slow? (Score:5, Informative)
Using a good, fast third party source
LOL
I see what you did there.
Also, I just clicked on AdBlock Plus --> Open Blockable Items. There are 31 scripts on this page.
31 MOTHERFUCKING SCRIPTS ON ONE PAGE!!! You people are insane.
Re:Why they are slow? (Score:5, Informative)
41 MOTHERFUCKING REDIRECTS ON ONE PAGE!!! You people are insane.
Re:Why they are slow? (Score:4, Informative)
The web is unusable. When loading any website containing content intended to be read (news article, blog post, &c.), I poise my mouse over the "reader mode" button and stab it the very instant it appears.
I will not even try to read content on pages with moving shit in the side-bars, dropping-down menu bars, random crap that unpredictably overrides scrolling, popups sliding up and over and all around the fucking place, sometimes minutes after the page finished loading because they know that's how they can irritate you into looking at it.
If the reader mode button doesn't show up, I don't even bother trying to use the page. I just close it.
Re:Why they are slow? (Score:5, Insightful)
Chatty scripting requiring lots of back and forth. Serving up the ads first before content, trying to make sure the ads are seen. Interacting with ad customization services to eke out an additional fraction of a cent. Scripting that downloads higher quality video in a sidebar ad than the real site. Flash or other crap that runs at CPU max load rather than 99% of it, because the browser makers want that fastest script award and to hell with lockup.
Inability to RTFA (Score:3)
And then end up getting modded down for making uninformed comments on account of not having read the featured article. Several sites admit that they can't tell tracking blocking (such as privacy.trackingprotection.enabled [mozilla.org] which uses Disconnect's list) from ad blocking, such as WIRED, The Atlantic, and the INQUIRER, and sites like Slashdot continue to link to stories on these sites.
Re: (Score:2)
Safari runs each web page as its own process for security reasons.
Every new window starts up a process. Open the process viewer, filter on Safari, then just create empty tabs. You'll see the process list grow.
Re: (Score:2)
But how much memory can these processes share, so that duplicate content doesn't spill out of RAM and into swap? Reference counting turns copy-on-write into copy-on-read, as Benjamin Peterson explains in a talk about CPython's garbage collector [youtube.com].
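The copy-on-write-into-copy-on-read point is easy to see from CPython itself: merely creating a new reference rewrites the refcount field in an object's header, which after a fork() dirties the shared page the object lives on. A minimal sketch:

```python
import sys

# Merely taking a new reference to an object mutates its header
# (the refcount), even though we never "write" to the object's data.
x = []
before = sys.getrefcount(x)   # x itself + the getrefcount argument
y = x                         # a pure "read" of x... that bumps its refcount
after = sys.getrefcount(x)
print(after - before)         # 1
```

So in a refcounted runtime, pages full of objects get dirtied just by being looked at, and the memory the forked processes "share" quietly stops being shared.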
Re: (Score:3)
Bingo. It's actually well under a millisecond for fork + exit on current hardware. More when you add exec including dynamic linking, but still on the order of milliseconds. Much, much faster than you could blink your eyes. It is completely specious to suggest that thread-per-tab or process-per-tab has any detectable effect whatever on real-world performance.
Re:Why they are slow? (Score:5, Informative)
OK, I just benchmarked 1000 forks in C++ on the lowest-end desktop Sandy Bridge Core i3-2120T running at a reduced frequency of 1.6 GHz. I got 89,130 microseconds total: a whopping 89.13 microseconds per fork.
#include <chrono>
#include <iostream>
#include <sys/wait.h>
#include <unistd.h>
#define N 1000
int main()
{
auto start = std::chrono::system_clock::now();
for (int i = 0; i < N; ++i)
{
pid_t pid = fork();
if (pid == 0) {
_exit(0); // child: exit immediately, without flushing the parent's stdio buffers
}
}
auto stop = std::chrono::system_clock::now();
while (wait(nullptr) > 0) {} // reap the children so none linger as zombies
auto elapsed = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
std::cout << elapsed.count() << " microseconds for " << N << " forks\n";
return 0;
}
Re:Why they are slow? (Score:4, Informative)
Forking a non-trivial program may incur more practical lag than forking a trivial program, as more copy-on-write pages have to get duplicated once the forked process starts running.
Re: (Score:3)
fork() exploits the MMU by marking every page in the parent process as readonly, then sharing these pages with the child process. There is no immediate copying of memory.
That's not quite true: at the very least, the top stack frame will be copied immediately. That set of MMU updates also isn't cheap on a modern multicore system (you have to flush all of the relevant pages from all of the TLBs). More importantly, there's a lot of in-kernel state for a process (e.g. file descriptor tables) that gets duplicated when you fork a process.
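Kernel-state duplication aside, the basic fork semantics the thread keeps leaning on, each process getting its own copy-on-write view of memory, can be seen in a few lines (a POSIX-only sketch, using Python's os.fork rather than the C API):

```python
import os

# After fork(), parent and child each have their own (copy-on-write) view
# of memory: the child's writes never become visible to the parent.
# A pipe carries the child's answer back.
data = [1, 2, 3]
r, w = os.pipe()
pid = os.fork()
if pid == 0:
    data.append(4)                        # mutates the child's copy only
    os.write(w, str(len(data)).encode())
    os._exit(0)                           # exit without flushing parent buffers
os.waitpid(pid, 0)                        # reap the child (no zombie)
child_len = int(os.read(r, 16))
print(child_len, len(data))               # child saw 4 items; parent still sees 3
```

The first write to any shared page is what triggers the actual copy, which is why a fat process forks more slowly in practice than the bare syscall benchmark suggests.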
KISS Compliant [Re:Why they are slow?] (Score:2)
The amount of crap most commercial sites put into their pages is amazing. It's not only slow, but a security risk.
A non-profit organization can set up a "K.I.S.S." standard and create a minimalist browser. If a site works fine in the KISS browser, they can place a logo on their site to advertise they are KISS-compliant (or have a KISS-compliant alternative site/page, which the KISS browser would automatically redirect to.)
It's kind of like the concept behind Underwriters Laboratories. [wikipedia.org]
KISS wouldn't allow JS
Re: (Score:2)
Those add-ons are supposed to make the browser faster. I'm running Chrome on a fairly unremarkable machine with many add-ons, and new tabs open instantly - a fraction of a second, as fast as I can hit Ctrl-T and start typing an address/search query.
Add-ons include:
Feedly
Google Docs
PDF Viewer
Privacy Badger
Rikaikun
SmoothScroll
Tampermonkey
uBlock Origin
+ many more
I think the questioner's machine is either crap or bogged down with malware.
Re: (Score:2)
This.
I am shielded from within the browser so much that I have to grant temporary permission to take one step forward.
Looking at the stuff that's trying to load, the browser is doing some heavy lifting.
In the old days (think Netscape) the browser loaded and just sat there with no home page, even.
-
Please allow me to step outside the lane here and pause to ponder how much faster and cleaner DOS would be if we had modern technology running a character-based system with no ads. In fact, Netscape [lctn.com] had that featu
Re:Why they are slow? (Score:5, Insightful)
It's like watching a 7-minute television program in the space of an hour, divided up every 5 seconds by commercial breaks... and trying to record the whole thing sans commercials on an old-school VHS deck.
Re:Why they are slow? (Score:5, Informative)
Stop instructing your browser to block things. You don't need an add-on to deny network connections. Start telling your network stack where not to find them:
HOSTS-level blocking
http://winhelp2002.mvps.org/ho... [mvps.org]
And if that's not enough, look into PAC files. You won't be disappointed.
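A hosts-file entry simply maps an unwanted hostname to a non-routable address, so the connection dies before the browser ever tries. A minimal sketch (hostnames invented for illustration; the MVPS file linked above is a real, maintained list):

```
0.0.0.0 ads.example.com
0.0.0.0 tracker.example.net
0.0.0.0 social-buttons.example.org
```

Because the lookup fails at the OS level, every browser and every app on the machine is covered at once, with no per-page filtering cost.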
Re:Why they are slow? (Score:5, Funny)
Huh - this didn't summon APK. Maybe he's taking a week off for Christmas.
Re: (Score:3)
He might just be submitting his latest $100 forum post [slashdot.org]. That $100 he got 8 years ago has stretched far enough, daddy needs a new sock.
Re:Why they are slow? (Score:5, Funny)
What's the addon that removes all the social buttons? I need that one.
Opera is NOT sane. (Score:5, Insightful)
>Opera was sane: it did not reload a tab unless you asked for it. It just reopened everything from cache
No. That is NOT sane, normal, or desired.
Webpages are live. If I want to look at a cached version, I'll save the webpage locally. If I'm pointing a browser at an address, I expect a current webpage. If it takes all of three seconds to give me that, I think that's okay.
Re: (Score:2)
Webpages are live. If I want to look at a cached version, I'll save the webpage locally.
Which extension do you use to save locally all pages that are open in a dozen or so tabs, so that they can be reloaded back into tabs once you are ready to read them? Or do you need to save each page individually, navigating through your file system to find a folder in which to save each?
Re: (Score:2)
Which extension do you use to save locally all pages that are open in a dozen or so tabs, so that they can be reloaded back into tabs once you are ready to read them?
Scrapbook Plus [mozilla.org]
Sometimes, web pages should NOT be reloaded (Score:2)
I do development where I need to look at the source code of a page, to make sure that certain elements are present. The attitude that caching the page for even a fraction of a second is "bad" has made it very difficult to do this anymore - If I ask for the source, the browser claims the page is "expired", and demands to reload it. That changes what I'm looking for, making the request invalid.
Even trying to save the page locally requires that it be reloaded. It's become nearly impossible to check a page's ac
Re: (Score:2)
Did you try right-clicking the Back button to see a list of steps that you can skip?
This totally breaks the rendering speed. (Score:5, Informative)
I agree with the summary. Basically, we have faster engines for rendering HTML and JS, but the UI is really slow. The IE family is the worst in this respect.
On Firefox (my default browser), a nice boost in tab rendering comes from turning off the tab animation: change "browser.tabs.animate" to false in about:config.
Re:This totally breaks the rendering speed. (Score:5, Interesting)
The UI is not slow.
What's slow is the fact you're loading MEGABYTE APPLICATIONS full of commands through the web that have to be processed before display.
Go to a 90's era website with HTML3.2. Watch how magically fast everything is displayed.
Check out the old website, "Internet Explorer is Evil":
http://toastytech.com/evil/ [toastytech.com]
Watch how magically fast the animated GIF backgrounds display. How magically instantaneous the text renders. Why? NO CSS. NO JAVASCRIPT. No 5 MB app being downloaded.
And even then, CSS and JS don't have to be slow. But the people designing these websites simply don't give a shit about speed. Have you ever tried to load a Google Document on a netbook with an Atom processor? Apparently a dual-core CPU with 2 GB of RAM isn't enough TO DISPLAY TEXT, and if you try to multi-task it'll bring your computer to a complete halt. I've literally had to panic-close tabs in Linux on my netbook with Google Docs open alongside other tabs, because once it starts getting laggy, it quickly gets to the point where it freezes the whole computer (>15 seconds between mouse moves) and it's actually faster to reboot the entire thing. (SSD ftw.)
Browsers are NOT slow (Score:5, Informative)
It's web pages that are filled with useless javascript libraries that people think they still need to use for cross-browser support even though Internet Explorer is long dead.
It's web pages that are filled with useless ads that run their own scripts, sometimes with their own libraries too, fetched from multiple servers.
It's web pages that are filled with useless tracking scripts, sometimes with their own libraries too, fetched from multiple servers.
It's web pages that are filled with huge animated GIFs that should be in video form instead.
It's web pages that are filled with auto-loading, auto-playing videos, jamming our connection to download something we don't even want to see instead of downloading the web page we're trying to read.
Disable plug-ins. Disable javascript. You'll see how fast browsers really are.
Re:Browsers are NOT slow (Score:5, Insightful)
I am trying to figure out what part of what you wrote (which is all true, mind you) explains why opening a new tab in Chrome or Firefox has to take as long as it does.
Re: (Score:3)
Opening the tab is also more or less instant for me.
Then a couple of seconds later the search bar appears.
Three to five seconds later the snapshots of the most commonly visited websites.
I have no idea how displaying one large image, a text input field and eight (presumably locally saved) thumbnails can take upwards of ten seconds.
Why make them faster when most people... (Score:4, Insightful)
obviously don't care about painfully sluggish programs. Just look at the number of people who find Windows acceptable despite it being horrifically slow. If you can get away with inefficient, poor code, why spend the extra money to hire decent programmers to make something that isn't slow? There's no reason why Windows should take tens of minutes after boot before it becomes usable, or why clicking on a window should usually take more than a full second, but because people keep giving Microsoft piles of cash for giving us crap, they have no reason to fix it.
They're not (Score:5, Insightful)
Possibilities:
* You have too many slow addons enabled
* You are out of RAM
* The page is slow (big/complex)
* Your network connection is slow/saturated
* Their network connection is slow/saturated
* You are out of CPU (unlikely)
Re: (Score:2)
Given what we ask them to do, browsers (even crappy ones, -cough-IE-cough) are remarkably fast.
If you are seeing 'slow' performance, try analyzing it objectively. Obviously, another machine with a clean browser environment would be a good starting point for comparison.
As far as "hundreds of tabs", well... maybe the first step should be to
Get.
A.
Clue.
Browsers are fine (Score:5, Interesting)
Do you agree with Birman? If yes, why do you think browsers are generally slow today?
Not really, no. Tabs open basically instantly on the computer I'm typing this with. It doesn't matter much which browser I'm using, either. I'm almost always limited either by the bandwidth of my internet connection or by the slowness of the database on the other end of the line. I hear people bitching about this browser or that one being "slow" and I frankly have no idea what they are talking about. If you have an even vaguely recently built computer with reasonable hardware, then it is a non-problem. I also see comparisons between browsers which claim this one or that one is faster, and I simply don't see any meaningful difference. The only difference between them to me is the user interface and which bugs I run into. If there are speed differences, they are simply too small for me to care.
Now he did say "hundreds of tabs" which is probably the root of the problem. My guess is that he's overtaxing his computer and running out of RAM. If you have hundreds of tabs open then you are Doing It Wrong.
I have a 2012-vintage Mac Mini and it runs just fine. Safari, Firefox, Chrome, doesn't matter. I'm typing this on a PC with an Intel i5 chip and an adequate amount of RAM, and it's fine no matter what browser I use. I mostly use Firefox, but I'm pretty agnostic about which browser to use as long as it works and I can block ads and trackers adequately. So no, I don't think there is a problem with browser speed as a general proposition. I do think people can do stupid things that slow browsers down significantly.
Re: (Score:2)
If you have hundreds of tabs open then you are Doing It Wrong.
Why is that "Wrong"? Because your brain doesn't work that way? That's how I've always browsed and only recently did it become terrible.
Re:Browsers are fine (Score:5, Insightful)
"Wrong" is harsh, I agree, but I do wonder about folks who have so many tabs open. Isn't that what bookmarks are for? You're obviously not flipping between 100+ pages in the course of a single /task/. How often do you look at the 99th least-visited of those tabs? Is that frequency worth the resource load of keeping the page loaded versus loading it on demand? Is it genuinely faster to find a tiny tab in a, presumably, rather squashed cluster of 100+ than to find a bookmark on a menu and let it load? Or am I envisioning what 100+ tabs looks like completely, heh, well, "wrong", because I don't do it myself?
Again, not saying "wrong" at all - just saying I'm one of those users that closes every tab after looking at the page. My average tab count over the course of a day's work is probably three - so having 100+ tabs just seems unfamiliar to me.
Also, you say it recently became difficult. That's in a particular browser? Any idea what changed?
Re:Browsers are fine (Score:5, Insightful)
I'd say it's a very individual thing.
Some people think in a straight line, others more like a tree branch. The 'straight liners' only need to keep a few tabs open at a time to follow the single mental thread they are following. Those who quickly branch out and multitask will inevitably have numerous tabs open.
It's like the case of the spouse who is talking on the phone, feeding the baby, cooking dinner, and folding laundry compared to the other spouse, having a hard time focusing on the football game while reaching for another beer...
Re:Browsers are fine (Score:4, Insightful)
I'm pretty sure it's more about mental focus and self discipline. Our minds don't actually let us multitask, at best we switch tasks very rapidly. But I suppose that's being pedantic.
My wife usually has dozens of tabs open when I look at her computer. She opens articles and such that she wants to read later in a new tab and moves on. I do the same thing when perusing my news feed. The difference is that I usually go and read those articles within a couple hours and close them out. She will stay occupied with other things and end up with tabs that she opened weeks ago still hanging there. When I have an article that I just don't have time for I'll bookmark it and close the tab. Youtube actually has a feature that I love and use the hell out of, and that is the 'watch later' button.
Neither of us focuses particularly well. I just exercise some self discipline for up to 30 seconds per evening and don't end up with a computer bogged down by dozens of extra browser processes.
Re:Browsers are fine (Score:4, Interesting)
When I'm verifying electrical drawings, I often have over 100 tabs open.
Imagine a machine with 20 different sensors. First you have the data sheets for the sensors open to find the pinout. Then you have one for the cable, perhaps two to find the wire color, then one for the mating plug that connects the cable to the junction box, then one for the junction box itself, then one for the cable that connects the junction box to the PLC card terminal block, then one for the PLC card that plugs into that terminal block. You can easily get six to eight tabs open for each sensor. Granted, there may be more than one sensor per junction box, but I often keep duplicates of those tabs open so I do not have to hunt for the other tab when working with a specific sensor; it saves time, lets me double-check my part numbers, and keeps me from looking at the wrong one.
I will often have multiple browsers open, one for each subsystem, with hundreds of tabs open in each.
Admittedly I may be odd, but it works better for me than printing them all out.
Cheers.
People are crap at multitasking (Score:3)
Why is that "Wrong"? Because your brain doesn't work that way? That's how I've always browsed and only recently did it become terrible.
It's "Wrong" in the sense that just because you can do something it doesn't mean it is a good idea. Your brain doesn't work that way either even if you think it does. People are demonstrably terrible at multitasking. That's why we can't talk on the phone and drive at the same time safely. You cannot possibly convince me that you are doing anything productive with most of those tabs. It's merely a form of hoarding. Opening that many tabs chews up a ton of memory and then you complain that things get sl
Re: (Score:3)
engine to the redline before every shift
And yet in some use cases that's the desired behavior.
complaining that your car overheats is wrong
And Michael Schumacher would have good grounds to go back to his engineering team and tell them they screwed up.
But I repeat myself. Why is it "wrong"? Because you don't use your tools the way others do doesn't mean that either case is wrong. It means that some tools are poorly designed for some use cases.
Re: (Score:2)
I run hundreds of tabs in Firefox and it runs fine. Things really only take a dump if I load tons of videos at the same time, but that is insane, so I don't expect it to work well.
But I don't see most of the problems described either. My tabs aren't all auto-loaded when I start up Firefox; they only refresh if I tell them to, and they only load at all if I return to them. I do shut down my machine each day, though, so it's not like they're all loaded. There's no delay when I create a new tab view, its basic
Form over Function, that's why (Score:5, Informative)
Or as this article puts it more eloquently:
https://medium.com/@eshan/the-... [medium.com]
With open source software like Firefox it is more a failure of having the right people (engineers) at the right positions (the decision making ones). Instead they are left chasing the latest widget "feature" that no one ever asked for.
For a long time I thought that the "standard" Firefox was already bad, until I switched to using my tablet 80% of the time: Firefox for Android is just plain torture. Multiple crashes every day. Most of the time it happens when clicking on a text field or the address bar, which FF apparently hates; other times it's idling and crashes for no reason. Not to mention its hunger for memory. Unfortunately, it is the only browser there that has all the features I need via add-ons (mostly Adblock and NoScript). Opera is much better; unfortunately, its ad blocking is faulty and it does not recognize my hardware keyboard at all.
I remember my wikiwalks back in 2005-2007: I used to have 50-130 tabs open and nothing bad happened. That was on a Laptop from 2004 with a single core CPU. So really, it has gone downhill by orders of magnitude.
Comment removed (Score:5, Insightful)
Sorry, Tech is not Magic (Score:2)
1. Load page layouts, scripts, graphics, videos, & sound so that when you intentionally trigger something, your requested action happens *mostly* quickly.
2. Sites want to earn money. That means that non-mission-related stuff (advertising) must get loaded as well. This is often worse than the simple text that most of us are actually seeking out.
3. Weed out malicious crap. Given the sheer amount of malicious crap out there, we're asking our browsers to do 9
Awkwardly repurposed... (Score:3)
If you feed them retro websites, or sites designed by people with classic taste, contemporary browsers are faster than a bat out of hell with rocket boosters. That just doesn't help you against modern websites; which are both bloated(typically with slow-as-hell 3rd party ad code, just to add injury to insult); and still somewhat awkwardly trying to move from a caching model mostly designed to keep pictures from being unnecessarily retransmitted to something suitable for letting full applications pick up where they left off.
Are browsers slow or pages/sites bloated? (Score:2)
Deep stacks (Score:5, Insightful)
I don't know the specific code in question, but I have seen enough code to have a theory.
Long-lived software projects that implement complex features (and web browsers have an astounding array of features) tend toward a style of implementation best described as "layers upon layers upon layers of framework". This is especially true of programs that interact directly with users, which are continually having to respond to extremely complex sets of inputs -- just think of how many valid inputs there are to a web browser at any one moment.
One thing many engineers are, in my experience, not very good at is deciding when to simplify frameworks versus making them more complex. I think it comes from a fear of painting themselves into a corner -- we've all seen code so inflexible that extending it later is difficult, and I think many people respond by going too far in the opposite direction: to avoid painting themselves into a corner, they put 100 doors in every room so that there is always a way out.
The end result is that every aspect of the complex program is designed to be extensible well beyond anything that will ever likely occur in practice. And the interactions between these complex layers of framework become so complex themselves that new layers have to be invented just to try to simplify things and allow any hope of rationally moving forward.
So what you end up with is incredibly deep stacks of function calls for almost any action, as various extensibility layers are passed through, along with the layers that consolidate previously implemented extensibility layers into simplified layers that more directly match the actual requirements of the program.
I have occasionally seen stack traces from programs like Firefox (and I expect Chrome is no different), and the depth of the stack at the point of a crash is always somewhat breathtaking. You may go 30-40 layers deep before you reach a piece of code that has a tangible effect on the state of the program.
Now imagine that a particular user input requires running through a function that has to call out to several parts of another framework layer, and you're going to be paying the deep stack penalty multiple times.
What you end up with is a large code base that does everything necessary, but in a way that is embellished to a nearly pathological degree, where every action takes 10 - 20 times longer than it would had that action been encoded much more directly and with far fewer framework layers.
The advantage of such a large code base is that it has enough flexibility to rarely, if ever, require a complete redesign as new feature requirements come up.
The disadvantage is that it will never operate as efficiently as a more directly coded program, and you get user interfaces that require executing literally billions of machine instructions just to effect fairly simple changes to its internal state machine.
That's your web browser.
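The parent's "deep stacks" point can be sketched in a few lines (a toy illustration of the pattern, not actual browser code; all names here are invented):

```javascript
// Toy illustration of "layers upon layers": each framework layer wraps the
// next in an extra function call, so the real work sits at the bottom of a
// deep stack, reachable only through dozens of indirections.
function makeLayered(depth, work) {
  let fn = work;
  for (let i = 0; i < depth; i++) {
    const next = fn;
    // Each layer pretends to offer an extension point it never actually uses,
    // paying a call and an object allocation on every single invocation.
    fn = (event) => next({ ...event, layer: i });
  }
  return fn;
}

const direct = (event) => event.x + 1;   // the actual state change
const layered = makeLayered(40, direct); // 40 layers deep, like a real stack trace

// Both compute the same thing; the layered one just burns stack and allocations.
console.log(direct({ x: 1 }));  // 2
console.log(layered({ x: 1 })); // 2
```

Multiply that per-call overhead by the billions of instructions mentioned above and the sluggishness stops being mysterious.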
Re: (Score:2)
A second or two to open a blank page?! (Score:2)
Sorry, but something is seriously wrong if a 2014 iMac takes even a second to open a blank tab.
I'm sitting at a 2009 iMac that opens blank pages nearly instantly.
I call PEBKAC
Try Firefox (Score:2)
Firefox opens new tabs instantly, not after a second or two. If I hold down Ctrl-T, it pretty much keeps up with the key-repeat rate. Switching between tabs is pretty much instantaneous as well. And when you restart your browser, Firefox will only reload a given tab when you click on it, rather than trying to load all of them at once. In other words, isn't the problem more that the OP is using the wrong browser? Or, alternatively, is running that browser on hardware that is just too low on memory (something that
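For what it's worth, the load-on-click behavior described above is governed by a Firefox session-restore preference (visible in about:config); as far as I know it has defaulted to true for some years:

```
// about:config -- restored tabs are only loaded when you click them,
// instead of all reloading at startup
browser.sessionstore.restore_on_demand = true
```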
I don't have those problems (Score:2)
Stop trying to surf the web with your toaster.
But seriously, I expect the answer to performance issues has something to do with pleasing the people who complain about browser memory usage. If you open 15 web pages with tons of graphics and videos and whatever they're going to use a lot of memory. If your computer can't handle it all, something is going to give, and a performance hit has to happen SOMEWHERE as a part of that.
When you open that photo, you're ONLY opening the photo. Easy. When you open a web p
a better question... (Score:3)
Why are most websites so badly designed and so bloated they load slow?
Slashdot before 2.0 was screaming fast. Then they introduced all the JSON garbage that slowed it down. It's all about being pretty over functional.
Browsers are fine. Web design is a disaster. (Score:3)
As long as web sites pack every possible space with bandwidth-sucking animations, and as long as refresh is used with wild abandon by incompetent web designers, browsers will appear slow.
IE 6 only uses 64 megs (Score:4, Insightful)
Just saying ...
First Post! (Score:5, Funny)
What do you mean, 'slow'?
Blame SSL/TLS (Score:3)
Much of the recent blame should be placed on the new universality of using SSL/TLS for everything.
The problem is, negotiating a cold SSL/TLS handshake from scratch takes a certain amount of time... and there's very little you can do to speed it up with current standards. Adding certificate-revocation lookups compounds the problem and adds even more time. In the past, a site that "used SSL" might need to do one or two key exchanges for one or two different hosts. Now, a single page might require a dozen or more key exchanges... and some of those key exchanges might not even be known by the browser to be necessary until after it's done the first few (because some script delivered via https references some other asset via https on a different host).
SSL/TLS itself is generally a good thing... but the WAY it's currently being used was NEVER envisioned as a realistic use case 20 years ago when it was first implemented. Simply put, SSL/TLS key exchange and CRL lookups don't scale well in their current form.
This is also part of the reason why it's so common to end up with "five bars of LTE signal strength, but no working data connectivity" -- mobile apps are "chatty" to begin with, and many of them do a shit job of keeping https sessions alive between requests, so every single background https request requires yet another handshake and revocation lookup. In areas where a cell site has limited backhaul connectivity and occasional surges in the number of users (say, a tower near a large state park serviced by only a T-1 line or two... on July 4th, when 60,000 people might descend on the park for a few hours), the endless key-exchange traffic ITSELF can almost fully saturate the tower's backhaul capacity.
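A rough back-of-the-envelope sketch of how the handshakes add up on a cold page load (every number below is an assumption for illustration; TLS 1.3 and session resumption reduce the round trips, but a dozen cold third-party hosts still pay something like this):

```javascript
// Rough cost of cold connections to the many third-party hosts on one page.
// All figures are assumed, illustrative values.
const rttMs = 50;   // round trip to each host
const tcpRtts = 1;  // TCP three-way handshake
const tlsRtts = 2;  // full TLS 1.2 handshake (TLS 1.3 needs only 1)
const ocspMs = 80;  // one certificate-revocation lookup per host
const hosts = 12;   // distinct https hosts referenced by the page

const perHostMs = (tcpRtts + tlsRtts) * rttMs + ocspMs; // 230 ms per cold host
// Even with, say, 6 connections in flight at once, 12 hosts means two
// serialized waves of setup before the last assets can even be requested:
const wavesOf6 = Math.ceil(hosts / 6) * perHostMs;
console.log(perHostMs, wavesOf6); // 230 460
```

Nearly half a second of pure connection setup, before a single byte of page content arrives, which matches the "what are they even doing?" feeling in the summary.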
This post makes me feel old (Score:4, Insightful)
Ok, I am very astonished by this question, "Why are browsers so slow?", which, since you're most likely a millennial, you might as well re-phrase as "Why are browsers slow AF?"
No offense, but you obviously have no idea what a slow browser really is. Try Netscape 6 or IE6 and let me know if you still think Chrome is slow. First of all, are you sure you can attribute all the slowness to the browser itself? Did you crack open the modern browser developer tools that we all have now (hint: Firebug and the Chrome developer tools didn't exist until a few years ago) and look at the network tab or equivalent, to make sure that the web server/REST service/whatever isn't taking a long time to serve back data? Better yet, try profiling your own JavaScript code with console.log and console.time/console.timeEnd. Since you can use an identifier, you can even do it asynchronously; how fancy is that? I seriously doubt you've done any of these exercises. Most modern browsers take JavaScript, compile it into native code, and cache it, which is about the fastest you're going to get JavaScript to run. Microsoft has completely rewritten its rendering engine multiple times, the latest of which is Edge. Firefox, Chrome, and Safari have all had similar efforts. I've been a web developer for a number of years, and I can tell you, the stuff we have now is lightning fast.
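To make the console.time/console.timeEnd suggestion concrete, here is a minimal sketch (fakeFetch is an invented stand-in for a real request; the point is that each timer's label lets overlapping async operations be measured independently):

```javascript
// Each call starts a named timer; because the label is unique per operation,
// concurrent async work can be profiled without the timers clobbering each other.
function fakeFetch(name, ms) {
  console.time(name); // start a timer identified by its label
  return new Promise((resolve) =>
    setTimeout(() => {
      console.timeEnd(name); // prints e.g. "users: 121.3ms"
      resolve(name);
    }, ms)
  );
}

// Both timers run concurrently and report their own durations.
Promise.all([fakeFetch('users', 120), fakeFetch('posts', 60)])
  .then((done) => console.log('finished:', done.join(', ')));
```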
You try serving up a web page to Netscape Navigator using a cgi-bin perl script from an old version of Apache and let us know if you think modern tech is still slow.
It just appalls me. No matter how much better we make technology, there is always a generation that comes along and tells us all of our efforts are crap because they think everything needs to be bigger, better, faster. If you want to complain, get engaged and make it what you think it ought to be. Until then, you have no room to complain if you're just going to comment from the peanut gallery. Gah.
HTML tables are too slow (Score:3)
One consistent performance problem I see across browsers is that large tables take forever to render, even with static column attributes. You can load an Excel spreadsheet with tens of thousands of rows instantaneously.
The same data in an HTML table pegs a core and consumes a GB or more of RAM, with minutes or more of delay. Often, as the number of rows increases, there is a huge corresponding drop-off in performance until the page becomes practically unusable.
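One mitigation worth knowing about (it won't fix every case): with the default automatic table layout, the browser may need to examine every cell before it can finalize column widths, while `table-layout: fixed` lets it size columns from the first row and render incrementally. A sketch:

```
<!-- Column widths come from the <col> elements / first row, so the browser
     does not have to scan all 10,000 rows before it can start painting. -->
<table style="table-layout: fixed; width: 100%">
  <col style="width: 30%"><col style="width: 70%">
  ...
</table>
```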
Otherwise I have no real issues with browser performance. Sure, browsers like IE do the darndest things you've ever seen sometimes, including getting so slow as to be unusable when displaying large ASCII text files. That's right: not complex HTML, but text rendered as TEXT.
Browsers are like everything else. The EE's give us incredible hardware and we go out of our way to waste it with lazy sloppy coding simply because it's cheaper not to care.
Browsers are the same way: all kinds of hard work on fancy algorithms and performance optimization, obliterated by "developers" who think HTML/JavaScript/CSS are "too hard" and instead insist on piling on layer upon layer, framework upon framework, widget after widget, piecemeal XMLHttpRequest after XMLHttpRequest; filling sites with social-media bugs, cross-site trackers, and ads/malware; and using third-party bugs to get web stats because they are too lazy to install a stats package and analyze their own access logs. Some sites just enjoy selling out their customers to stalking firms, who often cross-sell those customers to other stalking firms, leading to hilarious trees of connections when accessing a single page. Yet others are completely oblivious to what is going on behind the scenes.
Web sites written by people who care about not wasting their users' time load instantaneously. It's amazing. I've never really had issues with browsers themselves being unresponsive, divorced from the bullshit occurring within them. I'm not really sure how what TFA describes is even possible, given how much runs in separate processes/memory spaces these days.
hundreds of tabs? (Score:3)
I think I found your problem:
Which takes ages if you have a hundred of tabs.
I can't imagine my browser being remotely functional if I had hundreds of tabs open.
Too many 3rd party scripts being called. (Score:5, Insightful)
Even with no-script (or especially with it as may be the case), I try to load a page. It partially works, but needs some scripts to work properly. I tell no-script to allow all this page, but then when it reloads it wants even more scripts from more places. Tell no-script OK again, and then when it reloads, it still wants more, or some of them change.
Sometimes it takes 3 or 4 rounds of allowing and reloading just to get pages to render, by the time I've approved the 50 other sites they want to load content from.
Generally speaking, I try to avoid websites that go overboard this way, but it is sadly getting to be way too common. Sometimes the no-script list is so long I have to scroll through it...
Re: (Score:3)
For one thing, users expect to see thumbnails of huge images, sync status badges on Dropbox or other online backup services, multilingual filenames written in scripts whose coverage requires multiple fonts, etc.
Re: (Score:2)
As to refresh, I think that's become a user expectation that you see the most recent information when you pull up a tab.
Has it also "become a user expectation" that you pay once for high-volume (1000 GB/mo) wired Internet access at home and pay again for single digit GB/mo cellular Internet access just to "see the most recent information" rather than what was current when you loaded the page an hour ago before boarding transit?
Re: (Score:2)
'And with ten or fifteen open tabs it eventually becomes sluggish as hell.'
I don't think that's the standard use case for testing, nor should it be. What the hell are you doing with that many tabs open?
"Unfortunately, modern browsers are so stupid that they reload all the tabs when you restart them. Which takes ages if you have a hundred of tabs."
Again, good lord. Hundreds of tabs? What are you even doing?
As to refresh, I think that's become a user expectation that you see the most recent information when you pull up a tab. Having to manually do it isn't something a standard user is going to do.
Maybe what you're looking for is to have 'power user' settings in the browser, so you can keep your hundred tabs open.
I normally have perhaps 30~40 pages live (I don't like tabs, I prefer new windows) when surfing in an ordinary fashion. From time to time, I might even get up to a hundred. My browser does work fine, or to put it another way, roughly the same as when there are only one or two pages open.
What am I doing? Simple ... I refuse to be interrupted by crap. So when I am reading a news story (for example) I stay on that news story page and read the whole thing, then close the window (or tab, if you prefer). And if t
Re: (Score:2)
I do that, but on a much smaller scale. And I suspect that a large proportion of users operate in a similar way.
Deep dive research with 100+ open tabs is not a standard use case for web browsing, it's just not.
Re: (Score:2)
Among other things, I use hundreds of tabs (and am loath to restart my browser) because I want to read whatever content is there later. I don't want a live link to something that may be taken down or changed by then. I just want it as it is.
Re:Because Use Cases (Score:5, Insightful)
Uh, I have 23 tabs open in one window and 39 in another (Vivaldi). I have a dozen open in MS Edge. Lots of them are work related or reference tabs. My Vivaldi tabs are things that I'm reading or things that I always keep open.
I'm not unusual. When I walk past other people's machines, everyone has dozens upon dozens of tabs open.
I get that I'm a huge nerd and that my use case is often niche, but browsers are just as much work tools as entertainment tools these days. It IS a useful test case. Hundreds of tabs is maybe pushing it, but it's definitely conceivable.
Most of the webpages that I have open ARE static content--they're a news story or a review or something that doesn't immediately need updating. It makes it much faster to load. Only a few things like Reddit or Facebook need a lot of refreshing.
Re: (Score:3)
Virtual desktops.
I generally have 6 to 8 virtual desktops, each dedicated to a particular type of task or project.
Within each desktop I typically have a browser, 2 or 3 if the browsers are showing fundamentally different things.
Within each browser I'll have anywhere from 1 to 6 (rarely more) tabs on different slants on the same topic.
So if I'm working on AWS that would be in a virtual desktop. I might have one browser for dynamic stuff (like their console) and one for documentation. The documentation bro
Re:Because Use Cases (Score:4, Informative)
I use the Tree Style Tab extension in Firefox. Not only do I get the tabs on the left side of the screen with easily readable titles, the child tabs are nested in the tree, creating simple organization automatically. I'm not sure whether I would give up Adblock or the tree-style tabs if I had to choose.
Re:Because Use Cases (Score:5, Insightful)
I don't think that's the standard use case for testing, nor should it be. What the hell are you doing with that many tabs open?
Speak for yourself. I normally open a bunch of websites I read, and then just middle-click to open the articles I find interesting in separate tabs. You can quite easily end up with dozens of articles. I wouldn't consider that to be unusual at all.
Re: (Score:3)
Use case example?
I'm searching for X programming topic. I enter it into Google.
Because I'm not a moron, I ctrl-click and load all top-ten results in tabs. I then quickly compare and contrast them to make sure I'm getting the complete picture and not a heavily biased, tunnel-vision answer.
TADA.
Also, I open windows for separate "tasks" and tabs for related information to those tasks. (For example, one window is personal tabs like YouTube music and Gmail.) By time I end a work day, I've had over 180 tabs
Re: (Score:3)
And each one of them doing background work just like a regular application would, at least nowadays. Ask the idiots designing the pages and loading in a bunch of crap, not the browser writer.
If I were running 300 programs on my OS sucking up 100% of the CPU would you call me daft? Sure as shit you would.
Re: (Score:3)
Yes, pages explicitly saying do-not-cache for most if not all elements by default, in order to placate advertisers and trackers, is one of the main reasons. Combined with the move to https, which defeats proxy caching[*] and requires extra computation, the experience is much slower.
[*]: This is why Google pushes and advocates https so much - don't be fooled into thinking they truly care about your privacy. Really, they make money on selling your private information and history. With proxy caching, they can't
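For reference, the do-not-cache behavior described above corresponds to response headers like these (an illustrative example, not taken from any particular site):

```
# Headers that forbid caching anywhere, commonly sent with ad/tracker assets:
Cache-Control: no-store, no-cache, must-revalidate, max-age=0
Pragma: no-cache

# versus headers that let browsers and shared proxies cache a static asset:
Cache-Control: public, max-age=31536000, immutable
```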
Re: (Score:3)
HTTPS doesn't "defeat" a caching proxy near the client. It only requires users of said proxy to take the step of trusting the proxy's root certificate. The one thing it "defeats" is an intercepting proxy that tries to hide its existence from its users.
Re: (Score:3)
If the endpoint's certificate doesn't check out, the proxy server refuses to issue its own certificate for that site.
That doesn't tell you what's wrong. And it always hides the real security details, showing the proxy server's security details instead.
And doesn't help if you need to access a site for which you have the CA but the proxy doesn't.
In practice, https pretty much defeats proxy caching. Which Google just loves.
Re:Add RAM? (Score:5, Informative)
I've done that. Even after maxing my laptop at 2 GB and replacing the HDD with an SSD, some things remain slow. The CPU graph shows one core maxed out (Firefox 50, which disables e10s because of add-ons) or both cores maxed out (Firefox 51 beta, which uses e10s), which wouldn't happen if it were swapping. I'd replace my laptop, but a new 10" has been hard to find for the past four years [slashdot.org] (except for laptops that run Chrome OS and beg the user at every boot to wipe Crouton).