
Why Use Virtual Memory In Modern Systems? 983

Cyberhwk writes "I have a system with Windows Vista Ultimate (64-bit) installed on it, and it has 4GB of RAM. However when I've been watching system performance, my system seems to divide the work between the physical RAM and the virtual memory, so I have 2GB of data in the virtual memory and another 2GB in the physical memory. Is there a reason why my system should even be using the virtual memory anymore? I would think the computer would run better if it based everything off of RAM instead of virtual memory. Any thoughts on this matter or could you explain why the system is acting this way?"
This discussion has been archived. No new comments can be posted.

  • by Alereon ( 660683 ) * on Thursday December 04, 2008 @06:00PM (#25995061)

    Memory exists to be used. If memory is not in use, you are wasting it. The reality is that your system will operate with higher performance if unused data is paged out of RAM to disk and the newly freed memory is used for additional disk caching. Vista's memory manager is actually reasonably smart and will only page data out to disk when it really won't be used, or you experience an actual low-memory condition.
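    The trade-off this comment describes can be sketched with a toy model. All numbers below are invented for illustration, not measurements of any real memory manager: the point is only that RAM holding never-touched app data is RAM not holding disk cache.

```python
# Toy model: RAM is fixed; every page not pinned by cold app data can
# hold disk cache instead. All numbers are hypothetical.
RAM_PAGES = 100
COLD_APP_PAGES = 40      # allocated but almost never touched
HOT_FILE_PAGES = 80      # working set of file pages being read
CACHE_HIT_COST = 1       # arbitrary time units
DISK_READ_COST = 100

def avg_read_cost(cache_pages):
    # Assume file reads are spread uniformly over the hot file pages.
    hit_rate = min(cache_pages / HOT_FILE_PAGES, 1.0)
    return hit_rate * CACHE_HIT_COST + (1 - hit_rate) * DISK_READ_COST

keep_cold_resident = avg_read_cost(RAM_PAGES - COLD_APP_PAGES)  # 60 cache pages
page_cold_out = avg_read_cost(RAM_PAGES)                        # 100 cache pages
```

    With the cold pages resident, only 60 of the 80 hot file pages fit in cache; paging them out lets every read in this toy model hit cache.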

  • Turn it off, then! (Score:5, Insightful)

    by Jeppe Salvesen ( 101622 ) on Thursday December 04, 2008 @06:06PM (#25995131)

    We who know what we are doing are free to take the risk of running our computers without a swapfile.

    Most people are not in a position where they can be sure that they will never run out of physical memory. Because of that, all operating systems for personal computers set up a swapfile by default: It's better for joe average computer owner to complain about a slow system than for him to lose his document when the system crashes because he filled up the physical memory (and there is no swap file to fall back on).

  • by Anonymous Coward on Thursday December 04, 2008 @06:09PM (#25995173)

    You must be confused about virtual vs. physical memory.

    Indeed. When I read this story my knee jerk reaction was "please be gentle." And thankfully the first +5 post on this story is informative and helpful and relatively kind.

    I fear the "turn off your computer, put it in a box and mail it back to the manufacturer" hardcore hardware experts that are going to show up in 3 ... 2 ... 1 ...

  • by TypoNAM ( 695420 ) on Thursday December 04, 2008 @06:13PM (#25995255)

    Actually, no, the author was correct in Microsoft Windows terms. This is the exact text used in System Properties -> Advanced tab under Virtual memory:
    "A paging file is an area on the hard disk that Windows uses as if it were RAM."

    You might think, well, they said paging file, not virtual memory. But click the Change button and you'll see a dialog named "Virtual Memory", in which you can specify multiple paging files on multiple drives if you want to. It defaults to a single paging file on the C:\ (boot) drive. So blame Microsoft for using "virtual memory" and "paging file" interchangeably. I guess by virtual memory they mean the collective use of the paging files (for those situations where there's more than one paging file in use, just as on Linux you can have more than one swap file in use).

    Anyway, I too have seen Windows 2000 and XP make heavy use of the paging file even though there is clearly enough physical memory available. Some friends of mine have even disabled the paging file completely. You get a warning about it at first, but other than that they have reported better system performance and no drawbacks noticed since. This is on systems with at least 3GB of RAM.

  • by ivan256 ( 17499 ) on Thursday December 04, 2008 @06:17PM (#25995331)

    What you're actually complaining about is that Windows did a poor job of deciding what to page out. Sure, you could "turn off swap" if you have enough memory, and you won't ever have to wait for anything to be paged in. But your system would be faster if you had a good paging algorithm and could use unaccessed memory pages for disk cache instead.
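    "A good paging algorithm" means, concretely, a good eviction policy. Here is a minimal sketch of the classic LRU policy next to FIFO (this is a textbook illustration, not Windows' actual algorithm, which is a modified working-set/clock scheme):

```python
from collections import OrderedDict

def count_faults(refs, frames, policy="lru"):
    """Count page faults for a reference string under LRU or FIFO."""
    resident = OrderedDict()   # insertion order doubles as the queue
    faults = 0
    for page in refs:
        if page in resident:
            if policy == "lru":
                resident.move_to_end(page)    # refresh recency on a hit
        else:
            faults += 1
            if len(resident) >= frames:
                resident.popitem(last=False)  # evict LRU (or FIFO head)
            resident[page] = True
    return faults
```

    For the reference string 1,2,3,1,2,4,1 with three frames, LRU evicts the stale page 3 and keeps the re-used page 1, taking one fewer fault than FIFO.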

  • I prefer none. (Score:5, Insightful)

    by mindstrm ( 20013 ) on Thursday December 04, 2008 @06:18PM (#25995343)

    This should generate some polarized discussion.

    There are two camps of thought.

    One will insist that, no matter how much memory is currently allocated, it makes more sense to swap out whatever isn't needed in order to keep more physical RAM free. They will argue until they are blue in the face that the benefits of doing so are real.
    Essentially: your OS is clever, and it preemptively swaps things out so the memory will be available as needed.

    The other camp - and the one I subscribe to - says that as long as you have enough physical ram to do whatever you need to do - any time spent swapping is wasted time.

    I run most of my workstations (Windows) without virtual memory. Yes, on occasion, I do hit a "low on virtual memory error" - usually when something is leaky - but I prefer to get the error and have to re-start or kill something rather than have the system spend days getting progressively slower, slowly annoying me more and more, and then giving me the same error.

    This is not to say that swap is bad, or that it shouldn't be used - but I prefer the simpler approach.

  • Multics (Score:5, Insightful)

    by neongenesis ( 549334 ) on Thursday December 04, 2008 @06:38PM (#25995597)
    One word: Multics. Way too far ahead of its time. Those who forget history will have to try to re-invent it. Badly.
  • Re:File - Save (Score:4, Insightful)

    by JSBiff ( 87824 ) on Thursday December 04, 2008 @06:39PM (#25995617) Journal

    What would you do instead of file save? Continuous save, where the file data is saved as you type? What if you decide the changes you made were a mistake? I think one of the basic premises, going a very long way back in the design of software, is that you don't immediately save changes, so that the user can make a choice whether to 'commit' the changes or throw them away and revert to the original state of the file. As far as I know, Notepad will only temporarily stop the shutdown of the computer to ask whether you want to save the file - yes/no. I don't see how that is such a bad thing.

    Now, you might say that the solution for this is automatic file versioning. The problem is that if you have continuous save, you would either get a version for every single character typed, deleted, etc, or else you would get 'periodic' versions (like, a version from 30 seconds ago, a version from 30 seconds before that, etc) and pretty soon you'd have a ridiculous number of 'intermediate' versions. File versioning should, ideally, only be saving essentially 'completed' versions of the file (or at least, only such intermediate versions as the user chooses to save [because, if you are creating a large document, like a Master's Thesis, or book, you will probably not create it all in a single session, so in that case, you might have versions which don't correspond to completed 'final products', but you probably also don't want 1000 different versions either], instead of a large number of automatically created versions).
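    The explosion of intermediate versions this comment worries about can at least be bounded by a thinning pass that keeps one snapshot per time window. A rough sketch (the 300-second window is an arbitrary assumed policy, not any particular product's behavior):

```python
def thin_versions(timestamps, keep_within=300):
    """Collapse a stream of continuous-save snapshot times (in seconds),
    keeping only the first snapshot in each `keep_within`-second window."""
    kept = []
    for ts in sorted(timestamps):
        if not kept or ts - kept[-1] >= keep_within:
            kept.append(ts)
    return kept
```

    Real versioning systems typically use graduated windows (hourly, then daily, then weekly) rather than a single fixed interval, which keeps the version count manageable while still preserving older history.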

  • by Dadoo ( 899435 ) on Thursday December 04, 2008 @06:44PM (#25995675) Journal

    So blame Microsoft for the confusing use of virtual memory and paging file

    I'm no Microsoft fanboy, but I don't think you can blame them for this, especially when "virtual memory" originally did mean what the OP thinks it does. I'd like to know when the definition changed.

  • by DragonWriter ( 970822 ) on Thursday December 04, 2008 @06:50PM (#25995735)

    The answer is basically to free up RAM for disk cache, based on a belief (sometimes backed up by benchmarks) that for typical use patterns, the performance hit of sometimes having to swap RAM back into physical memory is outweighed by the performance gain of a large disk cache.

    Whether or not it works (and I'm not sure how well it does), there's something odd about swapping out RAM contents to disk so that you can mirror disk contents in RAM.

  • Re:Only 4 GB? (Score:2, Insightful)

    by sexconker ( 1179573 ) on Thursday December 04, 2008 @07:01PM (#25995911)

    I'm all for shitting on FF's memory bloat, but you clearly have 5 tabs open in that one FF window (doing who knows what), and are clearly in the middle of installing the Skype plugin.

  • by Calibax ( 151875 ) * on Thursday December 04, 2008 @07:06PM (#25995979)

    No, I don't think the OP is confused.

    Back in the days of mainframes only, say before 1980 or so, all the systems I worked on (NCR, IBM and Burroughs) used the term "virtual memory" to refer to secondary memory storage on a slower device. Early on the secondary device was CRAM (Card Random Access Memory) and later it was disk.

    But the point is that Virtual Memory originally referred to main memory storage on a secondary device. Furthermore, this is still the term used for paged storage in Microsoft Windows. Check out the Properties page on the "Computer" menu item on Vista or "My Computer" icon on XP which talks about Virtual Memory when setting the size of the paging file.

    The OP is totally correct in his use of Virtual memory both by historical precedent and by current usage in Windows.

  • by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Thursday December 04, 2008 @07:11PM (#25996049)

    One problem is that there are relatively frequent types of disk-access patterns where caching them gives little to no benefit in return for the paging out of RAM it requires. A virus scan (touching most of your files exactly once) is one canonical example. Media playback (touching a few large files in sequential block reads) is another.

    The difficult question is how to exclude these kinds of wasted caching while still retaining the benefits of caching frequently accessed files, and not introducing excessive overhead in the process.
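    One family of answers to that difficult question is scan-resistant caching, along the lines of 2Q or ARC: a block must be touched twice before it can displace anything in the main cache. A toy sketch of the idea (not the actual algorithm of any particular OS):

```python
from collections import OrderedDict

class ScanResistantCache:
    """Tiny 2Q-style sketch: a block enters a small 'probation' queue on
    first touch and is only promoted to the main LRU cache when touched
    again, so a one-pass scan (virus scan, media playback) cannot flush
    frequently used blocks out of the main cache."""
    def __init__(self, main_size, probation_size):
        self.main = OrderedDict()
        self.probation = OrderedDict()
        self.main_size, self.probation_size = main_size, probation_size

    def access(self, block):
        if block in self.main:
            self.main.move_to_end(block)          # normal LRU hit
        elif block in self.probation:
            del self.probation[block]             # second touch: promote
            if len(self.main) >= self.main_size:
                self.main.popitem(last=False)
            self.main[block] = True
        else:
            if len(self.probation) >= self.probation_size:
                self.probation.popitem(last=False)
            self.probation[block] = True
```

    A one-pass scan churns only the small probation queue; the blocks the user actually touches repeatedly stay in the main cache.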

  • by Lord Byron II ( 671689 ) on Thursday December 04, 2008 @07:17PM (#25996123)
    There has been this ridiculous notion floating around recently that swap space and paging files are relics and need to be eliminated. You can safely eliminate them only so long as you're 100% confident you'll never use more RAM than you actually have. But there are lots and lots of memory-hogging applications - video editors, image editors, scientific applications, etc. And when you consider that a web browser can eat up to 300MB of RAM, it shouldn't be hard to imagine a multitasking user running out by using too many little programs.
  • by 0xABADC0DA ( 867955 ) on Thursday December 04, 2008 @07:49PM (#25996547)

    Virtual memory only starts impacting performance when pages are being swapped in and out, because all your processes need more resident memory than you actually have.

    Actually, virtual memory adds quite a bit of overhead to everything the processor does. Just using virtual memory slows down the processor by ~6%, and using it in a traditional OS (separate page tables, processes isolated using virtual memory) slows it down by ~20%. These numbers are from Singularity research, fyi, and don't even count the memory cost from the page tables.

    So the question I'd ask is why use virtual memory in a modern system?

    Singularity, jxos, and JavaOS show that it is neither necessary nor efficient to run programs using 1970s-style protection domains. The actual answer is "because of network effects." So much code is written that requires memory protection that we have to also use virtual memory. Oh yeah, and the Dvorak keyboard is better than QWERTY, you free-market uber alles freaks!

  • by D Ninja ( 825055 ) on Thursday December 04, 2008 @07:58PM (#25996653)

    So, in true CS fashion, I will be lazy and refrain from duplicating effort ;)

    In true CS fashion, you wouldn't have posted in the first place. ;-)

  • Re:Can't hibernate (Score:4, Insightful)

    by drspliff ( 652992 ) on Thursday December 04, 2008 @08:37PM (#25997103)

    You'd have thought that after all this time they could've corrected one of the most annoying "features", which stops me from using Windows for any length of time. It certainly appears that after X amount of inactivity (however that may be classified) stuff just gets swapped out even if you have enough physical memory!

    Considering the way I normally work is to have many applications open, perhaps an IDE, a handful of terminals, a web browser, e-mail client, then spend X amount of time with one application, then switch to another (test/deploy/whatever), then maybe check e-mail & web, by the time I get around to switching to my next task the previous applications have at least partially been swapped to disk.

    When I was using Windows at work, by the end of each day I was getting so incensed by it that it'd be a big hands-in-the-air and muffled swearing whenever it happened. A total productivity killer.

    Let's just say I'm back in Linux & Solaris land now. I have almost the same set of applications open with no problems - and that's on top of running my testing environment on the same machine.

  • by Fulcrum of Evil ( 560260 ) on Thursday December 04, 2008 @08:38PM (#25997113)

    I think you might have awfully high expectations of the paging algorithm

    Can you blame him? Linux does a far better job, so you'd think the biggest software company in the world could put in a good show.

  • by Anonymous Coward on Thursday December 04, 2008 @08:46PM (#25997205)

    I think that "virtual memory" is a correct designation for the swap file. Don't confuse it with "virtual address space".

    If you think this through, it's simple and logical: real memory is RAM, of course, and *virtual* memory must be something that acts as RAM but isn't. The virtual address space is NOT memory; it's just a range of addresses to which real (RAM) or virtual (disk) memory is mapped.
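    The distinction can be made concrete with a toy page table: virtual addresses are just numbers that the table maps to a RAM frame, to a paging-file slot, or to nothing at all. The entries below are made up for illustration:

```python
PAGE_SIZE = 4096

# Hypothetical per-process page table: virtual page number -> where the
# page currently lives.
page_table = {
    0: ("ram", 7),        # frame 7 of physical memory
    1: ("pagefile", 42),  # slot 42 in the paging file on disk
    # virtual page 2 is unmapped
}

def translate(vaddr):
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    entry = page_table.get(vpn)
    if entry is None:
        raise MemoryError("access violation: unmapped page")
    backing, location = entry
    if backing == "pagefile":
        # A real kernel would fault here, read the page into a free
        # frame, update the table, and then retry the access.
        raise RuntimeError("page fault: must be read back from disk")
    return location * PAGE_SIZE + offset
```

    Only the first case is real memory; the second is what Windows' dialogs call virtual memory; the third is an access violation. The address space itself is just the keys of the table.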

  • by blincoln ( 592401 ) on Thursday December 04, 2008 @08:55PM (#25997291) Homepage Journal

    What you do realize is 99% of the human population is dumber than headless chickens.

    Most people are not incredibly knowledgeable about computers. There's a big difference. Pretty much everyone is very good at something. That's why some people get paid to sell merchandise, design hardware, repair engines, cook food, synthesize chemicals, or perform surgery, and others get paid to solve computer problems.

  • Re:Can't hibernate (Score:3, Insightful)

    by daveime ( 1253762 ) on Thursday December 04, 2008 @09:09PM (#25997401)

    When did Microsoft go into the disk drive business ?

    Go spread your anti-ms bs somewhere else please.

  • by Siridar ( 85255 ) on Thursday December 04, 2008 @09:35PM (#25997641)

    I'm fine with folks not knowing about computers. That's cool. The thing that annoys me, though, is that they're /proud/ of it. It's like it's a badge of honor! Any sort of discussion about computer issues will always bring up some yahoo who says "Oh, I don't know a /thing/ about that! hur hur, in my day, all we had was pen and paper..." etc. etc. The fact is, knowledge - basic knowledge - of computers is only going to get more important. Hiding your head under a rock isn't going to magically make it go away.

    And it's not an age thing, either - I've got a friend who is in his 70s, and his knowledge of technical things is way up there - he's a pure Linux guy, uses Myth to serve TV content all around the house, and is a very active member of the local Unix club. Some people just don't seem to want to learn the basics.

  • by The MAZZTer ( 911996 ) <.moc.liamg. .ta. .tzzagem.> on Thursday December 04, 2008 @10:38PM (#25998185) Homepage

    The amount of physical memory you have has nothing to do with the 2GB|2GB split or 3GB|1GB split. You can use your full 3GB of physical memory either way, since the virtual mappings (which are what is split) are different for each app, so two apps can each use 1.5GB of user space and take up 3GB of total memory.

    The actual reason for the /3GB flag is to enable OS support for special applications which explicitly opt in to up to 3GB of memory, usually to help boost performance (they can have a larger cache or buffer or whatever)... normal apps will still only be able to use up to 2GB, for compatibility. Apparently some apps use the highest bit of a 32-bit pointer to store a boolean, since with 2GB of user space that highest bit is always 0. Adding an extra 1GB allows that bit to be 1, which could seriously confuse the program, likely causing a crash.

    SQL Server includes support for 3GB, IIRC.

    I'm not sure if there's a downside to using 3GB (there must have been one for it to be optional and hidden like that), especially since apparently Vista turns it on by default... however, IIRC, while each app gets its own 2GB/3GB of user space, kernel address space is shared between all apps, so your kernel only gets 1GB of total memory. That includes all running drivers. I imagine the XP team thought it was better to allow the full 2GB for drivers, especially since most apps couldn't take advantage of the full 3GB anyway (as I described above).
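    The pointer-tagging hazard described above is easy to demonstrate. Here is a sketch of the (ill-advised) legacy trick, using plain Python integers to stand in for 32-bit user-mode pointers:

```python
FLAG_BIT = 1 << 31          # top bit of a 32-bit user-mode pointer

def pack(ptr, flag):
    """Smuggle a boolean into bit 31, relying on user addresses staying
    below the 2GB line (0x80000000) -- the legacy assumption."""
    return ptr | (FLAG_BIT if flag else 0)

def unpack(packed):
    # Mask to the low 31 bits to recover the "pointer".
    return packed & (FLAG_BIT - 1), bool(packed & FLAG_BIT)

# Below 2GB the trick round-trips safely:
addr = 0x7FFF0000
assert unpack(pack(addr, True)) == (addr, True)

# With /3GB, allocations can land above 2GB; bit 31 is then part of the
# address, and unpacking silently corrupts the pointer:
addr_3gb = 0xA0000000
ptr, flag = unpack(pack(addr_3gb, False))
assert ptr != addr_3gb and flag is True   # pointer mangled, flag wrong
```

    This is exactly why Windows makes large addresses opt-in: an executable must be linked with /LARGEADDRESSAWARE before the OS will hand it addresses above 2GB.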

  • by ChangelingJane ( 1042436 ) on Thursday December 04, 2008 @10:48PM (#25998275)
    Part of it too is that *everybody* has their stupid moments. The kind where, afterward, you realize just how dumb you were. Some like to pretend they've never been guilty of it, but they're often the worst offenders. (Same goes for doing asinine things while driving.)
  • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Thursday December 04, 2008 @11:04PM (#25998393) Homepage Journal

    Most people are not incredibly knowledgeable about computers. There's a big difference.

    Well, when you work tech support for any length of time, you learn that many people utterly lack any problem-solving skills. Geeks often equate problem-solving skills with intelligence, hence the (not unreasonable) impression that many people are not very smart.

    Example: people calling to complain that the ISP I worked for sucked and was a pack of incompetent jackasses because they couldn't dial in. More specifically, they could not turn their computer on. This was because lightning had struck the house and destroyed the entertainment center and kitchen appliances, and presumably their computer as well. Their interpretation of events was that they couldn't turn on their computer because we ran a poor ISP.

    Granted, the perception is exacerbated because by definition you only get the people unable to fix their own problems. Still, it's hard to answer "do I have to plug in my CPU (modem) to check my email?" a few hundred times without losing a little faith in humanity.

  • Re:Agreed (Score:3, Insightful)

    by Rary ( 566291 ) on Thursday December 04, 2008 @11:44PM (#25998689)

    You really don't want hundreds of megabytes of BloatyApp's untouched memory floating about in the machine. Get it out on the disk, use the memory for something useful.

    That's the part I've never understood, and I suspect the article submitter is having the same problem I have with this.

    You see, if you've got a ton of physical RAM, then the assumption is that much of it is already just sitting there unused. If you had "something useful" that needed to be done, there's plenty of memory available in which to do it. So why swap out "BloatyApp's untouched memory"? This just makes it so that even more memory is going to waste?

  • by The Mighty Buzzard ( 878441 ) on Friday December 05, 2008 @12:50AM (#25999099)

    Never going to happen.

    When cars were first sold to the public, if you bought one you'd damned well better know how to fix it yourself. Fast forward to now. Plenty of idiots still buy cars and are completely fucked when it comes time to do something as minor as changing the oil or spark plugs. </gratuitous car analogy>

    That's around a hundred years of people refusing to learn really simple shit. What makes you think it will be different with computers over a shorter timespan?

  • by lgw ( 121541 ) on Friday December 05, 2008 @01:54AM (#25999451) Journal

    Your bloated fat programs like Mozilla and your game will have plenty of pages they never plan to use, because of memory fragmentation and other issues. In a typical malloc implementation, you will end up with pages that have NO data in them but still can't be returned to the operating system. Paging lets these be dumped at essentially no cost.

    No, no, no! Paging always has a cost, paid in disk I/O. And if you *never* run out of memory, why pay that cost? Plus, in practice, OSs tend to swap out stuff they shouldn't, and end up swapping it back in again, even though there was no memory shortage. Windows particularly sucks at this, but no one is perfect.

    Again, if my box has more memory than will ever be needed by my OS, my apps, and all of my non-streaming files together, what crazy reason could there possibly be to swap? I swear, it's like some religious ritual or something!

  • by Anonymous Coward on Friday December 05, 2008 @02:38AM (#25999669)

    In my case, it's both.

    Outwardly, I provide polite and gentle answers.

    Inwardly, I hate the user. A lot.

    I really wish I didn't feel this way. Intellectually I know that some people need a lot more hand-holding than others when it comes to computers and technology. But for some reason it completely and utterly infuriates me.

  • by Eivind ( 15695 ) <eivindorama@gmail.com> on Friday December 05, 2008 @03:00AM (#25999787) Homepage

    Actually, that may be politically correct, but it's not true.

    Sure, everyone has strong and weak sides, but nevertheless, the tendency is for some people to know a lot about a huge array of topics, and for other people to be pretty unknowledgeable about pretty much everything.

    That nobody can specialize in everything is however true.

    You do need one person as a surgeon and a different one as a cryptographer, true. But still, the odds are that either of them will know more about the basics of the other's work than a random person you ask on the street.

  • by macraig ( 621737 ) <mark@a@craig.gmail@com> on Friday December 05, 2008 @03:13AM (#25999829)

    Ummm... no. There are a statistically significant number of humans who aren't notably good at anything. I have unhappily encountered too many of them, both in and out of tech work. This is akin to actually believing that "all men are created equal" merely because it would be really really neat and make you feel all warm-and-fuzzy inside if it were true.

    Even if your pollyanna perspective were true, being competent at some task doesn't directly equate with an absence of dumbassery. There are numerous species of "dumb" creatures that can be trained to memorize some task and then mimic (repeat) it perfectly ad nauseam... including H. sapiens. An ability to memorize and mimic doesn't equate directly with intelligence. It's a precursor, a prerequisite, perhaps, but not the Real McCoy.

    A shocking number of humans, including many regarded as "average" by testing standards, never actually reach a state of true intelligence. Too many of them are profoundly ignorant and quite determined to remain that way.

  • by mathew7 ( 863867 ) on Friday December 05, 2008 @03:52AM (#26000031)

    I am not an expert at systems, but it seems to me that he is saying that Windows tries to write disk buffers ("file cache"?) to the pagefile in order to make more space.

    Well, you did get this idea wrong. Disk cache is treated differently and IS NEVER swapped out (MS is not THAT stupid). However, the kernel may swap out an application which you just minimized in order to increase the cache size.
    My main frustration with NT-based kernels is that you cannot limit the cache size. At my work a virus scanner runs weekly. Working during scanning is murder, because Windows grows the cache size to the whole of physical memory and swaps out everything, except maybe the current application - but I always use 3-4 applications which require real-time response. I heard they changed this behavior in Vista, but I have not tested it. At least in Win98 I could (and always did) limit the cache to 1/4 of physical RAM, and it worked perfectly.
    As a comparison, in Linux you barely get any swapping to disk if you use up to 80% (an estimate, no hard evidence) of your physical RAM. I always have 784MB of swap partition regardless of RAM (512MB-4GB). The swap size is because of partition alignment: I always align to a multiple of 1000 cylinders, but I give 100 cylinders from the Linux partition to swap space. This is because if an HDD failure occurs (SW or HW), I know where to look for partitions.

  • by Guido del Confuso ( 80037 ) on Friday December 05, 2008 @05:40AM (#26000557)

    Uh, no. When cars were FIRST sold to the public, if you bought one you could afford to pay one of your servants to maintain it.

    Besides, that's still a bad analogy, because it's not that most people couldn't change the oil or spark plugs on a car, it's just that it's too much of a pain in the ass for people to do it. I could teach anyone how to do it in theory. You just follow a few simple steps. But it's much easier to simply pay a guy 25 bucks every couple of months than have to crawl under the car, muck around with dirty oil, figure out where to dispose of the old stuff, and so on. Given that, there's not really any real need for me to know how to do it, any more than I need to know how to perform surgery or cook escargot. Although in point of fact I do know how to change the oil on my car (having changed the oil on numerous motorcycles purely for the fun of it), I see no reason to call anyone who doesn't have a clue how to do it an idiot.

    Computers are getting simpler. They are getting to the point where it makes sense to learn how to use them and how to fix them when something minor goes wrong. This is the standard level for computer literacy. A better car analogy would be to observe that when cars were first sold to the public, they were complicated to operate, difficult to start, and not many people saw the use of them. Over time, however, they became simpler and simpler, to the point where it is reasonably expected that any given adult will be able to drive a car. This is what is increasingly happening with computers. Some from the older generations will learn to adapt to the new technology, and some will not. But within our lifetimes, computer competence will be expected of people, especially once computers have become simple and ubiquitous. To an extent, this is already the case. However, the general expectation is not that anybody could write software (i.e. design a car part) or be able to fix computers that have suffered a serious malfunction (i.e. replace the cooling system). It's not even that most people are expected to be able to handle routine maintenance on their own, hence the need for automatic software updates - you don't need to understand the details, just that you need to do it every so often. Just like changing your car's oil.

  • by mirkob ( 660121 ) on Friday December 05, 2008 @06:18AM (#26000757)

    Ummm... no. There are a statistically significant number of humans who aren't notably good at anything.

    To be more precise, a lot of people are not good at anything that we know of and value; they could be good at trivial things (soccer strategy) and never know it, never let others know, and certainly never apply it to anything useful.

    But theoretically anyone could have some bright (but useless or unused) capacity.

    There are numerous species of "dumb" creatures that can be trained to memorize some task and then mimic (repeat) it perfectly ad nauseum... including H. sapiens. An ability to memorize and mimic doesn't equate directly with intelligence. It's a precursor, a prerequisite, perhaps, but not the Real McCoy.

    You forget a prerequisite of the evolution of intelligence: THE NEED FOR IT.
    If they can live their lives in a dumb way, many lazy people never try to reach a higher level of intelligence - never even feel their lack of it!

    A shocking number of humans, including many regarded as "average" by testing standards, never actually reach a state of true intelligence. Too many of them are profoundly ignorant and quite determined to remain that way.

    The depth of their ignorance is a huge problem, because they never perceive their ignorance, and many of the few who do perceive it refuse to believe it and attack whatever made them feel stupid.

    That is the sad problem.

  • Re:Can't hibernate (Score:4, Insightful)

    by master_p ( 608214 ) on Friday December 05, 2008 @06:23AM (#26000791)

    Does this happen a lot in the USA? If a light in the fridge goes out, do you buy a new one? When a tire is blown out, do you buy a new car?

    Gee, and then some people wonder why Americans spend 50% of the global resources...

  • by crmarvin42 ( 652893 ) on Friday December 05, 2008 @08:33AM (#26001437)
    I'm in a similar boat with comcast. They keep charging me a rental fee for a modem that I PURCHASED 3 years ago. Every time I call they say that the office capable of checking the MAC ID against a list of their own to verify that it's mine and not theirs is closed and that they'll get back to me by the end of the next business day. Well it's been about a month now, another bill has arrived with the rental fee, and I'm still having zero luck getting ahold of anyone to address the issue.

    My only recourse now is to be the biggest A-Hole I can to get the attention of the Higher-Ups, or cancel my service and switch to DSL for an extra $30/month.

    I've had similar problems with my credit card company trying to double my APR despite my reducing my overall balance and never missing a payment with them or anyone else. I acted like a total jerk on the phone, and it was taken care of within 15 minutes of my becoming a jerk.

    Being an a$$ may not be good karma, but it is more effective than being nice and taking it up the tail pipe.
  • Long before Windows, virtual memory was 'invented'. Given that, the term has a specific meaning. As others have mentioned, it is a method for making programs believe they have unlimited memory space, whilst sharing the actual available physical memory between numerous programs. This 'feature' has a cost - references to memory must be translated from a virtual address to a physical one, by memory management hardware (and sometimes software). Until recently, Intel processors used a separate chip for the memory controller; AMD moved theirs on-die a few years ago. In terms of memory performance, Intel lagged for the past few years because the off-chip controller consumed extra time to do its job. Moving the controller on-die removes an electrical interface or two, thus speeding things up and generally improving efficiency.

    The original post, I thought, was brilliant. Why are we devoting all this chip real estate (or, in the past, chips) to sharing a scarce resource (memory) when that resource is no longer scarce? Granted, virtual memory gives us other advantages, such as ensuring one program doesn't write in the memory space of another, but surely there are other ways to do that. If we did away with virtual memory and returned to the old (ack! DOS) days of physical memory references, we could devote that chip real estate, power quota, etc. to other worthwhile pursuits, like making my Twitter pages load faster.

  • by jgranto ( 921808 ) on Friday December 05, 2008 @11:44AM (#26003235)
    I know I am late in replying, but maybe you'll look down this far. The simple fact is, regardless of how much RAM you have, there are some things that Windows wants to page. Period. So you really have two situations in which the system pages (grossly simplified):

    1. The system is low on physical memory (RAM). This is what people USED to encounter a lot. With RAM being cheap nowadays, this happens far less. When this happens, your system will slow down a lot, and it is considered a Bad Thing(tm). The reason the system slows down is that RAM is fast and disk is slow. Paging (aka using the swapfile) is a slow process compared to RAM. Similarly, there are wasted cycles as the system determines that it has to page, so this type of virtual memory use needs to be avoided.

    2. Paging as architected. Windows, regardless of the amount of RAM, will page certain things. Even if you had 32GB of RAM, it would still page some stuff. This paging should not affect performance.

    You are experiencing paging type #2, most likely. It is possible you are thrashing your system's 4GB of memory through the use of big applications, but unlikely. If Task Manager or perfmon show lots of free physical memory, you are paging normally and should not be concerned.

    Of course, it is possible that you have an application improperly paging data, artificially causing situation #1, which would be bad, but this is unlikely. If you see lots of disk access and experience system slow-down, this is an indication something is wrong.
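
Situation #1 can be illustrated with a toy simulator (pure Python, all numbers invented): once more pages are touched than there are physical frames, the least-recently-used page gets evicted to the "pagefile", and every subsequent miss pays a slow disk round-trip.

```python
from collections import OrderedDict

RAM_FRAMES = 3  # pretend physical memory holds only three pages

resident = OrderedDict()  # page -> data, ordered by recency of use
swapped_out = {}          # the "pagefile"

def touch(page, data=None):
    """Access a page, faulting it in (and evicting the LRU page) if needed."""
    if page in resident:
        resident.move_to_end(page)      # fast path: already in RAM
        return resident[page]
    if len(resident) >= RAM_FRAMES:     # situation #1: RAM is full
        victim, vdata = resident.popitem(last=False)
        swapped_out[victim] = vdata     # slow: write victim out to disk
    resident[page] = swapped_out.pop(page, data)  # slow: read page from disk
    resident.move_to_end(page)
    return resident[page]

for p in range(5):          # touch five pages with room for only three
    touch(p, data="page-%d" % p)
print(sorted(resident))     # -> [2, 3, 4]; pages 0 and 1 were paged out
```

With enough RAM the eviction branch never fires, which is exactly why situation #2 (paging by design) doesn't hurt while situation #1 (thrashing) does.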
  • by howlingfrog ( 211151 ) <ajmkenyon2002&yahoo,com> on Friday December 05, 2008 @03:06PM (#26005841) Homepage Journal

    Geeks often equate problem-solving skills with intelligence,

    It's certainly an oversimplification to say that problem-solving skills are equivalent to general intelligence, but it's also an oversimplification--a much more misleading one, in my opinion--to say that they're only one component of general intelligence, no more or less important than any other.

    There are a lot of different types and components of intelligence. Some of them work in parallel--if you have weak analytical skills but strong emotional intelligence, the former won't interfere much with the latter, and the latter won't make up for the former. Some of them work in series, though, and a chain is only as strong as its weakest link. Problem-solving in particular can be a bottleneck for many, maybe all, other components of intelligence. Imagine using a map to drive around an unfamiliar city. If you have strong spatial awareness but weak problem-solving skills, you're not going to have any more success than someone with the same problem-solving ability and much weaker spatial intelligence.

    If Alice has equally low abilities in all forms of intelligence, including problem-solving, and Bob has moderate to high ability in all forms of intelligence except that he's no better at problem-solving than Alice, then functionally, there will be very little difference between them. Improving their problem-solving skills by equal amounts would result in Bob being smarter than Alice, but I don't think it's accurate to say that he already is.

    The good news is that problem-solving is a learned skill, so it can be improved at any stage of life. The bad news is that the culture of the industrialized West over the last 50-60 years is uniquely awful at teaching problem-solving. Almost all of our entertainment is intellectually passive. We read books and watch TV and movies instead of telling stories; we listen to music instead of singing or learning to play an instrument; we watch other people playing sports instead of playing them ourselves; we go to museums instead of doing research. The only major form of entertainment that involves problem-solving is game-playing. Board games, card games, etc., are shrinking in popularity. Nearly everyone still engages in them sometimes, but the amount of time the average person spends with them is plummeting. Video games may be starting to pick up the slack, but the sorts of games where problem-solving is a major component are a minority. Guitar Hero and Madden are fun, but there's no problem-solving involved. The games that really have problems to solve are popular only within a small, slow-growing subculture.

    Even outside entertainment, we have little opportunity to improve our problem-solving skills. We work for mega-corporations where our jobs are designed to involve as little creativity and flexibility as possible. Decision-makers and problem-solvers are presented as elites who must be consulted whenever anything unusual happens. And as much good as ultra-specialization has done for the world, it has resulted in a situation where people believe that anything outside their own specialty is completely inaccessible--any time we have a problem that we haven't already memorized a solution to, we go immediately to an expert. We never try to solve any problems outside our own specialty, so we never learn how to apply existing skills to new situations.

  • by jon3k ( 691256 ) on Monday December 08, 2008 @01:53PM (#26035845)
    Windows never says that virtual memory is a memory paging file. They have a section titled "Virtual memory" with an OPTION for the size and location of the memory paging file, which is a feature of the virtual memory subsystem. I'm sorry if you don't understand the difference, I've explained it as simply as I can.

1 + 1 = 3, for large values of 1.
