Are Cheap Laptops a Roadblock for Moore's Law?
Timothy Harrington writes "Cnet.co.uk wonders if the $100 laptop could spell the end of Moore's Law: 'Moore's law is great for making tech faster, and for making slower, existing tech cheaper, but when consumers realize their personal lust for faster hardware makes almost zero financial sense, and hurts the environment with greater demands for power, will they start to demand cheaper, more efficient "third-world" computers that are just as effective?' Will ridiculously cheap laptops wean consumers off ridiculously fast components?"
I doubt it... (Score:4, Insightful)
Re:I doubt it... (Score:5, Insightful)
Moore's Law in Dynamic Equilibrium? (Score:2)
Let's consider the above phrase. There are many opposing forces to Moore's Law if we draw a free-body diagram. Some people don't want better computers, as we hear "I just use it for e-mail"; there are those with little time of their own to be ambitious with a computer, though they may use fairly powerful software at work. Then there's competition from the third world, where people who before couldn't afford anything good may now be able to buy a computer that has a built-in UPS and wireless netwo
Re: (Score:3, Informative)
Re:Moore's Law in Dynamic Equilibrium? (Score:4, Insightful)
Re: (Score:2, Interesting)
Re: (Score:2)
If you put aside the gamer community, just how much power does a person need to surf the web, write documents, and slideshow family photos? I know my experience is purely anecdotal, but for years I was always aching to upgrade because computers used to be so darn slow. I don't feel that kind of pre
Re:I doubt it... (Score:4, Interesting)
There may at one time have been a feeling of power in being able to render a downloaded web page quicker or have a more responsive GUI, but today's highest-end models don't offer the same benefit over a lower-end model.
I remember drooling over other departments at work when they got new computers and ours hadn't arrived yet. Now, there isn't much that I need to go faster. Top-of-the-line computers are no longer a status symbol: a bigger computer isn't that impressive, you can't tell what kind of processor a computer has by looking at the outside, and nowadays you often can't tell even by using it.
Of course it won't halt moore's law (Score:4, Insightful)
Besides, if cost were the biggest issue in computing, then Linux would be the ubiquitous desktop.
Re: (Score:2)
Correct, because the consumer market may shift in that direction.
Ignorant because TFA completely ignores the business market.
There will always be businesses who need the fastest, highest powered hardware available.
Re:Of course it won't halt moore's law (Score:5, Insightful)
Re:Of course it won't halt moore's law (Score:5, Insightful)
We may have unprecedented demand for low-power 200 MHz ARM processors these days, but we also have unprecedented demand for quad-core 2 GHz beasts in 1U rack-mount servers, so we can stuff more and more of them into vast underground data centers. Moore's law applies equally to the low end and the high end. Today we can put a powerful computer in a $500 iPhone, maybe tomorrow we can put it in a $50 iWatch. There's absolutely no economic reason for Moore's Law not to continue unabated.
Re: (Score:3, Insightful)
Since this is the norm when discussing Moore's "law", I'd rather see one of those mythical non-retarded arguments regarding it. There are none.
Re:Of course it won't halt moore's law (Score:5, Insightful)
It's every 24 months, not 18, and it has nothing to do with power or speed. CPU speed has increased at a significantly higher pace than Moore's law. Moore's law tracks the number of transistor junctions in an IC, nothing more. The size, power consumption, MIPS, and other values have had significantly different curves, most at higher paces than the law, and not in direct proportion to transistor count. CPU power (in watts) overall is relatively the same as where it started in the 80s, and is currently decreasing even as Moore's law continues. http://www.eng.tulane.edu/Tef/Slides/Tulane-Moore
Also, Moore's law clearly states that the number of transistors doubles "as costs remain the same." This means that if we can have a $100 laptop today, in 2 years it will still cost $100 (or, more accurately, the portion of the $100 cost represented by the CPU will be the same), but the CPU will have 2X the number of transistors. It may be faster, maybe not. It may use more or less wattage. That is determined by transistor spacing, impedance layers (SoI, etc.), voltage, and clock frequency, not Moore's law. The article's premise is simply a logical fallacy.
One more thing: Moore's law does not apply to EVERY processor, only the leading generation vs. the predecessor. There's no reason to believe the notebook will use the current processor generation, and in fact likely it will not. This has no impact at all on the validity of the law as other processors will exist that follow the law. They may simply decide that instead of the build cost for the notebook being $90 to sell at $100, that they'll use previous generation hardware using more modern manufacturing processes, and reduce the build cost to $60-80, and still likely make it faster or better somehow in the process.
Were I a betting man, I'd put money on the $100 laptop not only having a faster chip with more transistors, but also using fewer watts and having a higher-resolution display, a faster or stronger wireless antenna, more storage, and more ports when we look at it in 2 years. Of course, part of the design of the machine, and its low cost, is the intent of model-line longevity. We don't expect to have a new one of these every 2-4 months like the retail PC industry does. Likely, this will be re-engineered at most once per year.
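The constant-cost doubling described in the comment above is easy to sketch numerically. The starting transistor count and the CPU's share of the $100 build cost below are hypothetical, purely to illustrate the arithmetic:

```python
# Sketch of Moore's-law scaling at constant cost: the CPU's share of a
# $100 laptop's bill of materials stays fixed while its transistor
# budget doubles every 24 months. All numbers are hypothetical.

def transistors_after(initial: int, years: float, doubling_years: float = 2.0) -> int:
    """Transistor count after `years`, doubling every `doubling_years`."""
    return int(initial * 2 ** (years / doubling_years))

cpu_cost = 20          # assumed CPU share of the $100 build cost, in dollars
t0 = 10_000_000        # assumed transistor count today

for years in (0, 2, 4):
    print(f"year +{years}: ~{transistors_after(t0, years):>11,} transistors "
          f"for the same ${cpu_cost}")
```

Whether those extra transistors buy speed or lower wattage is, as the comment says, a separate engineering question.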
Re: (Score:2)
No, They are NOT (Score:4)
2 cents,
QueenB.
Re:No, They are NOT (Score:5, Funny)
Re: (Score:2)
Moore's Law Intact (Score:2)
I'm pretty sure Moore's Law will remain intact.
Re: (Score:2)
Business computing means running a word processor and a spreadsheet. Once people realise that what you don't need is Windows, then a PII is fine.
Re: (Score:3, Insightful)
Not fast enough yet... (Score:5, Insightful)
Re: (Score:2)
There will always be a need for faster, more capable machines. We'll want them to do new things, and we'll want them to do old things better. We'll want better-than-HD video uncompressed, delivered instantly, playable by a machine that's busy doing many other complex things at once. We'll want extraordinarily complex data indexing. We'll want lifelike 3D. We'll want artificial intelligence. We'll want compl
Gamers will always make Moore's Law Relevant (Score:2)
I was talking with our head of IT the other day. He is a serious gamer who just purchased a $500 USD video card. He buys the latest and greatest video card about twice a year (selling his old one on on ebay) and upgrades his motherboard once every two years. He has no plans to stop doing this. Ever.
Re:Gamers will always make Moore's Law Relevant (Score:5, Insightful)
And we all need suckers like him to buy the latest overpriced, overhyped hardware, so that we can wait a couple of years and buy the next generation for 1/10 the cost.
The "early adopters" get what they want - which is mostly "I want it now!" , and the rest of us get what we want, which is improved hardware cheaper by waiting a bit.
Look at the people who paid $500 for a 15" LCD screen with crap specs, when you can now buy a 20" for $150.00.
Same thing with video cards - they paid $500 for a card with a quarter-gig of ram - those cards are now under $100.00
Let them keep spending - the benefits trickle down to the rest of us because we're patient.
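The trickle-down examples above imply a steep annual price decline. A rough sketch, assuming the drops took about two years (the comment only says "a couple of years", so the window is an assumption):

```python
# Implied annual price decline if a part drops from `p0` to `p1`
# dollars over `years` years, assuming a constant exponential decay.

def annual_decline(p0: float, p1: float, years: float) -> float:
    """Fraction of the price shed each year."""
    return 1 - (p1 / p0) ** (1 / years)

for name, p0, p1 in [("15-inch LCD", 500, 150),
                     ("quarter-gig video card", 500, 100)]:
    print(f"{name}: ~{annual_decline(p0, p1, 2):.0%} per year")
```

Roughly 45-55% of the price evaporating per year is what makes the "wait a bit" strategy pay off.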
Re: (Score:3, Interesting)
I know what you are saying. I (very politely) explored that with him. Here was what he had to say to economically justify his gaming life style.
Re: (Score:3, Insightful)
The difference is that if you upgrade your card every 2 years, you still have your old one. If you upgrade all your hardware in the same fashion, you end up with both a new machine AND a backup machine that's 2 years old, and still has a lot of life left in it.
In the case of video cards, think dual (or more) displays as one use for a second, older card. I'm running dual at the office, and triple at home.
somebody doesn't understand Moore's Law (Score:5, Insightful)
Here's a thought - maybe those $100 laptops become cheaper, or more capable over time.
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:3, Informative)
Multicore is becoming popular because instruction-level parallelism has approached a practical limit, not capacitance. Basically, processor designers are getting all these "free" transistors and don't know what to do with them except add cores.
Processor speed limits come from heat generated by switching speeds, combined with heat from leakage current. Improved transistor density actually improves the heat generated by switching, but has to be balanced agai
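The switching-heat point can be made concrete with the standard dynamic-power relation P = a·C·V²·f. The component values below are arbitrary illustrations, not figures for any real chip:

```python
# Dynamic switching power: P = a * C * V^2 * f, where a is the activity
# factor, C the switched capacitance, V the supply voltage, and f the
# clock frequency. All values below are arbitrary illustrations.

def dynamic_power(activity: float, cap_farads: float,
                  volts: float, hertz: float) -> float:
    """Switching power in watts (leakage not modeled)."""
    return activity * cap_farads * volts ** 2 * hertz

# Halving V cuts switching power by 4x at the same clock,
# which is why voltage scaling mattered so much historically.
p_high = dynamic_power(0.2, 1e-9, 1.2, 2e9)   # 1.2 V, 2 GHz
p_low  = dynamic_power(0.2, 1e-9, 0.6, 2e9)   # 0.6 V, 2 GHz
print(f"{p_high:.3f} W vs {p_low:.3f} W")
```

The quadratic dependence on voltage is the lever; frequency and capacitance only enter linearly, and leakage (ignored here) is what spoils the picture at small geometries.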
Re: (Score:3, Informative)
Yes, that's true, but do not discount the effects of die capacitance. Each transistor presents a load to the signal, each interconnect presents a time delay and when you put them together, you have to overcome the problem of t
No. (Score:2)
Honestly, email and web browsing never required much past computers from, say, 1995. Is everyone using 12-year-old computers? No.
Moore's Law is the *enabler* for cheap laptops (Score:5, Insightful)
Contradictory Summary? (Score:2, Insightful)
And then asks: but when consumers realize their personal lust for faster hardware makes almost zero financial sense, and hurts the environment with greater demands for power, will they start to demand cheaper, more efficient 'third-world' computers that are just as effective?"
So Moore's law is good for going smaller/faster/cheaper, but the demand for s/f/c will spell the end of Moore's law?
Re: (Score:2)
I'd prefer going cheaper. The resulting processors would probably stay cooler, too.
Re: (Score:2)
No (Score:5, Informative)
Re: (Score:2)
CNET should be wondering if they actually know what Moore's Law is.
More strength (Score:3, Insightful)
Put another way: There are BAZILLIONS of cheap, ARM-based CPUs out there running everything from microwaves to kiddie toys. Have they put an end to Moore's law?
What actually MIGHT put an end to Moore's law is the actual quantum limits to computation. And we *will* hit those limits if we don't blow ourselves up first. But that's a ways off, and we may find some way past those limits as well. (EG: using other, N-dimensional space or something exotic that we can't even imagine yet)
Re: (Score:2)
At which point, the way things are looking, it should kick in right about the time quantum computing becomes feasible, and a whole new 50-year cycle of Moore's Law begins.
This isn't thermodynamics (Score:5, Insightful)
Will it hold up forever? Probably not; it could speed up or slow down by an order of magnitude as semiconductor technology is replaced by The Next Big Thing (Optics? Quantum? Duotronics?), and our measurement criteria might have to change with it.
So again, the real story is that Moore's observation has held up so spectacularly so long. Lulls in performance increases are natural. But how does it plot over time?
Re: (Score:3, Insightful)
From dictionary.com: "A statement describing a relationship observed to be invariable between or among phenomena for all cases in which the specified conditions are met: the law of gravity."
The law of gravity has never yet been broken, but that doesn't mean it won't be. It's the same for Moore's Law.
While I'm sure it was called a 'law' initially as a jest (a la Murphy's law), it has
Re: (Score:2)
Moores law (Score:2)
Need to get the low energy magnetic memory spintronic processing train moving.
Or photonics.
External pressures (Score:4, Insightful)
Apart from the small percentage of hackers/enthusiasts who play with computers because they like computers, the majority of people use computers to achieve goals - be it to write their work documents, play games, edit photos etc. They will buy the machines that can run the software to do these jobs.
I can't see the big software players reducing the power requirements of their software as it upgrades. Microsoft Office 2015, Photoshop v.27, and World of Warcraft 2015 are going to need more rather than less power and people will be forced to buy more powerful machines.
Re: (Score:2)
"Microsoft Office 2015"
Do you really believe such a product will even exist? Or that it will do all that much more than Office 97? Software is hitting the same barriers as hardware - a lot of older software is "good enough", just as a lot of open/free software is "good enough." A lot of what were software monopolies are being encroached upon by "good enough", same as hardware.
Vendor lock-in is deteriorating - look at the resistance to OOXML (Microsoft's proprietary format) as opposed to ODF. The browse
Cheap laptops (Score:2, Informative)
People use their cheap underpowered laptop, get frustrated
If laptop makers didn't tempt consumers with their underpowered crap, maybe they would have a decent reputation. I don't see how Moore's law is affected.
Apple is the only computer manufacturer whose low-end PCs actually perform tolerably.
Re: (Score:2)
Re: (Score:2)
Holy flamebait batman!
Sure, when you compare bottom of the pile apple gear to bottom of the pile pc gear, you're right, except you'd BETTER be right at the price difference for that comparison!!!
Try comparing apples and apples next time. Apple gear is expensive when comparing like hardware.
What the Hell? (Score:2)
One time effect? (Score:2)
Given that as of now, I can configure a fairly decent desktop (comparable to a sub-$800 laptop) for under $400 (including a monitor), the craze for cheap laptops might mean a resurgence in desktops for all we know.
And you can never count out the masses who spend $2000 + on their up-to-date PCs (whether i
Post is a Troll? (Score:2)
1. High performance server/business hardware will still be in demand.
2. Modern operating systems with all the bells and whistles that we're used to will need expensive hardware to run.
3. The trend is for home users to play video and audio which you just can't do (well) on a $100 computer.
Of course not. (Score:3, Insightful)
Moore's law is about transistors per area and cost per transistor. Cheap laptops have nothing to do with that.
But for the question that was *meant*, rather than what was asked... still no. There are some applications that can use basically unlimited computing power (and now, unlimited computing power with minimal electrical power), and everyone else benefits from developments geared towards those areas.
Machrone's Law (Score:3, Informative)
I doubt it (Score:5, Interesting)
My cellphone is now more powerful than the first computer I used. It supports up to 1GB of removable storage in about the smallest form factor I've ever seen (micro SD). Its built-in camera is as good as the first digital camera that I owned.
In other words, yes, people may start demanding smaller and more powerful devices - but so what? It just means that instead of speed doubling, power use might start decreasing, storage density might increase, who knows what. We're using computers for purposes I never would have dreamed of ten years ago. I have a computer under my TV that records shows - I never saw that coming until it did.
Computers will continue to evolve. The laptop and desktop might start to fade out, but new devices will take their place.
Jeesh (Score:3, Informative)
http://arstechnica.com/articles/paedia/cpu/moore.
Instead of placing twice as many transistors on a CPU, you can instead place twice as many CPUs (a few fewer, for the sticklers) of the same transistor count on a single wafer. Even if consumers no longer care about FLOPS, they will still be swayed by lower cost, longer battery life, smaller dimensions, and passive/quieter cooling.
The "$100 laptop" depends on Moore's law (Score:2)
Currency should be factored in (Score:2)
The problem is dollars are losing half their value every 3 years. A thing measured in dollars is going to become worthless faster than a thing measured in units that don't lose value. If you measure clockspeed vs. ounces of gold, you get a better relation between clockspeed and time than if you use dollars.
Unfortunately Moore wasn't an economist. He didn't understand th
Re: (Score:2)
Electronics sure don't follow that.
Food doesn't follow that. (Especially fast food.)
Gas does... But that's a special case.
In fact, let's give up on the specific examples. http://inflationdata.com/inflation/Consumer_Price_Index/CurrentCPI.asp [inflationdata.com] That shows us that inflation is nowhere NEAR the 15% you claim it is. (3 years, etc etc.)
So yes, measured per dollar
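For what it's worth, the grandparent's "half their value every 3 years" premise pins down an annual inflation rate by simple arithmetic, which can be compared against published CPI figures:

```python
# If a dollar loses half its purchasing power every 3 years, prices
# double every 3 years. The implied annual inflation rate follows
# directly from the three-year doubling.

implied = 2 ** (1 / 3) - 1
print(f"implied annual inflation: ~{implied:.1%}")
```

That works out to roughly 26% per year, an order of magnitude above the low-single-digit CPI numbers the reply points to.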
Re: (Score:2)
If this were true, why did my Hyundai Elantra 2001 cost about 100 USD less than my Hyundai Elantra 2005?
Of course (Score:3, Insightful)
Moore's Law Expanded (Score:5, Insightful)
"The number of transistors on an integrated circuit for minimum component cost doubles every 24 months"
Whether you keep the original 2 years or drop to 18 months, we're specifically referencing low-cost components, which map directly to the hardware they're trying to put in a $100 laptop.
So in short, no, a cheap laptop just helps to confirm Moore's Law, not derail it.
Re: (Score:2)
"Technology improves exponentially."
In other words, any given component is going to double in speed/capacity/coolness, and/or halve in size/price.
So, yeah, the $100 laptop is a confirmation, not an exception.
Re: (Score:2)
NO! See desktop precedent (Score:3, Insightful)
That did not pose a roadblock for Moore's Law re: desktops, so why would it be any different for something comparable a quarter-century later?
All the price does is establish a bare useful^D^D^Dable minimum; Moore's Law just means that 25 years from now you'll be able to do on a $100 laptop then what you really want to do on it today - which still won't be useful then.
no (Score:2)
Furthermore, there's always the gamers...
I somehow doubt it (Score:3, Interesting)
I honestly don't think Moore's Law will go away (Score:2)
So who buys that new hardware? Well, no one specifically buys individual processors or memory chips or graphics cards or what
No, they aren't (Score:2)
Cheap laptops are leveraging advancement in computer technology in reverse. Think of it this way: A fast, high-end computer costs about $2000. A fast, high-end computer five years ago also cost about $2000.
So figure the new computer is 10x faster than the old one (I pulled that out of my ass). The idea is that something equivalent to the five year old machine can be built, today, for 10% of the cost of a new one using modern tech.
I wonder how much of a study... (Score:3, Interesting)
I would bet that, outside of the enterprise/gaming groups, tech 'upgrades' only happen because generally speaking with computers, only the latest and greatest are available.
I can't tell you the number of people I know who have purchased entirely new computers because they've become glutted with spyware, viruses, or have experienced a relatively simple hardware failure like an HDD spin-out or a dead RAM stick. Instead of dropping money on a replacement part and possibly installation services, they just buy a new computer.
And that comes with good reason too. Look at places like Dell. A $499 desktop isn't too bad at all. And I can promise that system will do everything that 85% of computer users will use it for. Most people don't play hardcore games. Most people don't use applications more processor intensive than productivity suites. Heck, for most people, the computer will be used only for email, Web, watching streaming video and maybe ripping their own CDs to put them on the iDevice of choice.
But that's the rub. At Best Buy or Dell or any of the retailers, even on their cheapest PCs, you're getting a pretty damn fast machine. You can't get an older/slower/cheaper desktop unless you're willing to buy old parts on Ebay and piece something together yourself.
For the big retailers, they can't even afford to keep the old hardware in stock, as storing it costs more than the retail value of the computer.
It really doesn't cost that much more to get a better computer with the current pricing structure. I wonder what would happen if, all of a sudden, people could get a $150 laptop capable of Web, word processing, basic networking and email?
Remember how wildly successful Wal Mart was with the $35 DVD player a bunch of years back? It worked because it was so cheap that people either didn't demand top quality, or realized that they didn't need the $1,000 Sony 5-disc DVD changer with DTS surround and optical outputs.
Consumers (Score:4, Insightful)
Re: (Score:2)
Maybe not the OLPC, but the Nokia E61 is a $100 computer ideal for the average /. reader. I should know, I have
one, I am pretty average, and I am reading /.
Where have you gone Subrahmanyan Chandrasekhar? (Score:2)
Actually, we have some evidence of what might happen from the PDA market. Equivalents to the first- or second-generation Palms should exist at well under the $50 mark, but they don't. Instead, PDAs have become more complex in an effort to keep most of them up at the $200 mark, with the Palm Zire holding out as an overpriced bargain at $99.
This is what I think is behind convergence. Convergence isn't really all that wonderful, but the marg
Suggested amendment to Moore's Law (Score:2)
"Except where the added performance will have no impact on the usage"
For situations where the end user REALLY wants or needs the speed - PC gamers, dedicated game consoles, science, engineering, and other applications where the increase in speed will have an impact - I think Moore will continue to apply for the processors used in those systems.
However, in situations where the added speed will have
On the Misuse of Terms (Score:2, Insightful)
The former clause above may be true, but that is still up for debate. As stated, there still exists a very thriving market in the enterprise, media production, and gam
perhaps (Score:2)
Sure, there are still guys pounding away on an old C=64 because it's what works for them. Notice that there aren't too many of them tho.
Sooo lame. (Score:2)
Smear campaign?
Don't run Windows & don't flame me (Score:2)
And, in the ensuing 10 years, that old clunky desktop could EASILY be miniaturized down to a laptop size or smaller. Doesn't anyone remember that the Timex Sinclair digital watch had more compute power than early System 360 mainframes, the Apollo onboard computers, etc.?
Maybe I'm an extreme case but until a few years ago I had an old IBM PC750 desktop I had upgraded all the way up to a wh
I can remember (Score:2)
Do I need a nice fancy car when a little econo box will go
No. Here's why: (Score:2)
Just as Moore's law just recently made JavaScript-driven, browser-based productivity software a feasible alternative. What would you have said if someone told you that 5 years ago? You'd've called him a nutcase and so
This story sounds like it was written to annoy (Score:2)
On a separate point, to say that a computer is "ridiculously fast" is incredibly small minded. Today
WTF? RTFA! (Score:2)
Do you even understand it? (Score:2)
The fact that this makes faster chips is a byproduct of this 'law'.
This means that you can have smaller and more economical chips.
When you also take into account that previous-generation chips are still very powerful, and cheaper to make, it is a boon to cheap laptops - which, by the way, is why we can have cheap laptops in the first place.
jesus. (Score:2)
Remember the "RAID CD-ROM" question?
Actually it helps Moore's Law (Score:2)
Misunderstanding Moore's Law? (Score:3, Informative)
It doesn't matter whether you get twice the performance for the same price, or the same performance for half the price (and half or less the power usage), you're still following Moore's Law.
The really interesting thing is that Moore's Law applies to everything we make. The doubling time depends on the technology, but the best performance-per-unit-price for every technological product from oxcarts and clay tablets to rockets and ebooks can be shown to follow an exponential curve back as far as we have hard enough figures to plot meaningful points.
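One way to test whether a technology actually follows such an exponential is to fit a doubling time on a log scale. A minimal sketch of the method; the data points here are fabricated purely to show the technique:

```python
import math

# Estimate a doubling time from (year, performance-per-dollar) samples
# by least-squares fitting a line to log2 of the values. The slope is
# doublings per year, so its reciprocal is years per doubling.

def doubling_time(samples):
    xs = [year for year, _ in samples]
    ys = [math.log2(value) for _, value in samples]
    n = len(samples)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return 1 / slope  # years per doubling

data = [(2000, 1.0), (2002, 2.1), (2004, 3.9), (2006, 8.2)]  # made up
print(f"estimated doubling time: ~{doubling_time(data):.2f} years")
```

With real price-performance series for oxcarts or ebooks (if one could assemble them), the same fit would give each technology its own characteristic doubling time.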
theoretical vs practical (Score:3, Informative)
Not that they automatically are incompatible, but Moore's law seemed to pace "research" a lot better than market, ever since I first heard of it...
The low-cost laptop units are among the first units I've seen to approach what customers really want, as opposed to what manufacturers want... Meaning the OLPC won't be "necessarily" obsolescent in a year... And even if it was, people would (wisely, I might add) refuse to pay another $100 next year...
Which isn't to say bundling a low-cost laptop with, say, internet service (as I've heard bandied about) might not work...
Clueless Wankers and CNet.UK (Score:3, Interesting)
Maybe these same consumers will also realize that Moore's law also means that in 18 months you will be able to do the same computational work at roughly half the power cost (modulo leakage current, of course), a fact that appears to escape the razor wits at CNet.UK!
Moore's law is the only reason that we now have $5.00 calculators running off of solar cells generating a few milliwatts from ambient light, or $10.00 quartz wrist watches that run for years off a single button cell. If anything, the $100 laptop will accelerate Moore's law by increasing the volume of products produced and the resultant economies of scale.
The folks at CNet.UK are a bunch of clueless wankers.
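The power-cost claim in the comment above is the same exponential run in reverse. A quick sketch: the 18-month halving is the comment's figure, while the 20 W starting point and 1 W target are arbitrary examples:

```python
import math

# Years until a fixed workload's power budget shrinks from `p0` watts
# to `target` watts, if the energy cost of the same computation halves
# every 1.5 years (the "18 months" figure; leakage current ignored).

def years_to_reach(p0: float, target: float, halving_years: float = 1.5) -> float:
    """Years of halvings needed to get from p0 down to target."""
    return halving_years * math.log2(p0 / target)

print(f"20 W -> 1 W in ~{years_to_reach(20, 1):.1f} years")
```

That is how a desktop-class workload ends up inside a battery-powered gadget within a handful of process generations.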
No. (Score:4, Insightful)
Re: (Score:2)
Will the availability of ultra cheap computers stifle Moore's Law?
Answer is... NO!
Re:No... (Score:5, Insightful)
The 286 processor was called a "supercomputer on the desktop", with far more power than the average user would ever need.
It's not just the Alienware crowd; once your average user gets a taste of what can be done with more power, they will jump on the bandwagon too.
As somebody here mentioned in another post, video encoding and editing require quite a bit of power, and this may become more mainstream with cheaper and cheaper camcorders. The personal computer is constantly expanding beyond the glorified word processor, and there will always be new applications that come along that require more power; it is kind of short-sighted to believe that future apps will be nothing more than improved versions of only what exists today.
Re:No... (Score:5, Insightful)
Very few people want to actually *DO* anything anymore, other than be entertained.
Re: (Score:2)
Because attitudes are changing - slowly, but it's picking up speed.
Ask a Prius owner what they love about their car. One of the first things out of their mouths will be "it gets great gas mileage"
People are beginning to want value for their money because they have less (m
Re: (Score:3, Insightful)
It's about the software. (Score:3, Insightful)
As processors have gotten faster, a certain set of developers have migrated to slower and slower languages to create applications; others are guilty of using less care to optimize for speed for the same reason. Operating systems too; Vista is a good example of an OS that is, frankly, a real pig.
As machines get faster, they can do things like run an application in an interpreted environment and still not seem too sluggish. The press has (correctly) pointed out that the current trend towards multiple core
Re: (Score:3, Interesting)
This argument is basically the "my r
Re:It's about the software. (Score:5, Interesting)
I remember a long long time ago, I was surfing on a puny little 96MB, 200MHz Pentium. The World Wide Web may have changed a bit since then, but it's still just a bunch of text with a few pictures mixed in. A quad-core 3.2GHz monster doesn't do it 64 times faster today; instead we throw more garbage at it to "make use" of the extra power.
The problem with Moore's law is simple: computers may evolve quickly, but humans sure don't. We're as dumb as we were ten years ago. Life on earth is pretty much the same as it was before, it just costs more money now. We consume more and more, and produce less and less. Why aren't these "thinking machines" doing our work for us ? Productivity is supposed to have increased, but what have we done with the excess ?
If anything, cheap laptops are a roadblock to progress. We're right on track to becoming telecom slaves, just the way they want us.
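The 64x figure in the comment above is just naive cores-times-clock arithmetic, which of course ignores IPC gains, memory bandwidth, and whether the workload parallelizes at all:

```python
# Naive raw-throughput ratio: cores x clock, old machine vs new.
# Ignores per-clock efficiency, memory, and parallelism limits, so it
# is an upper bound on paper only.
old = 1 * 200e6          # one core at 200 MHz
new = 4 * 3.2e9          # four cores at 3.2 GHz
print(f"naive speedup: {new / old:.0f}x")
```

The gap between that paper speedup and the felt experience of browsing is exactly the commenter's point about where the extra cycles go.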
Re: (Score:3, Interesting)
Or maybe because computers are faster, developers don't HAVE to optimize as much to get acceptable performance and can use slower, interpreted languages that are quick to write and debug. It also means the applications can be more capable and include more features. Some call this bloat, some call it progress. History has consistently shown it to be the latter. Development speed has increased as a result, meaning cool shit gets in our hands faster. How is that a bad thing?
No, history has consistently demonstrated over the last 10 years or so that it is bloat. My three-year-old computer is pretty sluggish when I am doing even the most trivial things in XP; Vista is a bit better, but right now, writing this on FreeBSD, I can actually open all but the most processor-intensive applications alongside my browser and things still go smoothly. I could, after stripping things down, run this on a 486SX as well. I wouldn't consider it fast, but I would just not get as mu
Re:What a pointless article. (Score:4, Insightful)
1. Stop calling it 'Moore's Law'.
2. Stop panicking when a good reason for the 'law' to be invalidated shows up.
Sheesh, who really gives a shit anyway. Moore's Law is not driving the processor industry, there are plenty of other incentives for continual product improvement.