The Most Incorrect Assumptions In Computing?
Miss Muis writes "After reading once again that Moore's Law will become obsolete, I amused myself thinking back to all the predictions, absolutes and impossibles in computing that have been surpassed with ease. In the late 80s I remember it being a well-regarded popular 'fact' that 100MHz was the absolute limit for the speed of a CPU. Not too many years later I remember much discussion about hard drives for personal computers being physically unable to go much higher than 1GB. Let's not forget "I think there is a world market for maybe five computers" from the chairman of IBM in 1943, and of course 'Apple is dying...' (for the past 25 years). What are your favorite beliefs-turned-on-their-heads in the history of computing?"
Cringely (Score:2, Interesting)
He's my bellwether.
The truth might be out there (Score:3, Interesting)
I've never seen a citation for the Bill Gates 640K quote, or the market-for-five-computers quote, for example. They sound plausible, but so do lots of supposed "facts".
If you ask Ray Kurzweil he might say (Score:5, Interesting)
About Kurzweil [kurzweilai.net]
A little Googling and: (Score:5, Interesting)
-- Bill Gates, from "OS/2 Programmer's Guide" (foreword by Bill Gates)
9600 baud (Score:4, Interesting)
Computers make life easier? (Score:5, Interesting)
I remember working at a research firm for an internship, and the head of our department said over lunch one day that he actually spent more time dealing with problems he was having with his computer than actually doing any useful work. I've noticed this with myself too, and even though I enjoy figuring out what's going on with my computer, I imagine many people don't. Email and websurfing always suck away my working hours, what with a PC right here on my desk. Not to mention that I get asked to help other people out with their machines every once in a while, which wastes both our time.
Makes me think, though... wasn't it always implied that computers would save people's time? Has that assumption ever proved true? I'm not so sure it has, although maybe that's because we aren't using the things the right way. Perhaps once we have a computer-savvy workforce this might become true... but then again, who knows...
my personal fav (Score:1, Interesting)
Now, almost 7 years later, I'm unemployed, taking more classes, and trying to compete with developers in India making 1/5th of what I was making when employed.
www == widespread wealth wipeout
40MB Hard Drive is Plenty (Score:5, Interesting)
I remember telling my father once, after he had bought a 40MB hard drive, that it should last him forever. Nothing could ever fill up more than that. Of course, this was well before the days of .mp3 and .mpg.
When I was a kid, I remember watching the Jetsons, and when George came home from work he complained that he had just finished a hard day at work pushing buttons. I remarked to my father that no one could ever get a job where all they did was push buttons all day. Now, except for the one knob on the 'scope under my desk, all my interfaces to the outside world ARE buttons.
I guess I'm full of underestimations...
One year from now... (Score:5, Interesting)
I've been hearing this statement continuously for ten years. The last time I broke the MBR on a server without a CD drive, I had no choice but to boot from a floppy.
Apple is dying... (Score:2, Interesting)
there's always the standard... (Score:3, Interesting)
--dw
Re:Bill Gates once said... (Score:1, Interesting)
LISP is dead (Score:1, Interesting)
"Lisp doesn't look any deader than usual to me."
- David Thornley, reply to a question older than most languages
Remember the failures (Score:4, Interesting)
"You won't have to work, machines will do everything for you."
Flying Cars !
Isn't it interesting that only the failed predictions are the ones people remember, no matter whether they were exceeded or undershot?
It's almost as if, to be quoted and remembered, you need to make high-sounding but wrong predictions. The more smug the eventual reader, the more notice they take.
History, here I come.
Great Heinlein-ism (Score:5, Interesting)
I'm still waiting... (Score:2, Interesting)
"In from three to eight years we will have a machine with the general intelligence of an average human being. I mean a machine that will be able to read Shakespeare, grease a car, play office politics, tell a joke, have a fight. At that point the machine will begin to educate itself with fantastic speed. In a few months it will be at genius level and a few months after that its powers will be incalculable." -- Marvin Minsky, LIFE Magazine, November 20, 1970
my favorites (Score:5, Interesting)
"I don't understand why people would need more than 4gb..." (Bill Gates in an interview on 64 bit ccomputing, in which he said he didn't understand peoples' interest in it)
XML will replace relational databases
OOP will lead to more robust, easier-to-maintain, higher-quality software
By making COBOL resemble English, anyone can program.
Re:Video hardware... (Score:2, Interesting)
Machrone's Law (Score:3, Interesting)
"The computer you want to buy will always cost $5000"
Now you could get 10 PCs for that.
Dot-Com bubble (Score:3, Interesting)
Does anyone remember the whole Dot-Com Bubble [fuckedcompany.com]?
Billions in venture capital were sent to Silicon Valley in the late 90s in the hope that anything and everything internet-related could be profitable and was worth investing in, in the same style as brick-and-mortar companies. We heard all kinds of great things from leading economists who were really misleading us so they could manipulate the market, short the stock, and fuck everyone else over. Then, in 2000, after the Microsoft ruling, the whole thing kind of collapsed.
As for today, just a few of the giants of e-commerce still stand... so many companies went out of business on predictions not far off from the idea that we'd have groceries delivered to us over the internet (WebVan [wired.com]) or that we could actually stream TV-quality video over 28.8 kbps (Pixelon [wired.com]). It's never going to happen again, so the golden age of marketing ideas on the internet and obtaining massive capital influxes is over.
Re:Home Computer (Score:5, Interesting)
The second worst would be that we would be living in a paperless society. Uhm, yeah, unfortunately some schmuck invented wysimolwyg PRINTERs too.
Other than that, I see new predictions fail all the time, and even get reinvented. Who else remembers the "Gorilla Arm Syndrome" of the 80s with touch screens? They were predicted to take over, but that didn't happen. And it ain't happening now either, with the tablet computers -- touch screens just aren't ergonomic enough for prolonged use, as most people can't keep their hands in the air for any length of time.
Same with gyroscopic mice -- they're going the way of the dodo, despite happy predictions.
Regards,
--
*Art
Good Times (Score:5, Interesting)
Luckily Microsoft proved that assumption was false.
Re:Ken Olson of DEC (Score:2, Interesting)
"Unix is snake oil"
Erroneous Beliefs (Score:2, Interesting)
I remember a few silly beliefs some folks had when we were young. The most remarkable thing is that some of them were "verified" by "scientific experiments" by various people.
1. That burning a copy of a CD resulted in a slightly degraded image of the data. A classmate thought he had verified this by copying copies until they failed. He came up with the figure that it took seven iterated burns (on average) for the degradation to make the copy unreadable(!!!). I guess some people don't understand causation and/or the law of averages and/or hardware reliability. This was in the days of turning off the music and not touching the desk while the CD-R was burning.
2. That data can just be compressed again and again (.zip the .zip, and so on) until it takes up almost no space. (A sketch after this list shows why both of these fail.)
3. That compressed data was "more prone to read failures" than uncompressed data, by virtue of "the data being closer together on the disk". Although this might sound more ridiculous than #2, it really isn't. I fell for this when I was very young, as it seemed to be empirically verified. Heh.
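A minimal Python sketch (standard library only, arbitrary sample data) of why #1 and #2 can't be right in the digital domain:
```python
import hashlib
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 1000

# Belief #1: a digital copy is bit-for-bit identical, not a degraded image.
copy = bytes(data)
assert hashlib.sha256(copy).digest() == hashlib.sha256(data).digest()

# Belief #2: compression squeezes redundancy out once; a second pass finds
# nothing left to remove and only adds its own header overhead.
once = zlib.compress(data)
twice = zlib.compress(once)
print(len(data), len(once), len(twice))
# Tens of thousands of bytes shrink to a few hundred on the first pass;
# the second pass typically comes back slightly *larger*, never much smaller.
```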
It is kind of fun to reflect on how all of these fallacies come from extending what is intuitive about the real world into the world of information and digital representation. We'll see how many of our current silly beliefs (P!=NP?, "{absolute security|quantum computation|...} is (im)possible", &c.) have elegant refutations that we will hopefully discover in my lifetime. Remember, no one understands the world of quanta and bits yet, and the opposite of a profound truth may be another profound truth.
I would rather point to Alan Turing (Score:5, Interesting)
Japanese 5th Generation Software would dominate (Score:5, Interesting)
Since at the time they had just finished doing exactly that to the consumer electronics industry, and were well on the way to doing it to the automotive industry, most CS types were justifiably concerned.
Well, the rest of the story is that it didn't happen. Not even a whimper of it made it over to the Western world.
Re:There have been some real humdingers... (Score:5, Interesting)
"There is no reason anyone would want a computer in their home." - Ken Olsen, Founder, Digital Equipment Corporation
I don't think Watson's quote really fits into these sorts of discussions, because what a 'computer' was when he said it was entirely different.
Olsen's quote, however, is a simple lack of vision, since he was addressing fairly modern-era PCs directly.
Cray (Score:2, Interesting)
Someday everyone will have a Cray on their desk...
This of course has come to fruition, but the corollary:
fortunately is not true!
Correction: (Score:4, Interesting)
I just know that was driving you all nuts.
Dvorak on the mouse (Score:5, Interesting)
Re:Bill Gates once said... (Score:5, Interesting)
Ok, Gates claims he never said it. Great. I'd leave it at that, but I went to a talk he gave at the University of Waterloo in 1989, and he did meekly accept responsibility for that quote. We all politely chuckled, and the talk went on.
I could easily be mistaken, as that was quite a while ago, but I distinctly remember it as a mea culpa.
Re:The death of x (Score:3, Interesting)
I'll say this though: 8-tracks, betamax and vinyl records appear to be quite dead (said in my best Munchkin voice).
Re:There have been some real humdingers... (Score:5, Interesting)
IBM continues to be one of the leading (if not the leading) computer companies; as a business it has been around for more than a century and has always been profitable. They clearly have recovered from a momentary lapse in judgement, which, in historical context, can be forgiven.
DEC, on the other hand... Well, Olsen was a dumbass, plain and simple. He is also quoted as saying "Unix is snake oil". What is amazing is not that DEC got swallowed up by Compaq, a company whose core business is putting computers in people's homes, but that they managed to survive as long as they did with morons like Olsen at the helm.
People will never copy ... (Score:3, Interesting)
People will never copy full DVDs over the Internet; they are way too big and would take days over conventional broadband (read: 128K ISDN).
-- that is for bandwidth.
People will never be able to copy CDs; they are unreadable on computers except through audio D-A conversion.
People will never be able to copy DVDs, they are encrypted with CSS.
-- that is for format.
People will never be able to copy GameCube games, they are on their own proprietary format discs.
People will never be able to copy PSX/2 games, they have heavy protection.
People will never be able to crack the XBox protection.
-- that is for the consoles
And my #1:
This format is the next revolution! Jump on the bandwagon now!
Mike
Re:Bill Gates once said... (Score:5, Interesting)
I've used a computer that had 900K of memory and ran MS-DOS just fine. All of it was conventional memory. No tricks.
The 640K limit comes from the following architectural limitations:
(1) Intel 8086 physical addresses are 20 bits long.
(2) IBM partitioned the 1 megabyte address space into 640K of memory space, 384K of device space.
Other manufacturers made MS-DOS computers that were not PC register compatible. Some of them did allocate more of the 1024K address space to memory. MS-DOS works just fine up to the physical addressing limit of the 8086.
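Here's the arithmetic, as a quick Python sketch (the segment values below are just the standard documented boundaries, not anything from the parent post):
```python
# Real-mode 8086 addressing: physical = segment * 16 + offset, kept to 20 bits.
def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # the 8086 has 20 address lines

# Reset vector: execution starts at CS:IP = FFFF:0000.
print(hex(physical_address(0xFFFF, 0x0000)))  # 0xffff0, near the top of ROM

# IBM's memory/device split: RAM below segment A000, devices above it.
print(hex(physical_address(0xA000, 0x0000)))  # 0xa0000 = 640K

# FFFF:FFFF would be 0x10FFEF, past 1MB; the 8086 wraps it back around.
print(hex(physical_address(0xFFFF, 0xFFFF)))  # 0xffef
```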
Back around 1981, I read a Byte article about the new IBM PC which said that it had a gigantic memory space. And they were right! Filling up that 640K would cost about $5000 at memory prices back then. I think it's reasonable for a personal computer to have enough address space to handle $5000 worth of memory (especially since $5000 in 1981 dollars is worth quite a bit more than $5000 in 2003 dollars).
Are you using a 64-bit desktop yet? Because if you're not, your 2003 desktop computer can't handle $5000 of memory!
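For what it's worth, the arithmetic behind that, with a 2003 memory price that is my own ballpark assumption rather than a quoted figure:
```python
# The 2003 price below is a rough assumption for illustration, not a sourced figure.
cost_1981_per_kb = 5000 / 640         # if 640K cost ~$5000: about $7.80/KB
cost_2003_per_mb = 0.15               # assume roughly $0.15/MB for commodity DDR

gb_for_5000 = 5000 / cost_2003_per_mb / 1024
print(round(gb_for_5000, 1))          # ~32.6 GB -- well past a 32-bit 4GB space
```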
Macintoshes are more stable (Score:3, Interesting)
Another example would be the quote "Apples -- they just work". How many minor updates have hosed people's systems since OS X came out? What about Keynote 1.0 causing kernel panics, and all the former issues with iTunes, Safari, and all the other apps out there?
So yes, I do use a Mac, and I do like it -- but it's time everyone knew the truth: Macs have many problems too, especially if you use new software... the bugs get worked out eventually, but usually at the user's expense.
Re:Paperless office, bah! (Score:3, Interesting)
The latest paperless project is electronic faxing through Zetafax.
Re:Bill Gates once said... (Score:1, Interesting)
Re:Paperless office, bah! (Score:2, Interesting)
Leisure society (Score:5, Interesting)
- my grade 10 high school teacher, 19 years ago
ESR 'Absurdly Rich' (Score:2, Interesting)
I looked on ESR's vanity page, and NO he doesn't have this clinker listed as one of his essays.
Really bad predictions (Score:2, Interesting)
2. OO represents "real world"
[When did the real world start using 'CommandContainerFacade.getEventProducerFactoryCr
3. There is a magic product out there that solves all problems.
[Yeah sure, maybe in a million years!]
4. Methodology X is a panacea. [see Usenet]
Also see Anti-patterns catalog [c2.com] for other examples.
Pascal is the future... (Score:2, Interesting)
"Pascal is the language of choice for all future software development at Apple. If you want to write software for Apple computers, all of our development tools will support the Pascal language only. We both need one standard language to develop in and support, and we have chosen Pascal as the most popular and best language for development." (Or words to that effect)
This was said by one of the technical suits at Apple at the time, whose name escapes me. The 'conference' was actually a 2-3 hour presentation on a Saturday afternoon. It was sparsely attended (maybe 200 people total), which only filled the auditorium to about 20% capacity. A personal highlight for me was running into Steve Jobs in the hallway and having a chance to shake his hand and chat with him briefly, which was no small feat considering he already had a squadron of bodyguards.
Obviously, the 'Pascal' proclamation was dropped within months. But it was encouraging to hear them acknowledge and attempt to support the needs of third-party developers.
Re:If you ask Ray Kurzweil he might say (Score:2, Interesting)
5 years from now... (Score:4, Interesting)
everyone will be running free GNU on their 200 MIPS, 64M SPARCstation-5."
Andy Tanenbaum, Creator of Minix
30 Jan 92 13:44:34 GMT
Andy wrote this during the "Linux is Obsolete" debate between Linus Torvalds and himself back in '92.
Desktop Computers are over! (Score:2, Interesting)
World Market for 5 computers... (Score:5, Interesting)
What he *meant* was "There's a market for 5 really high-end machines far above and beyond the rest of the competition". The word "supercomputer" wouldn't be around for a few decades yet. And what do you know? Even today, there's a small handful of machines at the truly high end (currently, above 5 teraflops or so).
Re:Didn't he also say: (Score:1, Interesting)
I forget which magazine said it, but around those days Bill gave a speech at some trade show. A reporter asked him if Microsoft planned to create an Internet division. He said that was as silly as a computer company creating an "electricity division", and basically laughed off the idea. A few months later, what do you know, MS announced their new Internet division...
Re:One year from now... (Score:5, Interesting)
She loves her USB drive. In the past, when she wanted to bring work home (which is very often) she would either put it on a zip disk (which is too damn slow and not reliable) or burn a CD, which was reliable but took too long. A floppy was out because the file was too big (an MS Access database). Now she just drags the file to the "removable drive" icon and she's done. It's USB 2.0 (the fast one -- er, is that "full speed" or "high speed"?), so it copies damn fast.
Oh, the system can be booted from USB or CD, so crash recovery is still possible.
Is that the fault of computers, or of programmers? (Score:4, Interesting)
I think a computer of today would have more than sufficient processing power and storage space (particularly if it can do live Internet searches as an "extended memory") to imitate a human -- there's just no program capable of it.
Think about how you eat an apple. No, I don't mean the chewing process; you can express that. Express how your body knows how to decompose the apple into various nutrients, absorb them into the body, deliver them to where they're needed, the chemical processes used to transform them into energy for our bodies, and how the byproducts are returned to the waste system, probably filtered by the kidneys and whatnot. Maybe you can, if you're a doctor of medicine, but otherwise probably not. And people live and eat apples just fine without knowing.
On the other hand, if you wanted to design an artificial digestive system, you'd need to know all that. In short, you'd have to know a damn lot. In the same way, humanity is pretty much stuck when it comes to describing how a human mind works. It doesn't help you at all that you see the brain in function every day, any more than it helps that you see a man chew and swallow an apple. There's simply no way to build artificial intelligence until we understand human intelligence. And when it comes to that, we're still way off.
Kjella
Re:Bill Gates once said... (Score:4, Interesting)
A great (bad) assumption (Score:3, Interesting)
Another bad assumption, which my coworker just mentioned, was "the knapsack crypto algorithm is secure." The knapsack algorithm was a public/private-key cryptosystem that was very elegant in its design and speed, but it was eventually broken (on an Apple ][, even).
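For the curious, here's a toy Python sketch of Merkle-Hellman, which I assume is the knapsack system meant; the key values are the standard textbook example, and the scheme is exactly as broken as the parent says (Shamir published the attack in 1982):
```python
# Toy Merkle-Hellman knapsack cryptosystem. Do NOT use for anything real.
w = [2, 7, 11, 21, 42, 89, 180, 354]  # private key: superincreasing sequence
q = 881                               # private: modulus, q > sum(w)
r = 588                               # private: multiplier, gcd(r, q) == 1
public = [(wi * r) % q for wi in w]   # public key: the "hard" knapsack

def encrypt(bits):
    # Ciphertext is just the subset sum of public-key elements where bit == 1.
    return sum(b for bit, b in zip(bits, public) if bit)

def decrypt(c):
    # Strip the modular disguise, then solve the easy superincreasing knapsack.
    c = (c * pow(r, -1, q)) % q       # pow(r, -1, q) needs Python 3.8+
    bits = [0] * len(w)
    for i in reversed(range(len(w))): # greedy, largest element first
        if w[i] <= c:
            bits[i] = 1
            c -= w[i]
    return bits

msg = [0, 1, 1, 0, 0, 0, 0, 1]        # the bits of ASCII 'a'
assert decrypt(encrypt(msg)) == msg
```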
The "640K" Quote (Score:3, Interesting)
The source of the quote was Steve Jobs, questioning Steve Wozniak's suggestion to build the "Language Card", the 16K memory card that took the Apple II/II Plus from 48K to 64K.
Jobs' actual words were, "Why would anyone ever need more than 48K?" Not 64K, as assumed by the first misquoters based on the maximum direct addressability of 8-bit processors, and not 640K as assumed by those who decided to misattribute the quote altogether.
Jobs was always questioning Woz's technically oriented decisions, and frequently making the opposite decision when he had the power to do so. For example, he argued that there was no reason to build color into the Apple II. Woz did it anyway. When Jobs got the chance to make similar decisions, he went against Woz's reasoning, and even against the advice of others under him. Hence, the original Macs, and several versions after, were strictly monochrome.
I'd like to think Jobs learned his lesson after ignoring someone's advice not to hire "some soda pop selling suit" and losing control of his company for 10 years. But I could be wrong.
Anyway, that's what I recall from my old "SoftTalk" and "The Road Apple" days.
You're not kidding (Score:5, Interesting)
I can't believe it (Score:3, Interesting)
When I was graduating high school it seemed the conventional wisdom was "In the future, everything will run on java anyway"
This was just about the time I was getting into computers heavily, and I remember you couldn't buy a computer mag without having JAVA somewhere on the cover.
Re:Computer games... (Score:3, Interesting)
Re:I Invented... (Score:2, Interesting)
Basically, it's a stupid argument. Anyone who has bothered to actually look up what he really said will realize, immediately, that what he said made sense in context, was true, and that all the hoopla in the media was propagandist rubbish. Not that there's anything wrong with that -- that's politics. What is irritating is all the morons who don't know what they're talking about and yet still insist on sharing their uninformed opinion with the rest of us as if it were worth anything.
here's three more (Score:1, Interesting)
- Ken Olson, president, chairman and founder of Digital Equipment Corp., 1977.
"DOS addresses only 1 Megabyte of RAM because we cannot imagine any applications needing more."
- Microsoft on the development of DOS, 1980.
"Windows NT addresses 2 Gigabytes of RAM which is more than any application will ever need".
- Microsoft on the development of Windows NT, 1992.
Re:Bill Gates once said... (Score:2, Interesting)
Re:Proof that Moore's Law will come to an end (Score:5, Interesting)
What isn't yet clear is just what error correction itself means. Could a designer get a bit more scaling, but only by making the chip unable to run any existing programs? Could we turn quantum effects to our advantage with what is called quantum computing? Will Intel or IBM want to make a computer that needs a completely different approach to writing every last bit of software it can run?
The answers to the first two questions are unknown. The third, however, is an obvious NO! Moore's law will stop, either because we can't make the switches any smaller, or because we stop using transistors.
Re:I Invented... (Score:5, Interesting)
The question was 'What have you accomplished in congress?' or something similar. So now let's look at his response in that *CONTEXT*.
Did Al Gore take the initiative IN CONGRESS in creating the internet? You bet he did! In fact, Newt Gingrich said that if there had been no Al Gore, there would be no internet as we know it today. (Of course, that was a few years ago. But still.) He was the prime mover behind getting funding for it. And without government funding, the internet would never have grown like it did, and may well still be some strange, esoteric thing that connects a few universities together... and AOL (or *shudder* MSN) could be the 'Information Superhighway'.
So, you can still say that since he didn't explicitly SAY 'in Congress' in response to the question about what he did in congress, he was actually claiming to have invented the entire internet from scratch. But at that point, anyone with an ounce of intellectual honesty would have to admit that this was a 'lie' that was created entirely by the press and was perpetrated on an American public that is instantly ready to believe anything they hear, as long as it's bad.
-fred
"Push Technology" (Score:2, Interesting)
For me this was another example of consumers ruling the marketplace with an iron fist. You can't get us to drive Edsels, drink New Coke, or subscribe to Active Channels, no matter how much money you have.
Definition of download (Score:4, Interesting)
Continuing haystor's beer analogy [slashdot.org], the remote machine is called a server.
Your machine requests something from a stationary location. That is a pull operation, and is called "downloading" (such as requesting a drink and being given a beer).
Your machine sends something of yours to the stationary location. That is a push operation, and is called "uploading" (such as giving money to the bartender).
The remote machine responds to each request. It is "serving" (such as the bartender taking requests and returning drinks, also known as serving).
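In HTTP terms, a minimal Python sketch of the same three roles (httpbin.org is just a convenient stand-in for the bartender):
```python
from urllib.request import Request, urlopen

# Pull: my machine asks the server for something -- downloading.
beer = urlopen("https://httpbin.org/get").read()

# Push: my machine sends the server something of mine -- uploading.
order = Request("https://httpbin.org/post", data=b"one beer, here's my $5",
                method="POST")
receipt = urlopen(order).read()   # the server responds either way: serving
```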
---
Another poster suggested that the definition has to do with the size of the machines, but this is obviously incorrect. If a 300-lb man gets a beer from a midget bartender, the man is still doing the requesting and the bartender is still doing the serving.
Or think about P2P networks. The machines can be considered to be equivalent, but a computer with a 2GB hard drive and only 10 files still serves those files to the computer with a 200GB hard drive and millions of files. The latter computer is doing the requesting and "downloading".
The confusion may be because your ISP is limiting your upstream or "upload" bandwidth, which is used for the transaction whether you are serving (also known as sharing) or uploading (also known as posting) the files, even though that bandwidth is also used for requesting. English is great; the last sentence had five words for the process where bits move from your computer to another.
Re:Proof that Moore's Law will come to an end (Score:2, Interesting)
As for going 3-D, even sticking to your strict definition of Moore's law, we'd still be fitting more transistors in the same chip area, so it'd count.
3 years is a lot of time for stuff to happen; I suspect we'll get at least one more "free" doubling just from a leap in transistor design.
More generally, though, if some other technology comes along and we start using carbon transistors, or optical switches, or some more esoteric technology that allows us to do twice as many calculations in the same amount of time on a certain-sized sliver of what-have-you, that's still going to be called Moore's law if it follows the pattern continuously, regardless of how hard you hold onto the stricter definition. If we shift to a new technology and computing continues to double, no one will be claiming that Moore's law is dead.
If, not when. It could be that silicon transistors are the best we're going to do, and Moore's law (in either the strict or the lax form) only has another few decades in it. It certainly has to peter out sometime.
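The doubling arithmetic itself is trivial -- here's the back-of-the-envelope version (18 months is the popular doubling period, not Moore's original figure):
```python
def projected(count: float, months: float, doubling_months: float = 18.0) -> float:
    # Transistor count under a fixed doubling period: count * 2^(months/period).
    return count * 2 ** (months / doubling_months)

print(projected(1.0, 36))  # 4.0 -- three years buys two doublings
```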
From Popular Mechanics (Score:3, Interesting)
Re:the list (Score:3, Interesting)
[breathes]
I'd like to see a list of some of the competitors before I proclaim it THE funniest post. Perhaps it would be a good topic for a poll.
Microsoft collection (Score:3, Interesting)
My favorite ones:
Re:Bill Gates once said... (Score:3, Interesting)
From HelpPC 2.10, by David Jurgens (emphasis mine):
- power supply starts Clock Generator (8284) with Power Good signal on BUS
- CPU reset line is pulsed resetting CPU
- DS, ES, and SS are cleared to zero
- CS:IP are set to FFFF:0000 (address of ROM POST code)
- jump to CS:IP (execute POST, Power On Self test)
- interrupts are disabled
- CPU flags are set, read/write/read test of CPU registers
- checksum test of ROM BIOS
- Initialize DMA (verify/init 8237 timer, begin DMA RAM refresh)
- save reset flag then read/write test the first 32K of memory
- Initialize the Programmable Interrupt Controller (8259) and set 8 major BIOS interrupt vectors (interrupts 10h-17h)
- determine and set configuration information
- initialize/test CRT controller & test video memory (unless 1234h found in reset word)
- test 8259 Programmable Interrupt Controller
- test Programmable Interrupt Timer (8253)
- reset/enable keyboard, verify scan code (AAh), clear keyboard, check for stuck keys, setup interrupt vector lookup table
- hardware interrupt vectors are set
- test for expansion box, test additional RAM
- read/write memory above 32K (unless 1234h found in reset word)
- addresses C800:0 through F400:0 are scanned in 2Kb blocks in search of valid ROM. If found, a far call to byte 3 of the ROM is executed.
- test ROM cassette BASIC (checksum test)
- test for installed diskette drives & FDC recalibration & seek
- test printer and RS-232 ports. store printer port addresses at 400h and RS-232 port addresses at 408h. store printer time-out values at 478h and Serial time-out values at 47Ch.
- NMI interrupts are enabled
- perform INT 19 (bootstrap loader), pass control to boot record or cassette BASIC if no bootable disk found
Hard Disks have stagnated (Score:2, Interesting)
300 GB is still tops, same as last xmas. And there's been only minuscule growth in laptop hard disks: 60 GB 12 months ago, 80 GB now.
I don't recall stagnation like this happening *ever* before.