The Most Incorrect Assumptions In Computing?
Miss Muis writes "After reading once again that Moore's Law will become obsolete, I amused myself thinking back to all the predictions, absolutes and impossibles in computing that have been surpassed with ease. In the late 80s I remember it being a well-regarded popular 'fact' that 100MHz was the absolute limit for the speed of a CPU. Not too many years later I remember much discussion about hard drives for personal computers being physically unable to go much higher than 1GB. Let's not forget "I think there is a world market for maybe five computers" from the chairman of IBM in 1943, and of course 'Apple is dying...' (for the past 25 years). What are your favorite beliefs-turned-on-their-heads in the history of computing?"
My Personal Favorite... (Score:5, Insightful)
Totally untrue. *BSD rules.
This site... (Score:1, Insightful)
hehe
Home Computer (Score:5, Insightful)
Al Gore (Score:3, Insightful)
"GORE: Well, I will be offering -- I'll be offering my vision when my campaign begins. And it will be comprehensive and sweeping. And I hope that it will be compelling enough to draw people toward it. I feel that it will be.
But it will emerge from my dialogue with the American people. I've traveled to every part of this country during the last six years. During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system.
Shamelessly pulled from here [sethf.com]
Re:Bill Gates once said... (Score:5, Insightful)
Here's a Wired.com article [wired.com] with some more details
Re:My favorite lie (Score:4, Insightful)
"Linux is good enough right now for the desktop."
is being laughed at right now.
My favorite... (Score:5, Insightful)
THE INTERNET IS GROWING TOO FAST, AND WILL COLLAPSE UPON ITSELF PRESENTLY.
I think he just wants everyone to know that he invented Ethernet, and needs to throw this story out there every couple years so people don't forget he actually did accomplish something at some point in time. Like 20 years ago.
- A.P.
Re:Bill Gates once said... (Score:5, Insightful)
Back to the original topic, I'd point to the idea that sticking children in front of computers somehow magically benefits them.
Paperless office, bah! (Score:5, Insightful)
That computers would bring about the "paperless office".
Not only did they not, they made people consume more paper than ever before. On top of all the paper spent, the cost of printing pages increased, as the industry made us believe that ink jets were better and B&W laser printers passé.
For more discussion see an article in Newsday about it [newsday.com]. There's even a full book dedicated to the question of why the paperless office never came to be [amazon.com].
Re:Al Gore (Score:2, Insightful)
computers in the classroom (Score:5, Insightful)
I've been into computers for quite some time, and am enrolled in Computer Science at university. It's been obvious to me for years that computers in the classroom are a waste of time, energy, and resources for everyone involved.
I try to tell people this, and they wonder why I say that, given my experience with computers. No doubt it's because the people making the decisions have no clue.
Most adults on
Of course, we did have computers at school. Good ol' ICONs, and IBM 8086s. We had typing class a couple times a week, and learned to use a word processor, which is about as far as it needs to go. Leave computers for their own courses in high school (Computer Science and maybe some kind of class for basics.)
Is it not obvious that more harm is being done than good when it comes to computers in class? There are just so many things wrong with the whole idea. Perhaps one day, when computers become more appliance-like, they'll be more beneficial in class, and will be put to use in such a fashion as to not create dependencies.
What do you think?
Re:Al Gore (Score:5, Insightful)
Links:
http://www.firstmonday.dk/issues/issue5
http://www.firstmonday.dk/issues/issue5_10/wig
SAP (Score:5, Insightful)
"In the year 2000..." (Score:4, Insightful)
(proof that fear is the best marketing tool)
Power versus utility (Score:5, Insightful)
True, right now, the yearly, 'we'll-be-helpless-without-faster-computers!' cycle appears to have stopped or slowed down. Big IT buyers seem to have realized that you don't need a machine that could run a weather model to replace a typewriter and that's a real good thing.
But what about software? I could be wrong. I don't do that much with my computer except surfing and writing, but much of what I see makes me wonder where all the really miraculous power of my computer is going.
I've got an operating system that takes up non-trivial space on my harddrive and aside from a constant need to keep up with the virus writers, or dealing with stuff to make Microsoft happy, I'm not seeing the bennies.
You'd think that with all this godawful power, there'd be a little more substance.
The Y2k lie... (Score:2, Insightful)
Things like this were said by MANY computer industry "experts" before Y2k.
There are a lot of people that work very hard to make computers exchange information. It doesn't just happen.
MHz Myth (Score:5, Insightful)
Thankfully, your link debunks it too. (Score:5, Insightful)
Personally, while I may dislike the man, I'm tired of hearing the same tired, stupid jokes repeated over and over again.
- A.P.
Re:one button (Score:2, Insightful)
Apple might have... (Score:3, Insightful)
Re:Bill Gates once said... (Score:2, Insightful)
And no matter how the 'witty' people who post the 'Al Gore invented the internet' posts try to spin it, Al Gore never said anything even close to implying that he invented the internet.
Honestly, I think people who post that are just making a joke now, although it's one of the most worn-out jokes ever and not really very funny, because of the deliberate misunderstanding that caused it to arise.
It was clear from the context that he meant that he took the initiative in supporting the development of the internet. Of course, political opponents of a politician (regardless of what side of the aisle they are on) will always latch onto interpretations that make what the other guy said look more ridiculous than what was really meant, even when the actual meaning was obvious (not always the same as literal).
Then who did say it? (Score:4, Insightful)
If he didn't say it then who did? And how did the quote get attributed to him?
Or who wrote the original article attributing this to Gates.
Currently, AFAICT, there is only Gates' comment that he didn't say anything as moronic as that as "proof" that he never made the quote in the first place.
Hardly a compelling rebuttal.
Steve Jobs on networking (Score:0, Insightful)
Now, of course, the Mac has no peripherals and can only exist by connecting to a network, unless you plug in external devices...
Re:Al Gore (Score:2, Insightful)
Absolutely, and it's false. But myth has it that he claimed to have _invented_ the Internet (implying technical creation, not legislative creation), which is not an accurate characterization of what he said.
Re:Then who did say it? (Score:2, Insightful)
Disagree (Score:5, Insightful)
I think the problem is that computers aren't being used in their strengths: As long as you use computers as fancy notepads and chalkboards, computers are useless in a classroom.
However, if you cater to their strengths and capabilities, I think computers are invaluable:
1) Their ability to network and connect classrooms with other locations, such as other classrooms, servers with data such as photographs, maps, and things you can't store in a classroom.
2) Their ability to virtualize. See things you can't afford to go see, do things you can't afford to go do, teach things you can't afford to otherwise teach! Books, encyclopedias, and videos offer a very static virtual representation, where a computer can be interactive! Not only can you 'see' different animals at various depths of the ocean with a computer (which a video can do just as well), you can *explore* too! Find out what happens at various pressures to your ship, to your body, see how snowflakes form, how ants find food; and then fiddle with a few settings, and see *different* snowflakes, see the ants starve, and see your ship crumple! You can design airplanes, and see if they fly or fall, you can create space stations, and see if your astronauts starve, overheat, or get bored to death!
3) Interactivity. Very tied to virtualization and networking, you can interact with a computer in a way that you cannot with a video or a book. You can change things, simulate things, watch things, and then go back and change more things. You can have a classroom that happens to have access to a freshwater lake do experiments and research, connected to a classroom that happens to have a database, some programming kids, and a good grasp of math, and at the end of each day each classroom can learn things that before networking neither could!
4) Data manipulation and storage. You can store lots of photographs, keep tremendous databases, perform tedious analysis, and create pictures out of raw numbers that a child, or even an adult, cannot. Measure the temperature, humidity, rainfall, pressure, cloud cover/sunlight, and wind at 400 locations 10 times a day across a city, and have the kids create programs to access, correlate, and manipulate that data and see if they can spot trends, correlations, and causations!
So yes, there are reasons to have computers in the classroom. No, right now no one does it properly.
Re:Bill Gates once said... (Score:2, Insightful)
Hmm, according to Snopes [snopes.com], you are both right and wrong.
I say that because Snopes classifies it as "False", but the explanation itself seems to be spin. The quote itself on Snopes was:
Snopes then goes on to say that it is ridiculous to believe that Al Gore believed he created the Internet. That's not in question. No one believes Al Gore created the Internet, and I doubt Al Gore himself believes it, but the fact of the matter is he said: "During my service in the United States Congress, I took the initiative in creating the Internet."
Perhaps, as Snopes concludes, it was simply a clumsy and self-serving phrasing that Gore used, but he did say it. I figure at worst it was self-serving and at best it was just stupid on Gore's part to say whatever it is he meant to say in that manner. But to say that others are spinning what Gore said is inaccurate. Many people jokingly mention it but, in the end, Gore DID say it in the above context--regardless of how much you wish he hadn't.
my list: (Score:3, Insightful)
2) Blogs will amount to nothing.
3) The MHz Myth
Linux isn't ready for the desktop (Score:3, Insightful)
-Tom
Re:computers in the classroom (Score:2, Insightful)
I think you need to put forward at least a reason if you want anybody to listen.
Re:Apple is dying... (Score:4, Insightful)
Re:640K--not true (Score:5, Insightful)
My point is that Bill Gates is denying it. Bill Gates also says that Microsoft is not a monopoly. Bill Gates saying something does not necessarily make it true.
$.02
Re:Machrone's Law (Score:4, Insightful)
Another one: The things you want to download always require leaving your computer downloading overnight. In 1995 I had to leave my 14.4kbps modem running overnight to get MP3s, and now DVD-R images take about the same amount of time.
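A rough back-of-the-envelope check of that observation. The 14.4kbps modem is from the comment above; the ~40MB night of MP3s, the 4.7GB DVD-R size, and the ~1.5Mbps broadband rate are illustrative assumptions, not figures from the poster:

```python
def transfer_hours(size_bytes: float, bits_per_second: float) -> float:
    """Hours needed to move size_bytes over a link of the given raw bit rate."""
    return size_bytes * 8 / bits_per_second / 3600

# ~40 MB of MP3s over a 14.4 kbps modem (the 1995 scenario)
mp3_night = transfer_hours(40 * 1024 * 1024, 14_400)

# a 4.7 GB DVD-R image over an assumed ~1.5 Mbps broadband link
dvd_night = transfer_hours(4.7e9, 1.5e6)

print(f"{mp3_night:.1f} h vs {dvd_night:.1f} h")  # both land in the 6-7 hour range
```

Under those assumptions both downloads come out to roughly one night, which is the point: file sizes and link speeds grew by about the same factor.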
Re:Bill Gates once said... (Score:3, Insightful)
So what?.. (Score:2, Insightful)
Regardless, I nominate Dell for building a 640MB limit into their X200 laptops. They'll take two memory chips, and one can even be a 512MB chip. But the system maxes out at 640MB.
But that's OK. It makes it easier for me to push the less expensive but slightly larger Latitudes for the engineers - who *always* want more memory. Not that I blame them.
Re:Bill Gates once said... (Score:2, Insightful)
Re:DAMN IT. (Score:5, Insightful)
Yes, as a normal, sane person, I understand it: he is 100% correct.
Before Congress pushed for its opening to the world, there was no such thing known as the 'Internet'; there was a closed network of universities and military computers (ever wondered what DARPA means?).
He, as a congressman, was one of the main players in opening that network to the world, so he played a very important role (if not the most important) in the creation of the 'Internet'.
It seems to me that the un-normal, un-sane person in this thread is you.
Re:Try using actual facts next time (Score:4, Insightful)
Apple PAID for the rights to the stuff from Xerox. The facts are covered in numerous places if you'd like to trouble yourself to get a clue about this incident in computer history.
Much of the rest of what you have to say is too self-contradictory to be worth responding to.
Re:Apple might have... (Score:3, Insightful)
Duh. What, do you think every other company exists in a little bubble where they survive solely on the work of internal employees?
Justin Dubs
Re:Bill Gates once said... (Score:0, Insightful)
I can't believe I haven't seen this one yet... (Score:5, Insightful)
The much more interesting question is... (Score:1, Insightful)
to demonstrate these assumptions are wrong?
I've noticed that the examples cited above are mostly assumptions based on numerical limits (100MHz, 1GB, and 5). The question then becomes: what was it that allowed us to work around these "hard" limits?
-cmh
Re:Bill Gates once said... (Score:2, Insightful)
Are you a Republican?
Re:If you ask Ray Kurzweil he might say (Score:5, Insightful)
It's an interesting intellectual exercise, but the idea that we are merely computers is nothing more than the continued novelty of the computer, just as we once thought of ourselves and the world as clockwork. Wishful thinking, or perhaps professional myopia. Everyone thinks their field is the key to the universe. But this is not theory, so until someone can actually create complex life, I see no reason to believe people like Ray. Show me the money.
Re:Bill Gates once said... (Score:1, Insightful)
"Do not murder people," is probably the most accurate modern English translation. Various people who have only heard it as, "Thou shalt not kill," have often used it to trumpet that the Jewis/Christian/Islamic God is opposed to war, the death penalty, abortion, shooting intruders, or eating meat.
The anti-abortionists might have a case, if they can demonstrate that a fetis is a person to the satisfaction of all concerned, but the rest are definitely examples of misguided people whose justification for their position is based on a common misquote (or rather, a mistranslation.)
Re:Apple is dying... (Score:1, Insightful)
Oh wait, you were giving an example of an incorrect assumption. I misunderstood.
Moore's Law isn't the limit...power consumption is (Score:3, Insightful)
Moore's law *will* continue, but the advances need to come from a different direction than the one we've been following. It's already hitting the point where you just don't *want* a high-end processor in your laptop, because you have to keep it running much slower anyway just to get some acceptable battery life. The 3.4GHz Prescott is arguably something you don't want in your *desktop* as it is.
Bottom line: Moore's law is no longer the most important concern in computing technology.
Re:my favorites (Score:4, Insightful)
I also liked these business progression predictions:
1) people would migrate from MF (never did)
2) people would not move from Netware (80% market share) to NT (did)
3) home users will not use Linux (d'oh!)
4) Now, alongside "Apple is dead", we have "Sun is dead", and others.
5) I remember in 97-98 we all thought 64-bit would be mainstream by 99-00. Boy, did that not happen. Save Nintendo, I guess.
Oh yeah, what about distributed computing... DCOM will be, Web services will be,
Surely nobody ever predicted that computer technology would head straight for whatever is the slowest way:
Why program sockets when HTTP is 100 times slower?
Why program to a relational database or object system when XML text is 100s of times slower?
Why compile when we can interpret?
Why run software on your computer when you can connect to a terminal, web server, or host and do 100 times less?
Why not create about 20 layers between the application and the video card?
Why hire experienced programmers when you can hire some with no experience halfway across the world and get the project done 100 times slower?
Oh yeah, and then there's commodity computers. Everyone predicted that in the early '90s, but the corporations have successfully kept the prices high. Of course, with inflation we are starting to approach commodity computers.
Finally, the one about reusable objects. Maybe SourceForge and open source projects like Apache are as close as we can get. In '94 I remember everyone figured there'd be online libraries where one could download whatever component was needed. Hah!
Re:My Personal Favorite... (Score:3, Insightful)
But more importantly if you read the article carefully (which is a lot I know, this is
Intel pays researchers lots of money to think outside of the box, but they need to write papers like this to keep the establishment recognizing them as the miracle workers that they are.
Y2K (Score:2, Insightful)
Linux is UNHACKABLE!!!! (Score:1, Insightful)
one word: Itanium (Score:3, Insightful)
AMD couldn't have hoped for a better present from its greatest rival. They have started building a new factory in Dresden (Fab 36) in anticipation of the increased demand for Opterons and Athlon64s.
The desktops we will be buying in 2005 (2004?) will be 64-bit, and it seems they won't be "intel inside".
Re:the list (Score:4, Insightful)
I stand in awe.
Re:If you ask Ray Kurzweil he might say (Score:1, Insightful)
Think about a big modern building - it has a plumbing and electrical system to bring in the "nutrients" it needs to operate and a sewer and HVAC system to exhaust waste products. These days we even have telecommunications systems, fire and security systems, etc. You can certainly make an analogy to the circulatory system and the nervous system in the human body.
But what about computers? Is there some analogy to be made to systems in the human body? Maybe you should read this:
http://americanhistory.si.edu/csr/comphist/mont
So, if modern structures are a crude attempt to build in our own image at the basic plumbing level, maybe computers are a similar attempt to build in our own image at the cellular level. And what is going on at the high end in modern computing systems? Massively parallel supercomputers, for one! So you can make an analogy of many processors working in parallel to the cells in the human body doing much the same thing.
Anyway - something to think about... I certainly find myself in the "humans are just fancy biological machines" camp.
PacMan (Score:2, Insightful)
- Kristin Wilson, Nintendo, Inc., 1989.
4GL and "programmerless programming" (Score:3, Insightful)
-- In five years, everybody will be using fourth-generation languages (our 4GL, etc) for everything except the lowest level of hardware support.
-- You won't need programmers at all if you use our programmerless rule development interface. (See, NetExpert 8-)
Basically, the any-idiot "enabling" technologies that were supposed to do away with all forms of having to know how a computer works.
[Includes "death" of C and C++, Java, Perl, etc in favor of Power-Builder-esque symbolic/graphical program construction systems.]
yea, sure... 8-)
Re:"In the year 2000..." (Score:2, Insightful)
We are the Enemy! (Score:2, Insightful)
The fact that your browser takes 33MB to run is a problem. Every piece of software wants to include the kitchen sink; that's a problem.
Maybe if we had fewer abstraction layers, fewer dynamic invocations, less runtime discovery, and more focus on building something that works, we really would not need 4GB of RAM. Maybe, just maybe, the programs would run faster as well.
Re:Multitasking (Score:2, Insightful)
Many of us ran Windows on 8088 and '286 machines for quite a while before we could afford a machine with the '386 'virtual 8086' functionality, so we didn't have 'pre-emptive multitasking' either.
And I know that Microsoft made no unfounded claims about 'multitasking' on their early versions of Windows.
What's your basis for your claims?
Re:"In the year 2000..." (Score:2, Insightful)
I work in the power industry, and this attitude really pisses off some of my coworkers who spent thousands of man-hours remediating software and firmware systems one by one.
Yes... Y2K did feature a lot of hype, but the response to the hype saved our ass. Engineers, managers, developers, even politicians... the human race came together on this one.
Re:640K--not true (Score:4, Insightful)
Counting one as prime would mean there was no longer a unique prime factorization, because you could add one as many times as you liked to the list of prime factors.
At the moment, the prime factorization of 12 is 3*2*2. If you allow one, it could be 3*2*2*1 or 3*2*2*1*1*1*1.
Why this matters, I forget, but apparently if you are a mathematician it does. (It's the uniqueness half of the fundamental theorem of arithmetic, which a lot of number theory leans on.)
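The uniqueness point is easy to check mechanically. A minimal trial-division sketch (fine for small numbers, not an efficient algorithm):

```python
def prime_factors(n: int) -> list[int]:
    """Return the prime factorization of n > 1 in ascending order, e.g. 12 -> [2, 2, 3]."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)  # d must be prime here: all smaller factors were divided out
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is itself prime
    return factors

print(prime_factors(12))  # [2, 2, 3] -- exactly one factorization, up to order
# If 1 counted as prime, [2, 2, 3], [1, 2, 2, 3], [1, 1, 2, 2, 3], ...
# would all be valid factorizations of 12, and uniqueness would be lost.
```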