The Most Incorrect Assumptions In Computing?
Miss Muis writes "After reading once again that Moore's Law will become obsolete, I amused myself thinking back to all the predictions, absolutes and impossibles in computing that have been surpassed with ease. In the late 80s I remember it being a well-regarded popular 'fact' that 100MHz was the absolute limit for the speed of a CPU. Not too many years later I remember much discussion about hard drives for personal computers being physically unable to go much higher than 1GB. Let's not forget "I think there is a world market for maybe five computers" from the chairman of IBM in 1943, and of course 'Apple is dying...' (for the past 25 years). What are your favorite beliefs-turned-on-their-heads in the history of computing?"
Re:Bill Gates once said... (Score:2, Informative)
-Bill Gates, 1981
Re:Bill Gates once said... (Score:3, Informative)
Storage Space (Score:5, Informative)
download (Score:4, Informative)
Download (actual) - definition of the transfer of data from a network to your machine.
Uses:
1. "I downloaded the software from the CD to my computer."
2. "I downloaded the file from the internet."
3. "I downloaded the file into my e-mail and sent it to him."
Only #2 is correct.
I had to berate my father for WEEKS before he learned the intricacies of Download vs. Upload vs. Install.
Chris
Ars Technica: Ultimate Limits of Computers (Score:5, Informative)
Entitled "The Ultimate Limits of Computers," it deals with issues including not only Moore's law, but quantum mechanics... such as Planck's constant, Boltzmann's constant, the gravitational constant, the application of quantum mechanics to thermodynamics, and other interesting things that I barely (read: don't) understand.
Re:my personal favorite (Score:3, Informative)
Remember "Bob" from Microsoft? The predecessor to "Clippy"?
Re:Bill Gates once said... (Score:3, Informative)
Having said that, in 1981, 640kB technically _was_ enough for most people.
I hereby nominate "640k ought to be enough for anybody" as most misquoted phrase ever.
Re:Al Gore (Score:3, Informative)
Status: False.
Origins: No,
Al Gore did not claim he "invented" the Internet, nor did he say anything that could reasonably be interpreted that way. The derisive "Al Gore said he 'invented' the Internet" put-downs are misleading distortions of something he said (taken out of context) during an interview with Wolf Blitzer on CNN's "Late Edition" program on 9 March 1999. When asked to describe what distinguished him from his challenger for the Democratic presidential nomination, Senator Bill Bradley of New Jersey, Gore replied (in part):
During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system.
Clearly, although Gore's phrasing was clumsy (and self-serving), he was not claiming that he "invented" the Internet (in the sense of having designed or implemented it), but that he was responsible for helping to create the environment (in an economic and legislative sense) that fostered the development of the Internet. Al Gore might not know nearly as much about the Internet and other technologies as his image would have us believe, and he certainly has been guilty of stretching (if not outright breaking) the truth before, but to believe that Gore seriously thought he could take credit for the "invention" of the Internet -- in the sense offered by the media -- is just silly. (To those who say the words "create" and "invent" mean the same thing: If they mean the same thing, then why have the media overwhelmingly and consistently cited Gore as having claimed he "invented" the Internet when he never used that word? The answer is that the words don't mean the same thing, but by substituting one word for the other, commentators can make Gore's claim sound [more] ridiculous.)
However, validating even the lesser claim Gore intended to make is problematic. Any statement about the "creation" or "beginning" of the Internet is difficult to evaluate, because the Internet is not a homogenous entity (it's a collection of computers, networks, protocols, standards, and application programs), nor did it all spring into being at once (the components that comprise the Internet were developed in various places at different times and are continuously being modified, improved, and expanded). Despite a spirited defense of Gore's claim by Vint Cerf (often referred to as the "father of the Internet") in which he stated "that as a Senator and now as Vice President, Gore has made it a point to be as well-informed as possible on technology and issues that surround it," many of the components of today's Internet came into being well before Gore's first term in Congress began in 1977, and it's hard to find any specific action of Gore's (such as his sponsoring a Congressional bill or championing a particular piece of legislation) that one could claim helped bring the Internet into being, much less validate Gore's statement of having taken the "initiative in creating the Internet."
It's true that Gore was popularizing the term "information superhighway" in the early 1990s (when few people outside academia or the computer/defense industries had heard of the Internet) and has introduced a few bills dealing with education and the Internet, but even though Congressman, Senator, and Vice-President Gore may always have been interested in and well-informed about information technology issues, that's a far cry from having taken an active, vital leadership role in bringing about those technologies. Even if Al Gore had never entered the political arena, we'd probably still be reading web pages via the Internet today.
Last updated: 27 September 2000
The URL for this page is http://www.snopes.com/quotes/internet.htm
Re:There have been some real humdingers... (Score:2, Informative)
Re:The truth might be out there (Score:5, Informative)
640K--not true (Score:4, Informative)
Re:I Invented... (Score:5, Informative)
Full Snopes debunking of mythical quote (Score:5, Informative)
Re:download (Score:4, Informative)
But if you're downloading data from a site, the site is not also uploading that data to you. The action exists at only one end of the operation, at the initiator of the action.
The location can be virtual (i.e. using the local machine to log into a remote machine to have the remote machine upload a file to the local machine is uploading, not downloading).
However, FTP has a (rarely implemented) feature where the controller of the transfer is neither sender nor receiver. One can transfer files from one host to another while controlling it from a third, and the data doesn't even pass through the third machine. Is that neither uploading nor downloading, or both? IMO, it is simply transferring.
Re:Al Gore (Score:5, Informative)
According to Vint Cerf (widely known as one of the "Fathers of the Internet," Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet):
"VP Gore was the first or surely among the first of the members of Congress to become a strong supporter of advanced networking while he served as Senator. As far back as 1986, he was holding hearings on this subject (supercomputing, fiber networks...) and asking about their promise and what could be done to realize them. Bob Kahn, with whom I worked to develop the Internet design in 1973, participated in several hearings held by then-Senator Gore and I recall that Bob introduced the term ``information infrastructure'' in one hearing in 1986. It was clear that as a Senator and now as Vice President, Gore has made it a point to be as well-informed as possible on technology and issues that surround it. As Senator, VP Gore was highly supportive of the research community's efforts to explore new networking capabilities and to extend access to supercomputers by way of NSFNET and its successors, the High Performance Computing and Communication program (which included the National Research and Education Network initiative), and as Vice President, he has been very responsive to recommendations made, for example, by the President's Information Technology Advisory Committee that endorsed additional research funding for next generation fundamental research in software and related topics. If you look at the last 30-35 years of network development, you'll find many people who have made major contributions without which the Internet would not be the vibrant, growing and exciting thing it is today. The creation of a new information infrastructure requires the willing efforts of thousands if not millions of participants and we've seen leadership from many quarters, all of it needed, to move the Internet towards increased availability and utility around the world. While it is not accurate to say that VP Gore invented Internet, he has played a powerful role in policy terms that has supported its continued growth and application, for which we should be thankful. 
We're fortunate to have senior level members of Congress and the Administration who embrace new technology and have the vision to see how it can be put to work for national and global benefit. "
32 bit IP addresses ought to be sufficient :) (Score:3, Informative)
"2. Expanded Address field. The address field will be expanded to 32 bits..." "This expansion is adequate for any forseeable ARPA Network growth"
http://www.faqs.org/rfcs/rfc704.html
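The arithmetic behind that prediction is easy to check; a quick sketch (Python, with IPv6's 128-bit width added here for comparison):

```python
# Address capacity of the "expanded" 32-bit field, vs. IPv6's 128 bits.
ipv4_addresses = 2 ** 32    # the field RFC 704 deemed adequate
ipv6_addresses = 2 ** 128   # the eventual fix

print(f"IPv4: {ipv4_addresses:,}")  # 4,294,967,296 -- about 4.3 billion
print(f"IPv6: {ipv6_addresses:,}")
```

A few billion addresses looked bottomless next to the ARPANET's few hundred hosts; it took roughly thirty years of growth to run the pool dry.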
Re:Apple is dying... (Score:2, Informative)
The REAL legacy of Microsoft Bob: (Score:5, Informative)
The REAL legacy of Microsoft Bob [pmt.org], from Wikipedia [wikipedia.org]:
Microsoft Bob was a project managed by Melinda French, who later married Bill Gates [wikipedia.org] to become Melinda Gates [wikipedia.org].
Re:9600 baud (Score:5, Informative)
Re:download (Score:3, Informative)
Operating system death is a myth (Score:3, Informative)
Re:Overrated. (Score:1, Informative)
Deus Ex 2 and Prince of Persia: Sands of Time are two I can think of right away. If you've got a GF2 or an older Radeon card, it just won't work.
Re:Bill Gates once said... (Score:4, Informative)
Re:One year from now... (Score:3, Informative)
Politicians NEVER do this.... righhht. (Score:5, Informative)
He fathered the bill that changed that odd government and academic research network known as Arpanet into the Internet, where people from all around can use it for all different sorts of purposes.
So if he wrote the bill, does that not mean he took the initiative in creating the Internet? Would it really be unreasonable for him to bring up this fact while he was campaigning and trying to get people to see "Hey, look what I did!"?
So please, get with it and stop political trolling. Thanks!
Re:Bill Gates once said... (Score:2, Informative)
Where are you going with this?
I think that's where he was going - there's a difference between "creating the internet" and "facilitating an internet economy" (which is what I wish Gore had claimed to have done).
Re:A little Googling and: (Score:2, Informative)
And he was right. At the time, OS/2 and NT were the same OS. MS and IBM were working on it together. After the falling out, they forked. IBM kept the OS/2 name, MS kept the NT name. NT became Windows 2000, then XP.
Amazing how much a name means to people. I'm sure most people think Windows 2000 was the next version of Windows 98.
Re:download (Score:3, Informative)
Re:A little Googling and: (Score:3, Informative)
-- Bill Gates, from "OS/2 Programmer's Guide" (foreword by Bill Gates)
I don't think it's that hard to believe Bill Gates thought OS/2 would be destined to be the most important OS of all time. OS/2 sure gave Microsoft a whole lot of free IBM research and development when they backstabbed IBM and launched Windows 95. OS/2 was ahead of its time, in terms of the technology and capabilities. In a lot of ways the hardware just wasn't there yet. But OS/2 certainly created an incredible opportunity for Microsoft.
Re:Bill Gates once said... (Score:3, Informative)
You obviously haven't bought memory from IBM or Dell for one of their servers lately. They are very proud of their 1G or larger ECC/Registered DIMMs.
Re:640K--not true (Score:2, Informative)
((2^n)-1) is a fun equation.
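For anyone following along, (2^n)-1 is the highest address reachable with n address bits, which is why it keeps showing up in these limits; a quick illustration (Python):

```python
def max_address(n_bits: int) -> int:
    """Highest address expressible with n address bits: (2**n) - 1."""
    return (1 << n_bits) - 1

# A few historically relevant address-bus widths.
for bits, chip in [(16, "8080"), (20, "8086"), (24, "80286/68000"), (32, "80386")]:
    top = max_address(bits)
    print(f"{chip:>12}: {bits} bits -> top address {top:#x} ({(top + 1) // 1024} KB)")
```

The 20-bit row is the one megabyte (of which 640K was general-purpose RAM) that this whole thread is arguing about.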
Re:My favorite... (Score:4, Informative)
But then he predicted it again in 2000.
- A.P.
Re:Bill Gates once said... (Score:5, Informative)
Most people preferred to spend $2000 on a PC with a 16-bit address space rather than $10000 on a PDP-11 with a 22-bit address space.
I think that 20 address bits were plenty for 1981. The real problem was that there was no upgrade path for about 10 years after that. The Intel 8086 was 20 bits, fine. The Intel 80186 was 20 bits, okay. The Intel 80286 had "protected mode" addressing to increase the address space, but it was nearly impossible for an operating system to context switch between "protected mode" and "real mode" (there was no instruction to do it, so an OS had to actually REBOOT THE PROCESSOR and then recover all its state on the fly).
So until the 80386 came out, there was no way to get a new system with both (a) support for old programs and (b) support for more address space. And during that 10-year dry spell, that's when all those extended / expanded memory schemes came out, and that's when the 1 megabyte limit really hurt.
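The 20-bit limit described above falls out of real mode's segment:offset addressing; a minimal sketch (Python):

```python
def real_mode_address(segment: int, offset: int) -> int:
    """8086 real-mode physical address: segment shifted left 4 bits plus
    offset, wrapped to the chip's 20 address lines."""
    return ((segment << 4) + offset) & 0xFFFFF  # 20-bit wraparound

print(hex(real_mode_address(0x9000, 0xFFFF)))  # 0x9ffff: last byte below 640K
print(hex(real_mode_address(0xFFFF, 0x0010)))  # overflows 20 bits; wraps to 0x0
```

Segment 0xFFFF with any offset above 0x000F overflows the 20 bits and silently wraps on an 8086, which is exactly the behavior the later A20-line hacks had to work around.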
Side note on Al Gore's "well-informed"ness. (Score:5, Informative)
Your copy of the Snopes article is not what they posted. Anyone who actually read what you posted would have noted this glaring discontinuity.
I can appreciate the clarification on Gore's "inventing" the Internet. But I think Gore gets too high a mark here and I'd like to point out why I think so as a side note to a comment I read in Snopes' essay.
Snopes cites Vint Cerf saying "that as a Senator and now as Vice President, Gore has made it a point to be as well-informed as possible on technology and issues that surround it" but by 1999 (the copyright date on the Cerf page Snopes cites), Clinton/Gore had brought us the 1996 Telecommunications Act (which was a big step toward the media deregulation many groups across a wide political spectrum rail against today), the Digital Millennium Copyright Act [wikipedia.org], and the 1998 Copyright Term Extension Act [wikipedia.org]. So I come away thinking that Al Gore's legislative history deserves a more mixed review than Cerf (and Snopes) describe.
Re:download (Score:2, Informative)
That definition of course breaks down when the machines are equal, such as transfers between two XServes or two Powerbooks, as well as when the two machines are one and the same, such as downloading a web page hosted on the same machine that requested it.
Even when FTP'ing to 127.0.0.1, get is still downloading and put is still uploading. As it would also be if you FTP'd from a mainframe to a laptop.
Re:The folks at HP said... (Score:2, Informative)
This does not mean that someone at HP never said people wouldn't want to have a mouse.
Clicky (Score:4, Informative)
Re:640K--not true (Score:3, Informative)
1 isn't a prime [utah.edu]. :-)
Re:640K--not true (Score:3, Informative)
Re:Bill Gates once said... (Score:1, Informative)
Re:My favorite lie (Score:5, Informative)
My, how things have changed.
There are so many applications that do everything I needed to do on Windows, now. So you can't live without Kazaa? Download Apollon, the KDE FastTrack client. Need word processing? AbiWord/KWord are excellent pieces of work. Outlook got you down? Ximian Evolution has everything you need. Instant messaging? Gaim/Kopete. Music playing? XMMS/JuK will replace Winamp/Foobar quite handily. Graphics? The Gimp. Video/DVD playback? Xine tackles everything I throw at it. Development? KDevelop/xemacs. Web work? Quanta Plus/Bluefish. CD recording? K3b is every bit as good as Nero and is free. Web browsing? Konqueror/Mozilla/Firebird/Galeon/Epiphany. Usenet? Pan kills every similar offering on Windows.
Additionally, KDE supplies me with various features that Windows can't match. I want to save an image from a website directly to my webspace, via either FTP or WebDAV? Right-click, "Save As," click "FTP" and Save.
In addition, I paid $0 for all of the software on my computer, have ready access to the source code if I'd like to add a feature, and am not held hostage by vendor lock-in. I also am not subject to the ~30 holes in Internet Explorer this year, or worms like Blaster, Slammer or Welchia.
There are really only a handful of things Linux isn't better at right now, and those are very, very steadily improving. The first and most obvious is gaming: even though older games like Starcraft and Diablo 2 run very well under Wine, and newer games like Unreal Tournament 2003 are being released natively for Linux, there's still nowhere near the selection. I concede that; it's all about choosing the right tool for the job. The second is video editing, which really isn't very good on the PC either, with the sole exception of Adobe Premiere. I don't touch either of these things often, so it's not a tremendous deal for me.
I wouldn't say it's good enough for Joe User right now, though. Package management and software installation still needs to be simplified for the average user (.deb should be the de facto standard, IMHO). Installation could be less painful if you don't know what you're doing. GTK+ needs a better file selector, admittedly, though I hold the opinion that GTK+ is generally worse than Qt to begin with, so I don't have trouble finding Qt-based replacements.
My older brother, who has barely touched a computer in his life, can work at my KDE setup with ease. I consider this a small victory.
Microsoft Consulted on 640k (Score:3, Informative)
BG: Microsoft was playing a much broader role[laughs] than just doing software for this machine. I mean whether it is the keyboard, the character set, the graphics adapter, or even the memory layouts. I laid out memory so the bottom 640K was general purpose RAM and the upper 384 I reserved for video and ROM, and things like that. That is why they talk about the 640K limit. It is actually a limit, not of the software, in any way, shape, or form, it is the limit of the microprocessor. That thing generates addresses, 20-bits addresses, that only can address a megabyte of memory. And, therefore, all the applications are tied to that limit. It was ten times what we had before. But to my surprise, we ran out of that address base for applications within -- oh five or six years people were complaining.
(emphasis added for clarity)
Occasionally, I do RTFA in advance of posting!
Re:Bill Gates once said... (Score:2, Informative)
I believe the Motorola 68000 CPU came out in 1979, and it featured 24 address lines and 32-bit address registers, so it was ready for 16MB of RAM out of the box, and was transparently extendable to 4GB with the addition of 8 more address lines on the 68020.
The only downside was that, at least in the Amiga community, some programmers who fancied themselves clever used the upper, unused 8 bits of the address registers for flags, and thus their programs died horribly on 68020s, which could actually physically connect to the full 32-bit address range.
The 68K was a fine chip, with linear address space, and 8 general purpose data registers, and 7 general purpose address registers (plus one special purpose address register). It's such a shame we ended up with that kludgy intel beast. Sort of funny to watch a P4 or Athlon XP chip run MSDOS 5.0 with no emulation, though. ;p
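The flag-stuffing trick described above can be sketched in a few lines (Python used purely as illustration; the constants are the 68000's, but the code is hypothetical, not actual Amiga software):

```python
ADDR_MASK = 0x00FFFFFF  # the 24 address bits a 68000 actually drives
FLAG_SHIFT = 24         # the "free" top byte some programmers hijacked

def pack(addr: int, flags: int) -> int:
    """Hide a flag byte in the unused top byte of a 24-bit address."""
    return (addr & ADDR_MASK) | (flags << FLAG_SHIFT)

p = pack(0x123456, 0xA5)
print(hex(p & ADDR_MASK))  # what a 68000 puts on the bus: 0x123456
print(hex(p))              # what a 68020 decodes: 0xa5123456 -- a crash
```

On a 68000 the top byte never reaches the bus, so the hack "just works"; on a 68020 all 32 bits are decoded and the flags become part of the address.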
The growth of Internet (Score:3, Informative)
Well, it turns out that this dot-com myth is somewhat wrong: the Internet's growth is not that much stronger than radio's and TV's was. [ifi.uio.no]
Very interesting stuff, bumped into it on Usenet.
Re:Great Heinlein-ism (Score:2, Informative)
(See also here [lsi.usp.br] if you want to know what Clarke meant by 'elderly'.)
Proof that Moore's Law will come to an end (Score:3, Informative)
Moore's Law can't continue to hold out, period. That's because Moore's Law refers to silicon transistors, and you can't make a transistor with a feature that is less than one silicon atom thick.
Intel and IBM both have demonstrated 65 nm experimental processes, though for volume production, 130 nm is the current state of the art. There are only eight more doublings left until the line width is less than the diameter of an atom (the diameter of a silicon atom is about a third of a nanometer). One doubling every two years means it's all over in 16.
Now, we could possibly continue to increase circuit density for a long time after that by going to 3-D, but we would no longer be in the domain of Moore's Law: we'd be adding more transistors but they wouldn't be getting any smaller.
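The parent's count can be sanity-checked in a few lines (Python; the 130 nm start and ~0.33 nm silicon atom diameter are the figures quoted above):

```python
feature_nm = 130.0   # volume-production state of the art, per the parent
atom_nm = 0.33       # rough diameter of a silicon atom
halvings = 0
while feature_nm / 2 >= atom_nm:   # halve until a feature would be sub-atomic
    feature_nm /= 2
    halvings += 1
print(halvings)  # halvings still possible: 8, matching the parent's estimate
```

At one halving every two years, that is indeed about sixteen years of runway.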
Re:I Invented... (Score:2, Informative)
"But it will emerge from my dialogue with the American people. I've traveled to every part of this country during the last six years. During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system. "
Re:If you ask Ray Kurzweil he might say (Score:4, Informative)
We are no more computers than we were when Descartes conjectured that a thinking machine could be constructed by hydraulics, so impressed were the folk of the day with the achievements of that technology!
Re:640K--not true (Score:3, Informative)
However, 1 might be called a trivial prime, since its only factors are 1 and itself (which is also 1).
Roger Penrose Might Say (Score:3, Informative)
I'll point out here that I know that some of his arguments aren't watertight and the discussion is definitely in progress -- he knows this, as is evidenced by quotes like this from the article: "With apparently genuine humility, Penrose emphasizes that these ideas should not be called theories yet: he prefers the word 'suggestions.'" But they're as well supported as any other speculations about the nature of consciousness.
Re:Proof that Moore's Law will come to an end (Score:4, Informative)
The original estimate was off, since the minimum feature size is not the same as the smallest dimension in the transistor gate. The limit is actually set by the gate width, which can't go below about 5 nm without the probability of quantum tunnelling occurring rising above an acceptable level (some leakage is OK, but there is a point where it is no longer possible to distinguish the on state from the off).
We have two more halvings of the minimum feature size before production silicon reaches state of the art, and two more halvings after that before the show is all over, folks.
The doubling density each year rule corresponds to a halving of minimum feature size each 2 years. Intel say this is extending to 3 years (Bush recession, war on terror etc.) and the show is over completely by about 2015 at the latest.
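Taking the parent's own numbers (130 nm in production, a ~5 nm gate floor, one halving every three years), the claimed end date roughly checks out; a back-of-the-envelope sketch (Python), not a roadmap:

```python
year, feature_nm = 2003, 130.0   # figures from the thread; 3-year halving assumed
while feature_nm / 2 >= 5.0:     # ~5 nm quantum-tunnelling floor, per the parent
    feature_nm /= 2
    year += 3
print(year, feature_nm)          # last feasible halving lands in 2015, at 8.125 nm
```

The next halving after that would cross the 5 nm floor, so scaling stalls around 2015, as the parent says.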
There will be cleanup work for some time but if you think about what you end up with at the end point it is quite interesting. You can fit 64 times the current number of transistors onto a chip. So you can comfortably fit your current state of the art Quad-Pentium processor with 4Gb of RAM all onto the same chip. So cut away the external memory bus completely and in its place add a couple of laser diodes on each edge and some receivers and you have a processing unit that communicates to its peripherals and any neighbors by means of optical couplings. It does not need special cooling either because only a small part of the chip area is CPU, the rest is memory.
Instead of adding memory to this type of system you add more processor/memory units. You could easily fit four, or even sixteen, into a commodity PC box. The optical couplings mean that memory paging etc. can be handled by making a request to a neighboring processor that stores that information.
Big supercomputers are built by simply lashing a few hundred standard boards together.
Back to the future: I was building these systems 20 years ago. Only then we called them transputers, and they only had 4Kb of RAM.
Re:My Mac Friend... (Score:1, Informative)
Whatever the reason for his 404, comparing processor speed with network bandwidth is nonsense. Bandwidth-wise, however, the G5 can handle 6.4GB/s, and the HyperTransport bus it runs on has a 12.8GB/s bandwidth, so his G5 does handle more data per second than his broadband connection.
Anyway, even then you can't compare the speed of your processor with your network bandwidth, since they simply are different things and do different stuff.
And the gag about a dimwit saying his computer is faster than the internet was told to me by a guy about 6 years ago, and it concerned his Win95 friend; maybe he switched recently!
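For what it's worth, the mismatch is several orders of magnitude; a rough comparison (Python; the 1.5 Mbit/s DSL figure is an assumption, the 6.4 GB/s bus figure is from the parent):

```python
# Compare memory-bus bandwidth to a broadband link (assumed 1.5 Mbit/s DSL).
bus_bytes_per_s = 6.4e9              # G5 front-side bus, per the parent comment
broadband_bits_per_s = 1.5e6         # hypothetical DSL line
broadband_bytes_per_s = broadband_bits_per_s / 8

ratio = bus_bytes_per_s / broadband_bytes_per_s
print(f"The bus moves ~{ratio:,.0f}x more data per second")
```

So the two numbers aren't even in the same neighborhood, which is why the comparison is meaningless in the first place.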
Re:PacMan (Score:2, Informative)
It appears that this "joke" is actually only about three years old; Google shows the sig file first appears on Usenet on December 12, 2000, attributed to 'anon' or 'unknown'.