The Most Incorrect Assumptions In Computing?

Miss Muis writes "After reading once again that Moore's Law will become obsolete, I amused myself thinking back to all the predictions, absolutes and impossibles in computing that have been surpassed with ease. In the late 80s I remember it being a well-regarded popular 'fact' that 100MHz was the absolute limit for the speed of a CPU. Not too many years later I remember much discussion about hard drives for personal computers being physically unable to go much higher than 1GB. Let's not forget 'I think there is a world market for maybe five computers' from the chairman of IBM in 1943, and of course 'Apple is dying...' (for the past 25 years). What are your favorite beliefs-turned-on-their-heads in the history of computing?"
  • by Liselle ( 684663 ) * <slashdot@NoSPAm.liselle.net> on Thursday December 04, 2003 @03:45PM (#7631730) Journal
    "640K ought to be enough for anybody."
    -Bill Gates, 1981
  • by zowch ( 552785 ) on Thursday December 04, 2003 @03:48PM (#7631777)
    Gates has actually discounted that rumor several times (one of which can be found here [urbanlegends.com]), and I've got to say that it probably *is* untrue, as I really can't imagine anybody ever saying that.
  • Storage Space (Score:5, Informative)

    by kidgenius ( 704962 ) on Thursday December 04, 2003 @03:48PM (#7631778)
    Whenever I get a new hard drive, I invariably say "I'll never be able to fill that up," and somehow within about 2 years' time I'm out buying an extra hard drive.
  • download (Score:4, Informative)

    by mrscorpio ( 265337 ) <twoheadedboy.stonepool@com> on Thursday December 04, 2003 @03:48PM (#7631787)
    Download (supposed) - definition of the transfer of data from any source to another.

    Download (actual) - definition of the transfer of data from a network to your machine.

    Uses:

    1. "I downloaded the software from the CD to my computer."

    2. "I downloaded the file from the internet."

    3. "I downloaded the file into my e-mail and sent it to him."

    Only #2 is correct.

    I had to berate my father for WEEKS before he learned the intricacies of Download vs. Upload vs. Install.

    Chris
  • by briglass ( 608949 ) on Thursday December 04, 2003 @03:50PM (#7631833)
    Check out this article from Ars Technica: http://www.arstechnica.com/wankerdesk/01q2/limits/limits-1.html

    Entitled "The Ultimate Limits of Computers," it deals with issues including not only Moore's law, but quantum mechanics... such as Plank's constant, Boltzmann's constant, the gravitational constang, the application of quantum mechanics to thermodynamics, and other interesting things that I barely (read: don't) understand.
  • by Carnildo ( 712617 ) on Thursday December 04, 2003 @03:50PM (#7631835) Homepage Journal
    people will be thankful to have an anthropomorphic paperclip tell them what to do.

    Remember "Bob" from Microsoft? The predecessor to "Clippy"?
  • by W2k ( 540424 ) on Thursday December 04, 2003 @03:52PM (#7631865) Journal
    It's a known fact that Bill Gates never actually said this, or at least, that it has been severely rephrased and taken out of context.

    Having said that, in 1981, 640kB technically _was_ enough for most people.

    I hereby nominate "640k ought to be enough for anybody" as most misquoted phrase ever.
  • Re:Al Gore (Score:3, Informative)

    by Anonymous Coward on Thursday December 04, 2003 @03:53PM (#7631896)
    Claim: Vice-President Al Gore claimed that he "invented" the Internet.

    Status: False.

    Origins: No,
    Al Gore did not claim he "invented" the Internet, nor did he say anything that could reasonably be interpreted that way. The derisive "Al Gore said he 'invented' the Internet" put-downs are misleading distortions of something he said (taken out of context) during an interview with Wolf Blitzer on CNN's "Late Edition" program on 9 March 1999. When asked to describe what distinguished him from his challenger for the Democratic presidential nomination, Senator Bill Bradley of New Jersey, Gore replied (in part):

    During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system.

    Clearly, although Gore's phrasing was clumsy (and self-serving), he was not claiming that he "invented" the Internet (in the sense of having designed or implemented it), but that he was responsible for helping to create I also invented the microphone the environment (in an economic and legislative sense) that fostered the development of the Internet. Al Gore might not know nearly as much about the Internet and other technologies as his image would have us believe, and he certainly has been guilty of stretching (if not outright breaking) the truth before, but to believe that Gore seriously thought he could take credit for the "invention" of the Internet -- in the sense offered by the media -- is just silly. (To those who say the words "create" and "invent" mean the same thing: If they mean the same thing, then why have the media overwhelmingly and consistently cited Gore as having claimed he "invented" the Internet when he never used that word? The answer is that the words don't mean the same thing, but by substituting one word for the other, commentators can make Gore's claim sound [more] ridiculous.)

    However, validating even the lesser claim Gore intended to make is problematic. Any statement about the "creation" or "beginning" of the Internet is difficult to evaluate, because the Internet is not a homogenous entity (it's a collection of computers, networks, protocols, standards, and application programs), nor did it all spring into being at once (the components that comprise the Internet were developed in various places at different times and are continuously being modified, improved, and expanded). Despite a spirited defense of Gore's claim by Vint Cerf (often referred to as the "father of the Internet") in which he stated "that as a Senator and now as Vice President, Gore has made it a point to be as well-informed as possible on technology and issues that surround it," many of the components of today's Internet came into being well before Gore's first term in Congress began in 1977, and it's hard to find any specific action of Gore's (such as his sponsoring a Congressional bill or championing a particular piece of legislation) that one could claim helped bring the Internet into being, much less validate Gore's statement of having taken the "initiative in creating the Internet."

    It's true that Gore was popularizing the term "information superhighway" in the early 1990s (when few people outside academia or the computer/defense industries had heard of the Internet) and has introduced a few bills dealing with education and the Internet, but even though Congressman, Senator, and Vice-President Gore may always have been interested in and well-informed about information technology issues, that's a far cry from having taken an active, vital leadership role in bringing about those technologies. Even if Al Gore had never entered the political arena, we'd probably still be reading web pages via the Internet today.

    Last updated: 27 September 2000

    The URL for this page is http://www.snopes.com/quotes/internet.htm
  • by Patrik_AKA_RedX ( 624423 ) on Thursday December 04, 2003 @03:55PM (#7631933) Journal
    "Trust me, this is way better than OS/2." - The dude at Computer City that sold me my copy of WIndows 95. Bastard.
    You should have seen that one coming, "trust me" means as much a "I'm going to lie to you now" to a salesperson.
  • 640K--not true (Score:4, Informative)

    by H0NGK0NGPH00EY ( 210370 ) on Thursday December 04, 2003 @04:00PM (#7632028) Homepage
    640K is enough for anyone. (that one was easy)

    ...and also not true [urbanlegends.com].
  • Re:I Invented... (Score:5, Informative)

    by Foofoobar ( 318279 ) on Thursday December 04, 2003 @04:03PM (#7632074)
    Um... sorry. He never said that. He said he helped in the creation of the internet... which he did, as some of the key people involved in Darpanet will admit. He pushed to have Darpanet become publicly available to everyone.
  • by michaelggreer ( 612022 ) on Thursday December 04, 2003 @04:12PM (#7632171)
    As usual, they have the full scoop [snopes.com]. He did indeed take great initiative in creating the internet, but the statement is still awkward and self-serving.
  • Re:download (Score:4, Informative)

    by HTH NE1 ( 675604 ) on Thursday December 04, 2003 @04:19PM (#7632278)
    Upload/download also refers to who is initiating the action. If you're pulling network data to you, you are downloading; if you're pushing network data to someone else, you are uploading.

    But if you're downloading data from a site, the site is not also uploading that data to you. The action exists at only one end of the operation, at the initiator of the action.

    The location can be virtual (e.g. using the local machine to log into a remote machine and having the remote machine push a file back to the local machine is uploading, not downloading).

    However, FTP has a (rarely implemented) feature where the controller of the transfer is neither sender nor receiver. One can transfer files from one host to another while controlling it from a third, and the data doesn't even pass through the third machine. Is that neither uploading nor downloading, or both? IMO, it is simply transferring.
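
    For the curious, a rough sketch of that third-party transfer (what the warez folks call FXP) using Python's ftplib. The hosts, logins and filename are hypothetical, and putcmd()/getresp() are ftplib's low-level helpers rather than a polished public API, so treat this as an illustration of the protocol trick, not production code:

        from ftplib import FTP

        src = FTP('ftp.example-src.com')   # hypothetical servers/credentials
        src.login('user', 'pass')
        dst = FTP('ftp.example-dst.com')
        dst.login('user', 'pass')
        src.voidcmd('TYPE I')              # binary mode on both ends
        dst.voidcmd('TYPE I')

        # Ask the source to listen (PASV), then hand its address to the
        # destination (PORT) so the two servers connect to each other.
        resp = src.sendcmd('PASV')  # '227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)'
        hostport = resp[resp.index('(') + 1 : resp.index(')')]
        dst.sendcmd('PORT ' + hostport)

        # Start both halves; the file flows source -> destination directly,
        # never touching the controlling machine.
        dst.putcmd('STOR file.bin')
        src.putcmd('RETR file.bin')
        for conn in (src, dst):
            print(conn.getresp())          # 150: data connection opening
        for conn in (src, dst):
            print(conn.getresp())          # 226: transfer complete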
  • Re:Al Gore (Score:5, Informative)

    by civilizedINTENSITY ( 45686 ) on Thursday December 04, 2003 @04:19PM (#7632282)
    Gore was popularizing the term "information superhighway" in the early 1990s (when few people outside academia or the computer/defense industries had heard of the Internet) and did introduce a few bills dealing with education and the Internet.

    According to Vint Cerf (widely known as one of the "Fathers of the Internet," Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet):
    "VP Gore was the first or surely among the first of the members of Congress to become a strong supporter of advanced networking while he served as Senator. As far back as 1986, he was holding hearings on this subject (supercomputing, fiber networks...) and asking about their promise and what could be done to realize them. Bob Kahn, with whom I worked to develop the Internet design in 1973, participated in several hearings held by then-Senator Gore and I recall that Bob introduced the term "information infrastructure" in one hearing in 1986. It was clear that as a Senator and now as Vice President, Gore has made it a point to be as well-informed as possible on technology and issues that surround it.

    As Senator, VP Gore was highly supportive of the research community's efforts to explore new networking capabilities and to extend access to supercomputers by way of NSFNET and its successors, the High Performance Computing and Communication program (which included the National Research and Education Network initiative), and as Vice President, he has been very responsive to recommendations made, for example, by the President's Information Technology Advisory Committee that endorsed additional research funding for next generation fundamental research in software and related topics.

    If you look at the last 30-35 years of network development, you'll find many people who have made major contributions without which the Internet would not be the vibrant, growing and exciting thing it is today. The creation of a new information infrastructure requires the willing efforts of thousands if not millions of participants and we've seen leadership from many quarters, all of it needed, to move the Internet towards increased availability and utility around the world. While it is not accurate to say that VP Gore invented Internet, he has played a powerful role in policy terms that has supported its continued growth and application, for which we should be thankful. We're fortunate to have senior level members of Congress and the Administration who embrace new technology and have the vision to see how it can be put to work for national and global benefit."
  • by mkettler ( 6309 ) on Thursday December 04, 2003 @04:29PM (#7632404)
    RFC 704 (circa Sept 1975) states:

    "2. Expanded Address field. The address field will be expanded to 32 bits..." "This expansion is adequate for any forseeable ARPA Network growth"

    http://www.faqs.org/rfcs/rfc704.html
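
    A quick sense of scale on that 32-bit field (a sketch; the 1975 host count is my rough figure, not from the RFC):

        hosts_1975 = 60          # rough order of magnitude for ARPANET back then
        addresses = 2 ** 32
        print(addresses)                 # 4294967296
        print(addresses // hosts_1975)   # ~71 million times the network of the day

    It's the same 32-bit space IPv4 still uses, which is why IPv6 is waiting in the wings.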

  • Re:Apple is dying... (Score:2, Informative)

    by Adm1n ( 699849 ) on Thursday December 04, 2003 @04:31PM (#7632440)
    Xerox's Altos had a GUI, networking, and e-mail while Jobs was still in school! Get it right! Or go here and learn a little. ;) Antique Pre PC Stuff [blinkenlights.com]
  • by runlvl0 ( 198575 ) on Thursday December 04, 2003 @04:32PM (#7632449) Homepage Journal
    Remember "Bob" from Microsoft? The predecessor to "Clippy"?

    The REAL legacy of Microsoft Bob [pmt.org], from Wikipedia [wikipedia.org]:

    Microsoft Bob was a project managed by Melinda French, who later married Bill Gates [wikipedia.org] to become Melinda Gates [wikipedia.org].
  • Re:9600 baud (Score:5, Informative)

    by Vihai ( 668734 ) on Thursday December 04, 2003 @04:35PM (#7632491) Homepage
    In fact, something like 3429 baud is the maximum *baud rate* of an analog phone line, and 8000 is the maximum baud rate for a semi-digital phone line (V.90).
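
    To see why baud and bits per second aren't the same thing, here's a quick sanity check (a sketch, assuming V.34 figures):

        bits_per_second = 33600   # V.34 line rate
        baud = 3429               # V.34 maximum symbol rate, as above
        # baud counts symbols per second; late modems packed many bits
        # into each symbol using large constellations
        print(bits_per_second / baud)   # ~9.8 bits per symbol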
  • Re:download (Score:3, Informative)

    by kalidasa ( 577403 ) * on Thursday December 04, 2003 @04:37PM (#7632520) Journal
    Upload is local to remote; download is remote to local. Always been that way. Historically we used to think of upload as being client to server, download server to client because usually the server is remote and the client is local; but the definitions relate to the local/remote dichotomy.
  • by dacarr ( 562277 ) on Thursday December 04, 2003 @04:37PM (#7632528) Homepage Journal
    No, really, think about it. Operating systems don't actually "die". They kind of gain a cult following. Take a look at Amiga, OS/2, DOS, etc. Granted, they're all on life support....
  • Re:Overrated. (Score:1, Informative)

    by Anonymous Coward on Thursday December 04, 2003 @04:41PM (#7632587)
    Right now there are games starting to come out that actually require a GeForce3+ class card (shader support, basically) or they simply won't run.

    Deus Ex 2 and Prince of Persia: Sands of Time are two I can think of right away. If you've got a GF2 or an older Radeon card, it just won't work.
  • by tchuladdiass ( 174342 ) on Thursday December 04, 2003 @04:43PM (#7632618) Homepage
    It is NOT a limitation of DOS, it is a limitation of the original IBM PC HARDWARE. You see, in 1981, the IBM PC was built around the Intel 8088 CPU, which could address 1024K of memory. The upper 384K was reserved by the hardware for the system BIOS, video RAM, video BIOS, and any other board that needed memory-mapped I/O. Even the 80286 CPU had the 1024K limitation when it ran in "real" (8086/8088) mode.
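
    A little sketch of how that limit falls out of real-mode segment:offset arithmetic (the helper is just for illustration; the constants are the standard IBM PC memory map):

        def physical(segment, offset):
            # real mode: 16-bit segment shifted left 4 bits, plus 16-bit offset
            return (segment << 4) + offset

        print(hex(physical(0x0000, 0x0000)))  # 0x0: bottom of conventional RAM
        print(hex(physical(0x9FFF, 0x000F)))  # 0x9ffff: top of the 640K region
        print(hex(physical(0xA000, 0x0000)))  # 0xa0000: video RAM starts here
        print(hex(physical(0xFFFF, 0xFFFF)))  # 0x10ffef: just past 1MB -- the
                                              # wraparound the A20 gate masks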
  • by Col. Klink (retired) ( 11632 ) on Thursday December 04, 2003 @04:44PM (#7632636)
    You could boot from a USB flash drive...
  • by MrPerfekt ( 414248 ) on Thursday December 04, 2003 @04:45PM (#7632659) Homepage Journal
    He was campaigning, folks! What do people do when they want to get elected? Well, let's see: they brag about things they have accomplished in the past. So without further ado, AL GORE DID TAKE INITIATIVE IN CREATING THE INTERNET.

    He fathered the bill that changed that odd government and academic research network known as Arpanet into the Internet, where people from all around can use it for all different sorts of purposes.

    So if he wrote the bill, does that not mean he took the initiative in creating the Internet? Would it be unreasonable for him to bring up this fact while campaigning, trying to get people to see "Hey, look what I did!"?

    So please, get with it and stop political trolling. Thanks!
  • by pdhenry ( 671887 ) on Thursday December 04, 2003 @04:48PM (#7632698)
    ARPANET was developed before Gore even graduated college.
    Where are you going with this?

    ...and how many consumers were using ARPANET when Gore was in college?
    I think that's where he was going - there's a difference between "creating the internet" and "facilitating an internet economy" (which is what I wish Gore had claimed to have done).

  • by utahjazz ( 177190 ) on Thursday December 04, 2003 @04:49PM (#7632703)
    I believe OS/2 is destined to be the most important operating system, and possibly program, of all time

    And he was right. At the time, OS/2 and NT were the same OS. MS and IBM were working on it together. After the falling out, they forked. IBM kept the OS/2 name, MS kept the NT name. NT became Windows 2000, then XP.

    Amazing how much a name means to people. I'm sure most people think Windows 2000 was the next version of Windows 98.
  • Re:download (Score:3, Informative)

    by Jacer ( 574383 ) on Thursday December 04, 2003 @04:50PM (#7632722) Homepage
    It's called FXPing. It's really popular among warez users. It used to keep me on 0day sites with a mere dial-up connection, dumping from one site to another.
  • by Josuah ( 26407 ) on Thursday December 04, 2003 @04:50PM (#7632724) Homepage
    "I believe OS/2 is destined to be the most important operating system, and possibly program, of all time. As the successor to DOS, which has over 10,000,000 systems in use, it creates incredible opportunities for everyone involved with PCs."

    -- Bill Gates, from "OS/2 Programmer's Guide" (foreword by Bill Gates)


    I don't think it's that hard to believe Bill Gates thought OS/2 would be destined to be the most important OS of all time. OS/2 sure gave Microsoft a whole lot of free IBM research and development when they backstabbed IBM and launched Windows 95. OS/2 was ahead of its time, in terms of the technology and capabilities. In a lot of ways the hardware just wasn't there yet. But OS/2 certainly created an incredible opportunity for Microsoft.
  • by Glonoinha ( 587375 ) on Thursday December 04, 2003 @04:51PM (#7632728) Journal
    -Are you using a 64-bit desktop yet? Because if you're not, your 2003 desktop computer can't handle $5000 of memory!

    You obviously haven't bought memory from IBM or Dell for one of their servers lately. They are very proud of their 1G or larger ECC/Registered DIMMs.
  • Re:640K--not true (Score:2, Informative)

    by B3ryllium ( 571199 ) on Thursday December 04, 2003 @04:57PM (#7632823) Homepage
    You might run out of memory with some of the larger Mersenne primes.

    ((2^n)-1) is a fun equation.
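
    For a sense of scale, here's roughly what the record Mersenne prime of the era, 2^13466917 - 1 (found in 2001), takes to write down and store -- a quick sketch:

        import math

        n = 13466917
        print(int(n * math.log10(2)) + 1)   # ~4,053,946 decimal digits
        print(n // 8)                       # ~1.68 million bytes of raw bits --
                                            # comfortably past 640K, as it happens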
  • Re:My favorite... (Score:4, Informative)

    by Wakko Warner ( 324 ) * on Thursday December 04, 2003 @05:05PM (#7632913) Homepage Journal
    Yeah, but Bob's now the pacesetter for Internet-Is-Dying predictions. He's the go-to guy for any journalist looking for a juicy story about its imminent demise (at least when he isn't writing about it himself in his InfoWorld editorials), and he's been saying the same thing since 1995. At least he was true to his word; when his prediction of a 1996 "catastrophic collapse" proved untrue, he literally ate his own words [computer.org] at the WWW6 conference.

    But then he predicted it again in 2000.

    - A.P.
  • by mec ( 14700 ) <mec@shout.net> on Thursday December 04, 2003 @05:20PM (#7632944) Journal
    A "server" in 1981 would be something like a PDP-11 or Vax on the low end. Such machines were more expensive than desktop computers, and had larger physical address spaces. Even a modest PDP-11/70 had 22 address bits.

    Most people preferred to spend $2000 on a PC with a 20-bit address space rather than $10000 on a PDP-11 with a 22-bit address space.

    I think that 20 address bits were plenty for 1981. The real problem was that there was no upgrade path for about 10 years after that. The Intel 8086 was 20 bits, fine. The Intel 80186 was 20 bits, okay. The Intel 80286 had "protected mode" addressing to increase the address space, but it was nearly impossible for an operating system to context switch between "protected mode" and "real mode" (there was no instruction to do it, so an OS had to actually REBOOT THE PROCESSOR and then recover all its state on the fly).

    So until the 80386 came out, there was no way to get a new system with both (a) support for old programs and (b) support for more address space. And during that 10-year dry spell, that's when all those extended/expanded memory schemes came out, and that's when the 1 megabyte limit really hurt.

  • by jbn-o ( 555068 ) <mail@digitalcitizen.info> on Thursday December 04, 2003 @05:21PM (#7632951) Homepage

    [...] responsible for helping to create I also invented the microphone [...]

    Your copy of the Snopes article is not what they posted. Anyone who actually read what you posted would have noted this glaring discontinuity.

    I can appreciate the clarification on Gore's "inventing" the Internet. But I think Gore gets too high a mark here and I'd like to point out why I think so as a side note to a comment I read in Snopes' essay.

    Snopes cites Vint Cerf saying "that as a Senator and now as Vice President, Gore has made it a point to be as well-informed as possible on technology and issues that surround it" but by 1999 (the copyright date on the Cerf page Snopes cites), Clinton/Gore had brought us the 1996 Telecommunications Act (which was a big step toward the media deregulation many groups across a wide political spectrum rail against today), the Digital Millennium Copyright Act [wikipedia.org], and the 1998 Copyright Term Extension Act [wikipedia.org]. So I come away thinking that Al Gore's legislative history deserves a more mixed review than Cerf (and Snopes) describe.

  • Re:download (Score:2, Informative)

    by HTH NE1 ( 675604 ) on Thursday December 04, 2003 @05:25PM (#7632985)
    downloading means going from a "greater" machine to a "lesser" one, such as from my XServe to my Powerbook.

    That definition of course breaks down when the machines are equal, such as transfers between two XServes or two Powerbooks, as well as when the two machines are one and the same, such as downloading a web page hosted on the same machine that requested it.

    Even when FTP'ing to 127.0.0.1, get is still downloading and put is still uploading. As it would also be if you FTP'd from a mainframe to a laptop.
  • by thejuggler ( 610249 ) on Thursday December 04, 2003 @05:34PM (#7633065) Homepage Journal
    Actually the first computer mouse [cedmagic.com] was invented by Douglas Engelbart of Stanford Research Institute long before the people at Xerox PARC made the office of the future that featured a computer with a mouse on each desk. See pictures [stanford.edu] of the first mouse.

    This does not mean that someone at HP never said people wouldn't want to have a mouse.
  • Clicky (Score:4, Informative)

    by Kjella ( 173770 ) on Thursday December 04, 2003 @05:34PM (#7633069) Homepage
    Ultimate Limits of Computers [arstechnica.com] (Yeah, yeah, I know copy, paste, remove slashdot-inserted space, it works too, BUT...)
  • Re:640K--not true (Score:3, Informative)

    by samf ( 18149 ) on Thursday December 04, 2003 @05:35PM (#7633074) Homepage
    sub factors($prime) {
        return ($prime, 1);
    }

    1 isn't a prime [utah.edu]. :-)

  • Re:640K--not true (Score:3, Informative)

    by Fjord ( 99230 ) on Thursday December 04, 2003 @05:56PM (#7633286) Homepage Journal
    true, but it is a factor of a prime.
  • by Anonymous Coward on Thursday December 04, 2003 @06:23PM (#7633550)
    So you think ARPANET was the internet? ARPANET was not open to the public at all until 1983, when 45 nodes were split off from the military core. The rest became MILNET. In 1987 Al Gore commissioned Gordon Bell's report to the Office of Science and Technology calling for the integration of ARPANET, BITNET and NSFNET into a national research and education network. Gore also wrote legislation to greatly expand NSFNET and radically increase its bandwidth (from 56k to 1.544Mbps) so that it could serve as the core of this new integrated network.
  • Re:My favorite lie (Score:5, Informative)

    by AntiOrganic ( 650691 ) on Thursday December 04, 2003 @06:25PM (#7633566) Homepage
    Last year I'd have agreed with you. I had tried out various systems, with KDE 2.2 and Gnome 1.4, in addition to Fluxbox, IceWM and a handful of other window managers. It certainly wasn't pretty, and usability had a long way to go.

    My, how things have changed.

    There are so many applications that do everything I needed to do on Windows, now. So you can't live without Kazaa? Download Apollon, the KDE FastTrack client. Need word processing? AbiWord/KWord are excellent pieces of work. Outlook got you down? Ximian Evolution has everything you need. Instant messaging? Gaim/Kopete. Music playing? XMMS/JuK will replace Winamp/Foobar quite handily. Graphics? The Gimp. Video/DVD playback? Xine tackles everything I throw at it. Development? KDevelop/xemacs. Web work? Quanta Plus/Bluefish. CD recording? K3b is every bit as good as Nero and is free. Web browsing? Konqueror/Mozilla/Firebird/Galeon/Epiphany. Usenet? Pan kills every similar offering on Windows.

    Additionally, KDE supplies me with various features that Windows can't match. I want to save an image from a website directly to my webspace, via either FTP or WebDAV? Right-click, "Save As," click "FTP" and Save.

    In addition, I paid $0 for all of the software on my computer, have ready access to the source code if I'd like to add a feature, and am not raped by vendor lock-in. I also am not subject to the ~30 holes in Internet Explorer this year, or worms like Blaster, Slammer or Welchia.

    There are really only a handful of things Linux isn't better at right now, and those are very, very steadily improving. The first and most obvious would be gaming: even though older games like Starcraft and Diablo 2 run very well under Wine, and newer games like Unreal Tournament 2003 are being released natively for Linux, there's still nowhere near the selection. I concede that; it's all about choosing the right tool for the job. The second is video editing, which really isn't very good on the PC either, with the sole exception of Adobe Premiere. I don't touch either of these things often, so it's not a tremendous deal for me.

    I wouldn't say it's good enough for Joe User right now, though. Package management and software installation still need to be simplified for the average user (.deb should be the de facto standard, IMHO). Installation could be less painful if you don't know what you're doing. GTK+ needs a better file selector, admittedly, though I hold the opinion that GTK+ is generally worse than Qt to begin with, so I don't have trouble finding Qt-based replacements.

    My older brother, who has barely touched a computer in his life, can work at my KDE setup with ease. I consider this a small victory.
  • by Avihson ( 689950 ) on Thursday December 04, 2003 @06:41PM (#7633725)
    There was a link way up there; if you had read it before posting, it would have led you to an interview with Gates. Here is his response on hardware, taken from Interview with Bill Gates [si.edu]

    BG: Microsoft was playing a much broader role [laughs] than just doing software for this machine. I mean whether it is the keyboard, the character set, the graphics adapter, or even the memory layouts. I laid out memory so the bottom 640K was general purpose RAM and the upper 384 I reserved for video and ROM, and things like that. That is why they talk about the 640K limit. It is actually a limit, not of the software, in any way, shape, or form, it is the limit of the microprocessor. That thing generates addresses, 20-bits addresses, that only can address a megabyte of memory. And, therefore, all the applications are tied to that limit. It was ten times what we had before. But to my surprise, we ran out of that address base for applications within -- oh five or six years people were complaining.
    (emphasis added for clarity)

    Occasionally, I do RTFA in advance of posting!
  • by Renegrade ( 698801 ) on Thursday December 04, 2003 @06:50PM (#7633792)

    I believe the Motorola 68000 CPU came out in like 1979, and it featured 24 address lines and 32-bit address registers, so it was ready for 16MB of RAM out of the box, and was transparently extendable to 4GB with the addition of 8 more address lines on the 68020.

    The only downside was that, at least in the Amiga community, some programmers who fancied themselves clever used the upper, unused 8 bits of the address registers for flags, and thus their programs died horribly on 68020s, which could actually physically connect to the full 32-bit address range.
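
    Here's a toy illustration of that tag-bits trick and why it broke (the mask is what the 68000's 24 address lines effectively applied for free):

        ADDR_MASK = 0x00FFFFFF   # only 24 address lines on a 68000

        def tag_pointer(addr, flags):
            # stash flags in the "unused" top byte of a 32-bit address register
            return (addr & ADDR_MASK) | (flags << 24)

        p = tag_pointer(0x123456, 0x5A)
        print(hex(p & ADDR_MASK))   # 0x123456 -- what the 68000's bus saw
        print(hex(p))               # 0x5a123456 -- what a 68020 drives onto its
                                    # full 32-bit bus: a wild pointer, then a crash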

    The 68K was a fine chip, with a linear address space, 8 general purpose data registers, and 7 general purpose address registers (plus one special purpose address register). It's such a shame we ended up with that kludgy Intel beast. Sort of funny to watch a P4 or Athlon XP chip run MS-DOS 5.0 with no emulation, though. ;p

  • by EinarH ( 583836 ) on Thursday December 04, 2003 @06:50PM (#7633795) Journal
    The so-called exceptional growth rate of Internet adoption compared to that of radio and television:

    "It took 38 years for radio to attract 50 million listeners. 13 years for television to attract 50 million viewers. In just 4 years the Internet has attracted 50 million surfers! Those figures can hardly be balked at, especially when you consider the Internet's beginnings. "

    Well, it turns out that this dot-com myth is somewhat wrong, and the growth is not that much stronger than radio's and TV's. [ifi.uio.no]

    Very interesting stuff, bumped into it on Usenet.

  • by Antisthenes ( 579582 ) <frogisgod.13.abs ... .spamgourmet.com> on Thursday December 04, 2003 @06:53PM (#7633816)
    Actually, that's Clarke's First Law [wikipedia.org].

    (See also here [lsi.usp.br] if you want to know what Clarke meant by 'elderly'.)

  • by JoeBuck ( 7947 ) on Thursday December 04, 2003 @07:30PM (#7634248) Homepage

    Moore's Law can't continue to hold out, period. That's because Moore's Law refers to silicon transistors, and you can't make a transistor with a feature that is less than one silicon atom thick.

    Intel and IBM both have demonstrated 65 nm experimental processes, though for volume production, 130 nm is the current state of the art. There are only eight more doublings left until the line width is less than the diameter of an atom (the diameter of a silicon atom is about a third of a nanometer). One doubling every two years means it's all over in 16.

    Now, we could possibly continue to increase circuit density for a long time after that by going to 3-D, but we would no longer be in the domain of Moore's Law: we'd be adding more transistors but they wouldn't be getting any smaller.

  • Re:I Invented... (Score:2, Informative)

    by jon3k ( 691256 ) on Thursday December 04, 2003 @07:40PM (#7634343)
    Gore Invented the Internet [sethf.com]

    "But it will emerge from my dialogue with the American people. I've traveled to every part of this country during the last six years. During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system. "
  • by Grimxn ( 629695 ) on Thursday December 04, 2003 @08:10PM (#7634550)
    Duh! The Church-Turing thesis proves "computational equivalence" of algorithmic machines of a certain standard. To validate your incorrect claim, you have to first prove that we are error-free, insight-free algorithmic machines. Error-free means no inferential errors (some humans make them), and insight-free means no trans-rational mechanisms are involved in the inference: like drugs, madness, religion, fear, inspiration, individual experience (many humans' logics are trans-rational, maybe mine).

    We are no more computers than we were when Descartes conjectured that a thinking machine could be constructed by hydraulics, so impressed were the folk of the day with the achievements of that technology!
  • Re:640K--not true (Score:3, Informative)

    by Thuktun ( 221615 ) on Thursday December 04, 2003 @09:12PM (#7634999) Journal
    The site you quoted says, "The reason why 1 is said not to be a prime number is merely convenience." To paraphrase, 1 is omitted from the set of primes so you can say 6, for instance, is the product of two primes rather than three. If 1 were prime, you would have to add one prime to every count of prime factors.

    However, 1 might be called a trivial prime, since it indeed only has factors of 1 and itself, also 1.
  • by weston ( 16146 ) <westonsd@@@canncentral...org> on Thursday December 04, 2003 @09:52PM (#7635244) Homepage
    the worst assumption many of us are making is that humans are not themselves computers. ...it's not just an assumption. There's some very lively argument over it. Penrose [dhushara.com] tends to the belief that there are some non-computational processes in the universe and that they may underlie consciousness.

    I'll point out here that I know that some of his arguments aren't watertight and the discussion is definitely in progress -- he knows this, as is evidenced by quotes like this from the article: "With apparently genuine humility, Penrose emphasizes that these ideas should not be called theories yet: he prefers the word 'suggestions.'" But they're as well supported as any other speculations about the nature of consciousness.
  • by Zeinfeld ( 263942 ) on Thursday December 04, 2003 @10:17PM (#7635376) Homepage
    Each halving of line width is a quadrupling of density, so your limit is pushed out to 32 years.

    The original estimate was off since the minimum feature size is not the same as the smallest dimension in the transistor gate. The limit is actually set by the gate width, which can't go below about 5 nm without the probability of quantum tunnelling rising above an acceptable limit (some leakage is OK, but there is a point where it is not possible to distinguish the on state from the off).

    We have two more halvings of the minimum feature size before production silicon reaches state of the art and two more halvings after that before the show is all over folks.

    The doubling density each year rule corresponds to a halving of minimum feature size each 2 years. Intel say this is extending to 3 years (Bush recession, war on terror etc.) and the show is over completely by about 2015 at the latest.
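
    A quick back-of-the-envelope on those doublings, using the rough numbers from this thread (a sketch; the constants are approximate):

        import math

        line_width = 130e-9   # current production line width, per the grandparent
        atom = 0.33e-9        # rough diameter of a silicon atom
        halvings = math.log2(line_width / atom)
        print(round(halvings, 1))    # ~8.6 halvings of line width left

        # each halving shrinks both axes, quadrupling density -- two density
        # doublings per halving, so at one doubling every 2 years:
        print(round(halvings * 4))   # ~34 years, not 16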

    There will be cleanup work for some time but if you think about what you end up with at the end point it is quite interesting. You can fit 64 times the current number of transistors onto a chip. So you can comfortably fit your current state of the art Quad-Pentium processor with 4Gb of RAM all onto the same chip. So cut away the external memory bus completely and in its place add a couple of laser diodes on each edge and some receivers and you have a processing unit that communicates to its peripherals and any neighbors by means of optical couplings. It does not need special cooling either because only a small part of the chip area is CPU, the rest is memory.

    Instead of adding memory to this type of system you add more processor/memory units. You could easily fit four, even sixteen, into a commodity PC box. The optical couplings mean that memory paging etc. can be handled by making a request to a neighboring processor that stores that information.

    Big supercomputers are built by simply lashing a few hundred standard boards together.

    Back to the future: I was building these systems 20 years ago. Only then we called them transputers, and they only had 4Kb of RAM.

  • Re:My Mac Friend... (Score:1, Informative)

    by Anonymous Coward on Thursday December 04, 2003 @10:48PM (#7635556)
    I have a Mac friend who says his G5 is "faster than the Internet" because every time he opens his browser he gets a "page not found" message and has to hit the refresh button. I keep telling him that it's just a bug and that his computer isn't faster than his broadband connection. But he doesn't believe me.

    Whatever the reason for his 404, comparing processor speed with network bandwidth is nonsense. Bandwidth-wise, however, the G5 can handle 6.4GB/s, and the HyperTransport bus it runs on has 12.8GB/s of bandwidth, so his G5 does handle more data per second than his broadband connection.

    Anyway, even then you can't compare the speed of your processor with your network bandwidth, since they simply are different things and do different stuff.

    And the gag about a dimwit saying his computer is faster than the internet was told to me by a guy about 6 years ago, and it was about his Win95 friend -- maybe he switched recently!
  • Re:PacMan (Score:2, Informative)

    by Nonesuch ( 90847 ) on Friday December 05, 2003 @12:37AM (#7636070) Homepage Journal
    Computer games don't affect kids; I mean if Pac-Man affected us as kids, we'd all be running around in darkened rooms, munching magic pills and listening to repetitive electronic music.

    - Kristin Wilson, Nintendo, Inc., 1989.

    While Kristin claims to have originated this joke, so does Marcus Brigstocke [erowid.org], and others attribute the quote to Steven Poole.

    It appears that this "joke" is actually only about three years old: Google shows the sig file first appearing on Usenet on December 12, 2000, attributed to 'anon' or 'unknown'.
