It's funny.  Laugh. Technology

The Most Incorrect Assumptions In Computing? 1496

Miss Muis writes "After reading once again that Moore's Law will become obsolete, I amused myself thinking back to all the predictions, absolutes and impossibles in computing that have been surpassed with ease. In the late 80s I remember it being a well regarded popular 'fact' that 100MHz was the absolute limit for the speed of a CPU. Not too many years later I remember much discussion about hard drives for personal computers being physically unable to go much higher than 1GB. Let's not forget "I think there is a world market for maybe five computers" from the chairman of IBM in 1943, and of course 'Apple is dying...' (for the past 25 years). What are your favorite beliefs-turned-on-their-heads in the history of computing?"
This discussion has been archived. No new comments can be posted.

  • Cringely (Score:2, Interesting)

    by Wyatt Earp ( 1029 ) on Thursday December 04, 2003 @03:46PM (#7631744)
    Anything Robert Cringely says is going to happen won't, and anything he says will fail won't.

    He's my bellwether.
  • by andy@petdance.com ( 114827 ) <andy@petdance.com> on Thursday December 04, 2003 @03:48PM (#7631779) Homepage
    I wonder how many of the stories and anecdotes that will soon be posted are actually true, and how many are apocryphal.

    I've never seen a citation of the Bill Gates 640K quote, or the market for five computers quote, for example. They sound plausible, but so do lots of supposed "facts".

  • by stroustrup ( 712004 ) on Thursday December 04, 2003 @03:48PM (#7631780) Journal
    the worst assumption many of us are making is that humans are not themselves computers.
    About Kurzweil [kurzweilai.net]
  • by Liselle ( 684663 ) * <slashdot@NoSPAm.liselle.net> on Thursday December 04, 2003 @03:49PM (#7631815) Journal
    "I believe OS/2 is destined to be the most important operating system, and possibly program, of all time. As the successor to DOS, which has over 10,000,000 systems in use, it creates incredible opportunities for everyone involved with PCs."

    -- Bill Gates, from "OS/2 Programmer's Guide" (foreword by Bill Gates)
  • 9600 baud (Score:4, Interesting)

    by crmartin ( 98227 ) on Thursday December 04, 2003 @03:50PM (#7631818)
    ... is the limit for a voice grade phone line.
  • by DenOfEarth ( 162699 ) on Thursday December 04, 2003 @03:50PM (#7631823) Homepage

    I remember working at a research firm for an internship, and the head of our department said over lunch one day that he actually spent more time dealing with problems he was having with his computer than doing any useful work. I've noticed this with myself too, and even though I enjoy figuring out what's going on with my computer, I imagine many people don't. Email and web surfing always suck away my working hours, what with a PC right here on my desk; and when I get asked to help other people with their machines every once in a while, it wastes both our time.

    Makes me think though... wasn't it always implied that computers would save people's time? Has that assumption ever proved to be true? I'm not so sure it has, although maybe that's because we aren't using the things the right way. Perhaps we are waiting for a computer-savvy workforce and then this might be true... but then again, who knows...

  • my personal fav (Score:1, Interesting)

    by Anonymous Coward on Thursday December 04, 2003 @03:51PM (#7631847)
    that the .com explosion would revolutionize every industry overnight, and make us all rich.

    now, almost 7 years later, i'm unemployed, taking more classes, and trying to compete w/ developers in India making 1/5th of what i was making employed.

    www == widespread wealth wipeout
  • by Alan Livingston ( 209463 ) on Thursday December 04, 2003 @03:52PM (#7631858)

    I remember telling my father once, after he had bought a 40 MB hard drive, that it should last him forever. Nothing could ever fill up more than this. Of course this was well before the days of .mp3 and .mpg.

    When I was a kid, I remember watching the Jetsons, and when George came home from work he complained that he had just finished a hard day at work pushing buttons. I remarked to my father that no one could ever get a job where all they did was push buttons all day. Now, except for the one knob on the 'scope under my desk, all my interfaces to the outside world ARE buttons.

    I guess I'm full of underestimations...

  • One year from now... (Score:5, Interesting)

    by zeux ( 129034 ) * on Thursday December 04, 2003 @03:53PM (#7631873)
    ... we won't need floppy disks anymore.

    I've been hearing this statement continuously for ten years. Last time I broke the MBR on a server without a CD drive, I had no choice but to boot from a floppy.
  • Apple is dying... (Score:2, Interesting)

    by BWJones ( 18351 ) on Thursday December 04, 2003 @03:53PM (#7631878) Homepage Journal
    Apple is dying... has got to be my favorite for a number of reasons, most significantly that Apple has been the company the rest of the industry has depended upon. Apple has been the personal computer industry's R&D lab ever since the Apple I. Just think about all of the firsts in Apple computers: first to build in color support, first to build in CD-ROM drives, first to include built-in networking presaging the Internet, first to include a GUI, first to create the modern laptop format with palmrests up front, first to include a built-in pointing device in laptops, first to etc....etc.....etc..... You get the point.

  • by homerjs42 ( 568844 ) * on Thursday December 04, 2003 @03:53PM (#7631887)
    bogus_prediction ::= (some_new_spiffy_language_that_actually_sucks) is the future of (computing|operating_systems|networking)+
    --dw
  • by Thud457 ( 234763 ) on Thursday December 04, 2003 @03:53PM (#7631893) Homepage Journal
    "This data processing thing is a fad" -- Gene Kelly
  • LISP is dead (Score:1, Interesting)

    by Anonymous Coward on Thursday December 04, 2003 @03:54PM (#7631919)
    from http://www.paulgraham.com/quotes.html

    "Lisp doesn't look any deader than usual to me."

    - David Thornley, reply to a question older than most languages
  • by sane? ( 179855 ) on Thursday December 04, 2003 @03:56PM (#7631957)
    "We'll have working speech recognition by 1990"

    "You won't have to work, machines will do everything for you."

    Flying Cars !

    Isn't it interesting that only the failed predictions are the ones people remember, no matter whether they were exceeded or undershot.

    It's almost as if, if you want to be quoted and remembered, you need to make high-sounding but wrong predictions. The more smug the eventual reader, the more notice they take.

    "Microsoft will perfect intelligent software in their next release"

    "SCO will own all Linux IP"

    "The future belongs to Internet companies"

    "Genetic engineering is no more than a passing fad, forgotten by history"

    "President Bush will be recognised by history as a fine president"

    History, here I come.
  • Great Heinlein-ism (Score:5, Interesting)

    by Fished ( 574624 ) * <amphigory@gmail . c om> on Thursday December 04, 2003 @03:57PM (#7631973)
    This reminds me of one of the aphorisms in Heinlein's Time Enough for Love:
    "If an elderly respected expert in a given field tells you that something can be done he is almost certainly right. If an elderly respected expert in a given field tells you that something is impossible, he is almost certainly wrong."
    Just think it, believe it, dream about it and it's real man.
  • I'm still waiting... (Score:2, Interesting)

    by caffein8ted ( 595864 ) on Thursday December 04, 2003 @03:59PM (#7632018)
    for a computer that can explain office politics to me.

    "In from three to eight years we will have a machine with the general intelligence of an average human being. I mean a machine that will be able to read Shakespeare, grease a car, play office politics, tell a joke, have a fight. At that point the machine will begin to educate itself with fantastic speed. In a few months it will be at genius level and a few months after that its powers will be incalculable." -- Marvin Minsky, LIFE Magazine, November 20, 1970
  • my favorites (Score:5, Interesting)

    by plopez ( 54068 ) on Thursday December 04, 2003 @04:01PM (#7632049) Journal
    The mainframe is dead

    "I don't understand why people would need more than 4gb..." (Bill Gates in an interview on 64 bit ccomputing, in which he said he didn't understand peoples' interest in it)

    XML will replace relational databases

    OOP will lead to more robust, easier to maintain and higher quality software

    By making COBOL resemble English, anyone can program.

  • Re:Video hardware... (Score:2, Interesting)

    by wed128 ( 722152 ) on Thursday December 04, 2003 @04:01PM (#7632055)
    or the assumption that software should come out slower than the hardware it can run on...
  • Machrone's Law (Score:3, Interesting)

    by anon*127.0.0.1 ( 637224 ) <slashdot@baudkaM ... om minus painter> on Thursday December 04, 2003 @04:01PM (#7632056) Journal
    As in Bill, editor of PC Magazine.

    "The computer you want to buy will always cost $5000"

    Now you could get 10 PCs for that.

  • Dot-Com bubble (Score:3, Interesting)

    by SexyKellyOsbourne ( 606860 ) on Thursday December 04, 2003 @04:02PM (#7632065) Journal

    Does anyone remember the whole Dot-Com Bubble [fuckedcompany.com]?

    Billions in venture capital were sent to Silicon Valley back in the late 90s in the hope that anything and everything internet-related could be profitable and was worth investing in, in the same style as brick-and-mortar companies. We heard all kinds of great things from leading economists who were really misleading us to manipulate the market, short the stock, and fuck everyone else over. Then, in 1999, after the Microsoft ruling, the whole thing kind of collapsed.

    As for today, just a few of the giants of e-commerce still stand... so many companies went out of business on predictions not far off from the idea that we'd have groceries delivered to us over the internet (WebVan [wired.com]) or that we could actually stream TV-quality video over 28.8 kbps (Pixelon [wired.com]). It's never going to happen again, so the golden age of marketing ideas on the internet and obtaining massive capital influx is over.

  • Re:Home Computer (Score:5, Interesting)

    by arth1 ( 260657 ) on Thursday December 04, 2003 @04:04PM (#7632090) Homepage Journal
    I find the sales arguments for the first hobby computers the worst miss of all. Or is it just me who isn't using the computer to keep track of all my recipes?

    The second worst would be that we would be in a paperless society. Uhm, yeah, unfortunately some schmuck invented wysimolwyg PRINTERs too.

    Other than that, I see new predictions fail all the time, and even being reinvented. Who else remembers the "Gorilla Arm Syndrome" of the 80's with touch screens? They were predicted to take over, but that didn't happen. And it ain't happening now either, with the flatbed computers -- touch screens just aren't ergonomic enough for any prolonged use, as most people can't keep their hands in the air for any length of time.
    Same with gyroscopic mice -- they're going the way of the Dodo, despite happy predictions.

    Regards,
    --
    *Art
  • Good Times (Score:5, Interesting)

    by gmuslera ( 3436 ) * on Thursday December 04, 2003 @04:05PM (#7632100) Homepage Journal
    Back when it came out, one of the favorite arguments that it wasn't really a computer virus was "there is no way to automatically become infected with a computer virus simply by reading a mail".

    Luckily Microsoft proved that assumption was false.

  • Re:Ken Olson of DEC (Score:2, Interesting)

    by Anonymous Coward on Thursday December 04, 2003 @04:12PM (#7632179)
    And another Ken Olsen classic:

    "Unix is snake oil"

  • Erroneous Beliefs (Score:2, Interesting)

    by Anonymous Coward on Thursday December 04, 2003 @04:16PM (#7632230)
    While this is slightly off-topic in spirit, it isn't by the letter of the topic. Really, the brunt of the joke is directed at "us" rather than at "famous people" or "evil corporations".

    I remember a few silly beliefs some folks had when I was/we were young. The most remarkable thing is that some of them were "verified" by "scientific experiments" by various people.

    1. That burning a copy of a CD resulted in a slightly degraded image of the data. A classmate thought he verified this by copying copies until they failed. He came up with the figure that it took seven iterated burns (on average) for the degradation to make the copy unreadable(!!!). I guess some people don't understand causation and/or the law of averages and/or hardware reliability. This was in the days of turning off the music and not touching the desk while the CDR was burning.

    2. That data can just be compressed again and again (.zip, .arj, whatever)... I remember that this was how I learned "the pigeonhole principle": there are 2^i strings you can represent with i bits, but only 2^i - 1 strings with fewer than i bits, so no lossless compressor can shrink every input (a short sketch after this comment demonstrates the consequence). This is possibly why I started following theoretical CS (although I hated maths back then) instead of programming/hacking. Keep in mind, also, that this "unlimited compression algorithm" was patented! This is the most blatant failure of the patent system I can think of: the claim is even MORE obviously impossible than those for perpetual motion machines!

    3. That compressed data was "more prone to read failures" than uncompressed data, by virtue of "the data being closer together on the disk". Although this might sound more ridiculous than #2, it really isn't. I fell for this when I was very young, as it seemed to be empirically verified. Heh.

    It is kind of fun to reflect on how all of these fallacies are due to extending what is intuitive about the real world, into the world of information and digital representation. We'll see how many current silly beliefs of ours (P!=NP?, "{absolute security|quantum computation|...} is (im)possible", &c.) have elegant refutations which we will hopefully discover in my lifetime. Remember, no one understands the world of quanta and bits yet, and that the opposite of a profound truth may be another profound truth.
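    Point 2 above is easy to demonstrate empirically. Below is a minimal Python sketch (an illustration added here, not part of the original comment) that compresses a block of random bytes over and over with zlib; after the first pass the output never gets smaller, which is the pigeonhole principle at work.

        # Repeated lossless compression cannot keep shrinking data: there are
        # 2^i strings of i bits but fewer than 2^i strings shorter than i bits.
        import os
        import zlib

        data = os.urandom(100_000)            # random bytes are essentially incompressible
        for round_no in range(1, 6):
            data = zlib.compress(data, level=9)
            print(f"after compression round {round_no}: {len(data)} bytes")
        # Typical result: round 1 adds a small header overhead, and every later
        # round makes the data slightly larger, never smaller.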
  • by xpromache ( 542799 ) on Thursday December 04, 2003 @04:17PM (#7632243)
    Alan Turing, who in 1950 said that in 50 years we would be able to programme computers "to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning". 53 years later we are still incredibly far from this. See this [soton.ac.uk] for more details.
  • by HotNeedleOfInquiry ( 598897 ) on Thursday December 04, 2003 @04:19PM (#7632276)
    In the 80's, the Japanese industry/government complex declared a massive project to completely redefine and *own* operating systems and application software.

    Since at the time they had finished doing just that with the consumer electronics industry and were well on the way to doing just that to the automotive industry, most CS types were justifiably concerned.

    Well, the rest of the story is that it didn't happen. Not even a whimper of it got over to the western world.

  • by MisterFancypants ( 615129 ) on Thursday December 04, 2003 @04:20PM (#7632285)
    "I think there is a world market for maybe five computers." - Thomas Watson, Chairman, IBM

    "There is no reason anyone would want a computer in their home." - Ken Olsen, Founder, Digital Equipment Corporation

    I don't think Watson's quote really fits into these sorts of discussions because the entire nature of what a 'computer' is was entirely different when he said it.

    Olsen's quote, however, is simple lack of vision since he was addressing fairly modern era PCs directly.

  • Cray (Score:2, Interesting)

    by CrayHill ( 703411 ) on Thursday December 04, 2003 @04:21PM (#7632293)
    Circa 1980:
    Someday everyone will have a Cray on their desk...
    This of course has come to fruition, but the corollary:

    ...but the rest of the desk will be the cooling system!

    fortunately is not true!
  • Correction: (Score:4, Interesting)

    by Thud457 ( 234763 ) on Thursday December 04, 2003 @04:22PM (#7632316) Homepage Journal
    "This data processing stuff is all a fad" -- Spencer Tracey, in The Desk Set [imdb.com]

    I just know that was driving you all nuts.

  • Dvorak on the mouse (Score:5, Interesting)

    by rtm1 ( 560452 ) on Thursday December 04, 2003 @04:24PM (#7632341)
    "The Macintosh uses an experimental pointing device called a 'mouse.' There is no evidence that people want to use these things." (John C. Dvorak, SF Examiner, Feb. 1984.)

  • by befletch ( 42204 ) on Thursday December 04, 2003 @04:24PM (#7632354)

    Ok, Gates claims he never said it. Great. I'd leave it at that, but I went to a talk he gave at the University of Waterloo in 1989, and he did meekly accept responsibility for that quote. We all politely chuckled, and the talk went on.

    I could easily be mistaken, as that was quite a while ago, but I distinctly remember it as a mea culpa.

  • Re:The death of x (Score:3, Interesting)

    by Hoi Polloi ( 522990 ) on Thursday December 04, 2003 @04:26PM (#7632379) Journal
    Don't forget that WWI was the "War to End All Wars" and the nuclear bomb was going to make war so terrible that it would disappear. Didn't exactly work out. I think it is safe to say the "War on Drugs" and the "War on Terrorism" will follow the same path.

    I'll say this though: 8-tracks, betamax and vinyl records appear to be quite dead (said in my best Munchkin voice).
  • by T-Ranger ( 10520 ) <jeffw@NoSPAm.chebucto.ns.ca> on Thursday December 04, 2003 @04:36PM (#7632512) Homepage
    I think in comparing these two quotes, its important to see where their respective companies are now.

    IBM continues to be one of the leading computer companies (if not the leader), has been around as a business for more than a century, and has always been profitable. They have clearly recovered from a momentary lapse in judgement which, in historical context, can be forgiven.

    DEC, on the other hand... Well, Olsen was a dumbass, plain and simple. He is also quoted as saying "Unix is snake oil". What is amazing is not that DEC got swallowed up by Compaq, a company whose core business is putting computers in people's homes, but that they managed to survive as long as they did with morons like Olsen at the helm.

  • by cpct0 ( 558171 ) <slashdot.micheldonais@com> on Thursday December 04, 2003 @04:43PM (#7632617) Homepage Journal
    People will never copy full CDs over the Internet; they are way too big and would take days over conventional modems (read: 14.4K).

    People will never copy full DVDs over the Internet; they are way too big and would take days over conventional broadband (read: 128K ISDN).

    -- that is for bandwidth.

    People will never be able to copy CDs, they are unreadable on computers except in audio D-A conversion.

    People will never be able to copy DVDs, they are encrypted with CSS.

    -- that is for format.

    People will never be able to copy GameCube games, they are on their own proprietary format discs.

    People will never be able to copy PSX/2 games, they have heavy protection.

    People will never be able to crack the XBox protection.

    -- This is for the consoles

    And my #1:

    This format is the next revolution! Jump on the bandwagon now!

    Mike
  • by mec ( 14700 ) <mec@shout.net> on Thursday December 04, 2003 @04:43PM (#7632626) Journal
    MS-DOS does not have a 640K memory limit.

    I've used a computer that had 900K of memory and ran MS-DOS just fine. All of it was conventional memory. No tricks.

    The 640K limit comes from the following architectural limitations:

    (1) Intel 8086 physical addresses are 20 bits long.
    (2) IBM partitioned the 1 megabyte address space into 640K of memory space, 384K of device space.

    Other manufacturers made MS-DOS computers that were not PC register compatible. Some of them did allocate more of the 1024K address space to memory. MS-DOS works just fine up to the physical addressing limit of the 8086.

    Back around 1981, I read a Byte article about the new IBM PC which said that it had a gigantic memory space. And they were right! Filling up that 640K would cost about $5000 at the price of memory back then. I think it's reasonable for a personal computer to have enough address space to handle $5000 worth of memory (especially when $5000 in 1981 dollars is worth quite a bit more than $5000 in 2003 dollars).

    Are you using a 64-bit desktop yet? Because if you're not, your 2003 desktop computer can't handle $5000 of memory!
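    The arithmetic behind the two architectural limits in this comment is easy to check. Here is a small illustrative Python sketch (assuming the standard real-mode segment:offset scheme of the 8086 that the poster describes; it is not anything from the original post).

        # Real-mode physical address = segment * 16 + offset, truncated to 20 bits,
        # giving a 1 MiB address space. IBM's PC reserved the top 384 KiB of that
        # space for devices, leaving 640 KiB of conventional memory.
        def phys(segment: int, offset: int) -> int:
            return ((segment << 4) + offset) & 0xFFFFF   # 20-bit wrap on the 8086

        print(hex(phys(0x0000, 0x0000)))   # 0x0     -> bottom of RAM
        print(hex(phys(0xA000, 0x0000)))   # 0xa0000 -> 640 KiB, start of device space
        print(hex(phys(0xFFFF, 0x000F)))   # 0xfffff -> top of the 1 MiB space
        print((1 << 20) // 1024, "KiB total;", 0xA0000 // 1024, "KiB conventional")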

  • by Sophrosyne ( 630428 ) on Thursday December 04, 2003 @04:44PM (#7632637) Homepage
    Macs are just more stable and therefore better -- if anyone has used Panther they know that is not the whole truth. Upgrading to Panther has caused pain for many users, forcing them to either do a clean install or an archive-and-install (both hassles for some), or rendering some Macs totally useless.
    Another example would be the quote "Apple's -- they just work". How many minor updates have hosed people's systems since OS X came out? What about Keynote 1.0 causing kernel panics, and all the former issues with iTunes, Safari, and all the other apps out there?
    So yes, I do use a Mac, and I do like it -- but it's time everyone knew the truth: Macs have many problems too, especially if you use new software... the bugs get worked out eventually, but usually at the user's expense.
  • by alen ( 225700 ) on Thursday December 04, 2003 @04:45PM (#7632654)
    Do you work in the government? When I did I couldn't believe the amount of paper that was used and piled on people's desks. When I went to private industry the first thing I noticed was the lack of personal laser printers and desks free of piles of paper. At my company 90% of work is done through a CRM/ERP application and we are going more and more paperless every month.

    The latest paperless project is electronic faxing through zetafax.
  • by Nerdimus_Maximus ( 690239 ) on Thursday December 04, 2003 @04:56PM (#7632806)
    Wow, um, I used to work as a contractor for M$ and I remember VERY CLEARLY sitting in the lobby of the Microsoft Hilltop Building in Las Colinas, Texas, where the receptionist had an informational plasma screen (one of the first I'd ever seen, in 1996) that was basically running a screen saver of funny Microsoft quotes. And I specifically remember the quote "640K ought to be enough for anybody. --Bill Gates" scrolling by. Considering I was *IN* a Microsoft building *AS* a Microsoft contractor and saw that quote scroll by on a *M$* plasma screen along with other phunnies, I tend to disagree with the people who are now posting that we're full of shit...
  • by TimboJones ( 192691 ) <<timbojones> <at> <timbojones.net>> on Thursday December 04, 2003 @04:57PM (#7632813) Homepage
    I don't believe I've ever visited an office that used an inkjet printer. Or at least, more than one that's used infrequently when color jobs are desired and there's no color laser.
  • Leisure society (Score:5, Interesting)

    by Nameless Poltroon ( 141361 ) on Thursday December 04, 2003 @05:00PM (#7632847)
    "Computers will lead to a leisure society where people have much more free time for personal pursuits and family"

    - my grade 10 high school teacher, 19 years ago
  • ESR 'Absurdly Rich' (Score:2, Interesting)

    by JK Master-Slave ( 727990 ) on Thursday December 04, 2003 @05:03PM (#7632886)
    "A few hours ago, I learned that I am now (at least in theory) absurdly rich." [linuxtoday.com]

    I looked on ESR's vanity page, and NO he doesn't have this clinker listed as one of his essays.
  • by esap ( 2010 ) on Thursday December 04, 2003 @05:03PM (#7632897) Homepage
    1. You cannot apply a technological solution to a sociological problem (Edward's law) [everyone does this, consider the people who wrote MS Word]

    2. OO represents "real world"
    [When did the real world start using 'CommandContainerFacade.getEventProducerFactoryCreationCommand()'?]

    3. There is a magic product out there that solves all problems.
    [yeah sure, maybe in million years!]

    4. Methodology X is panacea. [see Usenet]

    Also see Anti-patterns catalog [c2.com] for other examples.
  • by milliyear ( 132102 ) on Thursday December 04, 2003 @05:07PM (#7632929)
    I heard this jewel come from the podium at the first Apple developer's conference in an auditorium in one of the suburbs of Chicago, around 1980:

    "Pascal is the language of choice for all future software development at Apple. If you want to write software for Apple computers, all of our development tools will support the Pascal language only. We both need one standard language to develop in and support, and we have chosen Pascal as the most popular and best language for development." (Or words to that effect)

    This was said by one of the technical suits at Apple at the time, whose name escapes me. The 'conference' was actually a 2-3 hour presentation on a Saturday afternoon. It was sparsely attended (maybe 200 people total), which only filled the auditorium to about 20% capacity. A personal highlight for me was running into Steve Jobs in the hallway and having a chance to shake his hand and chat with him briefly, which was no small feat considering he already had a squadron of bodyguards.

    Obviously, the 'Pascal' proclamation was dropped within months. But it was encouraging to hear them acknowledge and attempt to support the needs of third-party developers.

  • by Gil-galad55 ( 707960 ) on Thursday December 04, 2003 @05:31PM (#7633042)
    You might find Shadows of the Mind by Roger Penrose interesting. He basically uses Gödel's theorem to argue that consciousness is not Turing computable, hence not implementable on current hardware. And before anyone waves the quantum computing flag, Penrose also points out that the problem with understanding consciousness is that the relevant physical effects lie on the border of quantum and macroscopic measurements, one of the most poorly understood aspects of physics.
  • 5 years from now... (Score:4, Interesting)

    by logicassasin ( 318009 ) on Thursday December 04, 2003 @05:31PM (#7633046)
    "...but 5 years from now
    everyone will be running free GNU on their 200 MIPS, 64M SPARCstation-5."

    Andy Tanenbaum, Creator of Minix
    30 Jan 92 13:44:34 GMT

    Andy wrote this during the "Linux is Obsolete" debate between Linus Torvalds and himself back in '92.
  • by schiefaw ( 552727 ) on Thursday December 04, 2003 @05:32PM (#7633051)
    Remember the "Internet appliances are the future" hype? No local applications or storage, just a bunch of dumb terminals connected to a paid service.
  • by valdis ( 160799 ) on Thursday December 04, 2003 @05:49PM (#7633213)
    Keep in mind that when TJ Watson said it, his company was *already* engaged in the sale of semi-programmable card-sorting and tabulating gear, of which they were building a LOT.

    What he *meant* was "There's a market for 5 really high-end machines far and above the rest of the competition". The word "supercomputer" wouldn't be around for a few decades yet. And what do you know? Even today, there's a small handful of machines at the truly high end (currently, above 5 teraflops or so)
  • by Anonymous Coward on Thursday December 04, 2003 @05:58PM (#7633296)
    > Paraphrased: "The internet is just a passing fad?" Back in '95 or so?

    I forget which magazine reported it, but around those days Bill gave a speech at some trade show. A reporter asked him if Microsoft planned to create an Internet division. He said that was as silly as if a computer company were to create an "electricity division", and basically laughed off the idea. A few months later, what do you know, MS announced their new Internet division...
  • by rossz ( 67331 ) <ogre&geekbiker,net> on Thursday December 04, 2003 @05:58PM (#7633300) Journal
    And now it's true. The last two computers I ordered, I specifically said "No floppy drive". One system was for my wife at her work (her company paid for it) and the other was for me (my new server). We both have 128Meg USB drives. Mine is a Laks watch, hers is a conventional type that she calls "the gadget".

    She loves her USB drive. In the past, when she wanted to bring work home (which is very often) she would either put it on a zip disk (which are too damn slow and are not reliable) or burn a CD, which was reliable but took too long. Floppy was out because the file was too big (MS Access database). Now she just drags the file to the "removable drive" icon and she's done. It's USB 2.0 (the fast one -- er, is that fast or high speed?), so it copies damn fast.

    Oh, the system can be booted from USB or CD, so crash recovery is still possible.
  • by Kjella ( 173770 ) on Thursday December 04, 2003 @06:02PM (#7633339) Homepage
    Unless you want to believe in a self-aware intelligent computer (think Skynet in the Terminator movies) that has derived how to mimic human behavior (a more difficult task than simply *being* a human; it's not like we're conscious of everything we do), isn't that really the downfall of programming?

    I think a computer of today would have more than sufficient processing power and storage space (particularly if it can do live Internet searches as an "extended memory") to imitate a human - there's just no capable program.

    Think about how you eat an apple. No, I wasn't really thinking about the chewing process, you can express that. Express how your body knows how to decompose the apple into various nutrients, absorb those into the body, deliver them to where they're needed, the chemical processes used to transform them into energy for our bodies, and how the byproducts are returned to the waste system, probably filtered by the kidneys and whatnot. Maybe now you can, if you're a doctor of medicine, but otherwise not. And people live and eat apples just fine without knowing.

    On the other hand, if you wanted to design an artificial digestive system, you'd need to know all that. In short, you'd have to know a damn lot. In the same way, humanity is pretty much stuck when it comes to describing how a human mind works. It doesn't help you at all that you see the brain in function every day, no more than you see a man chew and swallow an apple. There's simply no way to build artificial intelligence until we understand human intelligence. And when it comes to that, we're still way off.

    Kjella
  • by Jennifer E. Elaan ( 463827 ) on Thursday December 04, 2003 @06:11PM (#7633435) Homepage
    Actually, in real mode, a Pentium 4 has this same 1024K limitation. Even the Opteron is not immune. Real mode sucks.
  • by bugnuts ( 94678 ) on Thursday December 04, 2003 @06:19PM (#7633505) Journal
    I believe it was the president of DEC at the time that asked "Who would ever want a computer on his desk?"

    Another bad assumption, which my coworker just mentioned, was "the knapsack crypto algorithm is secure." The knapsack algorithm was a public/private key crypto system that was very elegant in its design and speed, but was eventually broken (on an Apple ][, even).
  • The "640K" Quote (Score:3, Interesting)

    by DynaSoar ( 714234 ) on Thursday December 04, 2003 @06:34PM (#7633664) Journal
    As noted elsewhere, nobody, including Bill Gates, ever said anything about 640K being enough.

    The source of the quote was Steve Jobs, questioning Steve Wozniak's suggestion to build the "Language Card", the 16K memory card that took the Apple II/II Plus from 48K to 64K.

    Jobs' actual words were, "Why would anyone ever need more than 48K?" Not 64K, as assumed by the first misquoters, based on the maximum direct addressability of 8 bit processors, and not 640K as assumed by those who decided to misattribute the quote altogether.

    Jobs was always questioning Woz's technically oriented decisions, and frequently made the opposite decision when he had the power to do so. For example, he argued that there was no reason to build color into the Apple II. Woz did it anyway. When Jobs got the chance to make a similar decision, he went against Woz's reasoning, and even against the advice of others under him. Hence, the original Macs, and several versions after, were strictly monochrome.

    I'd like to think Jobs learned his lesson after ignoring someone's advice not to hire "some soda pop selling suit" and losing control of his company for 10 years. But I could be wrong.

    Anyway, that's what I recall from my old "SoftTalk" and "The Road Apple" days.
  • You're not kidding (Score:5, Interesting)

    by joggle ( 594025 ) on Thursday December 04, 2003 @06:58PM (#7633889) Homepage Journal
    I remember reading that it now takes NASA substantially more man-hours to do the same tasks than before computers were used for design/CAD work. If I remember correctly, it took engineers roughly half as much time to design a rocket like the Saturn V as it would today using CAD (computer-aided design)! Also, much more paper is used now than back then, when all of the drafting was hand-drawn, with typewriters used for everything else. I think they also tended to make fewer mistakes because they were more closely involved in the numbers, not using a potentially buggy black box to help them out.
  • I can't believe it (Score:3, Interesting)

    by Flunitrazepam ( 664690 ) on Thursday December 04, 2003 @07:02PM (#7633948) Journal
    This many posts and no one has dropped the J word.

    When I was graduating high school, it seemed the conventional wisdom was "In the future, everything will run on Java anyway."

    This was just about the time I was getting into computers heavily, and I remember you couldn't buy a computer mag without having JAVA somewhere on the cover.
  • Re:Computer games... (Score:3, Interesting)

    by prockcore ( 543967 ) on Thursday December 04, 2003 @07:06PM (#7633994)
    Urban Legend. No one named Kristian Wilson has ever worked for Nintendo.
  • Re:I Invented... (Score:2, Interesting)

    by magickalhack ( 648733 ) on Thursday December 04, 2003 @07:41PM (#7634348) Homepage
    Public figures have practically no protection from either libel or slander. It's part of the gig -- they open themselves up to whatever the public wants to throw at them. They do have some grounds, but it's a far cry from the protection granted to individuals.

    Basically, it's a stupid argument. Anyone who has taken the time and bothered to actually look up what he really said will realize, immediately, that what he said made sense in context, was true, and that all the hoopla in the media was propagandist rubbish. Not that there's anything wrong with that -- that's politics. What is irritating is all the morons who don't know what they're talking about and yet still insist on sharing their uninformed opinion with the rest of us as if it was worth anything.
  • here's three more (Score:1, Interesting)

    by Anonymous Coward on Thursday December 04, 2003 @07:42PM (#7634356)
    "There is no reason anyone would want a computer in their home."
    - Ken Olson, president, chairman and founder of Digital Equipment Corp., 1977.

    "DOS addresses only 1 Megabyte of RAM because we cannot imagine any applications needing more."
    - Microsoft on the development of DOS, 1980.

    "Windows NT addresses 2 Gigabytes of RAM which is more than any application will ever need".
    - Microsoft on the development of Windows NT, 1992.
  • by Anonymous Coward on Thursday December 04, 2003 @07:57PM (#7634442)
    But I suppose sticking kids in schools, separated from their family 5 times a week and put in a prison-like building with unknown people's offspring, somehow magically benefits them?
  • by Artifakt ( 700173 ) on Thursday December 04, 2003 @08:09PM (#7634541)
    Approximately true, but you can't make a transistor less than N atoms thick, where N is "thin enough to allow a significant probability of electrons tunneling". Depending on whether you want to allow a 5% error rate or 1%, or less, N is at a guess about 4 to as much as 16 nanometers (nm). The exact cut off is hard to fix, because it depends on just how much of the design you want to devote to error correction, but it's definitely there. Finding a way around it will take making small groups of atoms behave deterministically instead of according to Quantum Mechanics. That is unfortunately a hard problem. No one has a real clue as to how to solve it.
    What isn't yet clear is just what error correction itself means. Could a designer get slightly smaller scaling, but only by making the chip unable to run any existing programs? Could we turn quantum effects to our advantage with what is called quantum computing? Will Intel or IBM want to make a computer that needs a completely different approach to writing every last bit of software it can run?
    The answers to the first two questions are unknown. The third, however, is an obvious NO! Moore's law will stop, either because we can't make the switches any smaller, or because we stop using transistors.
  • Re:I Invented... (Score:5, Interesting)

    by FredFnord ( 635797 ) on Thursday December 04, 2003 @08:31PM (#7634705)
    You got a good ways. Now you just have to think.

    The question was 'What have you accomplished in congress?' or something similar. So now let's look at his response in that *CONTEXT*.

    Did Al Gore take the initiative IN CONGRESS in creating the internet? You bet he did! In fact, Newt Gingrich said that if there had been no Al Gore, there would be no internet as we know it today. (Of course, that was a few years ago. But still.) He was the prime mover behind getting funding for it. And without government funding, the internet would never have grown like it did, and may well still be some strange, esoteric thing that connects a few universities together... and AOL (or *shudder* MSN) could be the 'Information Superhighway'.

    So, you can still say that since he didn't explicitly SAY 'in Congress' in response to the question about what he did in congress, he was actually claiming to have invented the entire internet from scratch. But at that point, anyone with an ounce of intellectual honesty would have to admit that this was a 'lie' that was created entirely by the press and was perpetrated on an American public that is instantly ready to believe anything they hear, as long as it's bad.

    -fred
  • "Push Technology" (Score:2, Interesting)

    by rolofft ( 256054 ) <rolofftNO@SPAMyahoo.com> on Thursday December 04, 2003 @08:38PM (#7634750)
    Remember "push technology" [intranetjournal.com] circa 1999? "Active Channels" [microsoft.com] and "NetCaster" [netscape.com] were supposed to revolutionize the Internet. I hated the silly "channels" bar that popped up by default in Windows after IE 4 was installed. Yeah, Microsoft, instead of searching the Web for things I'm interested in, I want you to "push" your sponsors' lame content at me. Well, at least they caught on quickly and dropped it.

    For me this was another example of consumers ruling the marketplace with an iron fist. You can't get us to drive Edsels, drink New Coke, or subscribe to Active Channels, no matter how much money you have.
  • by solprovider ( 628033 ) on Thursday December 04, 2003 @09:20PM (#7635058) Homepage
    The grandparent's "But if you're downloading data from a site, the site is not also uploading that data to you. The action exists at only one end of the operation, at the initiator of the action" is correct.

    Continuing haystor's beer analogy [slashdot.org], the remote machine is called a server.

    Your machine requests something from a stationary location. That is a pull operation, and is called "downloading" (such as requesting a drink and being given a beer).

    Your machine sends something of yours to the stationary location. That is a push operation, and is called "uploading" (such as giving money to the bartender).

    The remote machine responds to each request. It is "serving" (such as the bartender taking requests and returning drinks, also known as serving).

    ---
    Another poster suggested that the definition has to do with the size of the machines, but this is obviously incorrect. If a 300lb man gets a beer from a midget bartender, the man is still doing the requesting and the bartender is still serving.

    Or think about P2P networks. The machines can be considered to be equivalent, but a computer with a 2GB hard drive and only 10 files still serves those files to the computer with a 200GB hard drive and millions of files. The latter computer is doing the requesting and "downloading".

    The confusion may be because your ISP is limiting your upstream or "upload" bandwidth, which is used for the transaction whether you are serving (also known as sharing) or uploading (also known as posting) the files, even though that bandwidth is also used for requesting. English is great; the last sentence had five words for the process where bits move from your computer to another.
  • by CarlDenny ( 415322 ) on Thursday December 04, 2003 @09:45PM (#7635204)
    Each halving of line width is a quadrupling of density, so your limit is pushed out to 32 years.

    As for going 3-D, even sticking to your strict definition of Moore's law, we'd still be fitting more transistors in the same chip area, so it'd count.

    3 years is a lot of time for stuff to happen, I suspect we'll get at least one more "free" doubling just from a leap in transistor design.

    More generally, though, if some other technology comes along and we start using carbon transistors, or optical switches, or some more esoteric technology that allows us to do twice as many calculations in the same amount of time on a certain sized sliver of what-have-you, that's still going to be called Moore's law if it follows the pattern continuously, regardless of how hard you hold onto the stricter definition. If we shift to a new technology, and computing continues to double, no one will be claiming that Moore's law is dead.

    If, not when. It could be that silicon transistors are the best we're going to do, and Moore's law (in either the strict or the lax form) only has another few decades in it. It certainly has to peter out sometime.
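    The scaling argument in this comment can be made concrete with a toy calculation. The following Python sketch uses made-up figures (the 90 nm starting width, 5 nm floor, and 2-year cadence are illustrative assumptions, not numbers from the thread): density goes as the inverse square of line width, so each halving of line width buys two density doublings.

        # Toy model: count how many line-width halvings fit before an assumed
        # physical floor, then convert them to density doublings.
        start_nm = 90.0                      # assumed starting feature size
        floor_nm = 5.0                       # assumed physical floor
        years_per_density_doubling = 2       # classic Moore's-law cadence (assumption)

        halvings = 0
        width = start_nm
        while width / 2 >= floor_nm:
            width /= 2
            halvings += 1

        density_doublings = 2 * halvings     # 1/(w/2)^2 = 4/w^2: two doublings per halving
        print(f"{halvings} halvings of line width -> {density_doublings} density doublings")
        print(f"~{density_doublings * years_per_density_doubling} more years at "
              f"{years_per_density_doubling} years per doubling")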
  • by cc_pirate ( 82470 ) on Thursday December 04, 2003 @11:09PM (#7635654)
    "Computers in the future may weigh no more than 1.5 tons." --Popular Mechanics, forecasting the relentless march of science, 1949

  • Re:the list (Score:3, Interesting)

    by bwt ( 68845 ) on Friday December 05, 2003 @12:00AM (#7635891)
    [wipes tears of laughter from eyes]

    [breathes]

    I'd like to see a list of some of the competitors before I proclaim it THE funniest post. Perhaps it would be a good topic for a poll.
  • Microsoft collection (Score:3, Interesting)

    by axxackall ( 579006 ) on Friday December 05, 2003 @03:13AM (#7636817) Homepage Journal
    I like this collection [quotesandsayings.com] of misleading quotes from Microsoft, from Bill Gates and about Microsoft.

    My favorite ones:

    • "The Internet? We are not interested in it" -- Bill Gates, 1993
    • "Sometimes we do get taken by surprise. For example, when the Internet came along, we had it as a fifth or sixth priority." -- Bill Gates, Jul, 1998
    • "We had planned to integrate a Web browser with our operating system as far back as 1993" -- Microsoft (27 Jul 1998, filing its first court responses to federal antitrust)
    • On code stability, from Focus Magazine: "Microsoft programs are generally bug-free. If you visit the Microsoft hotline, you'll literally have to wait weeks if not months until someone calls in with a bug in one of our programs. 99.99% of calls turn out to be user mistakes. [...] I know not a single less irrelevant reason for an update than bugfixes. The reasons for updates are to present more new features."
    • Bill Gates, Free Market and the LA Times: "There are people who don't like capitalism, and people who don't like PCs. But there's no-one who likes the PC who doesn't like Microsoft"
    • Your most unhappy customers are your greatest source of learning. --Bill Gates, Business @ The Speed of Thought
    • "640K ought to be enough for anybody." -- Bill Gates
    • I don't think there's anything unique about human intelligence. All the neurons in the brain that make up perceptions and emotions operate in a binary fashion. (Bill Gates)
    • "There won't be anything we won't say to people to try and convince them that our way is the way to go."
    • "We have no intention of shipping another bloated OS and shoving it down the throats of our users." -- Paul Maritz, Microsoft group vice president
    • On the solid code base of Win9X: "If you can't make it good, at least make it look good."
    • "Microsoft's biggest and most dangerous contribution to the software industry may be the degree to which it has lowered user expectations." -- Esther Schindler, OS/2 Magazine
  • by ComaVN ( 325750 ) on Friday December 05, 2003 @03:53AM (#7636944)
    You know wrong. Address 0 contains the vector for interrupt 0 (the divide-by-zero handler).

    From HelpPC 2.10, by David Jurgens (emphasis mine):
    - power supply starts Clock Generator (8284) with Power Good signal on BUS
    - CPU reset line is pulsed resetting CPU
    - DS, ES, and SS are cleared to zero
    - CS:IP are set to FFFF:0000 (address of ROM POST code)
    - jump to CS:IP (execute POST, Power On Self test)
    - interrupts are disabled
    - CPU flags are set, read/write/read test of CPU registers
    - checksum test of ROM BIOS
    - Initialize DMA (verify/init 8237 timer, begin DMA RAM refresh)
    - save reset flag then read/write test the first 32K of memory
    - Initialize the Programmable Interrupt Controller (8259) and set 8 major BIOS interrupt vectors (interrupts 10h-17h)
    - determine and set configuration information
    - initialize/test CRT controller & test video memory (unless 1234h found in reset word)
    - test 8259 Programmable Interrupt Controller
    - test Programmable Interrupt Timer (8253)
    - reset/enable keyboard, verify scan code (AAh), clear keyboard, check for stuck keys, setup interrupt vector lookup table
    - hardware interrupt vectors are set
    - test for expansion box, test additional RAM
    - read/write memory above 32K (unless 1234h found in reset word)
    - addresses C800:0 through F400:0 are scanned in 2Kb blocks in search of valid ROM. If found, a far call to byte 3 of the ROM is executed.
    - test ROM cassette BASIC (checksum test)
    - test for installed diskette drives & FDC recalibration & seek
    - test printer and RS-232 ports. store printer port addresses at 400h and RS-232 port addresses at 408h. store printer time-out values at 478h and Serial time-out values at 47Ch.
    - NMI interrupts are enabled
    - perform INT 19 (bootstrap loader), pass control to boot record or cassette BASIC if no bootable disk found
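    For readers checking the "address 0" claim at the top of this comment: in real mode the interrupt vector table starts at physical address 0, with 4 bytes (offset word, then segment word) per vector. Here is a tiny illustrative Python sketch (assuming those standard 8086 conventions; it is not taken from HelpPC itself):

        # INT n's vector lives at physical address n * 4, so INT 0 (divide error)
        # sits at address 0; the POST entry point FFFF:0000 is physical 0xFFFF0.
        def ivt_address(n: int) -> int:
            return n * 4

        print(hex(ivt_address(0x00)))          # 0x0  -> INT 0, divide-by-zero handler
        print(hex(ivt_address(0x19)))          # 0x64 -> INT 19h, the bootstrap loader
        print(hex((0xFFFF << 4) + 0x0000))     # 0xffff0 -> ROM POST entry at FFFF:0000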
  • by mikelambert70 ( 547007 ) on Friday December 05, 2003 @11:16AM (#7638626)
    Has anyone else noticed that hard disk capacities have not increased for the past year?

    300 GB is still tops, same as last xmas. A minuscule growth in laptop hard disks, 12 months ago 60 GB, now 80 GB.

    I don't recall stagnation like this happening *ever* before.

"Engineering without management is art." -- Jeff Johnson

Working...