The Most Incorrect Assumptions In Computing?

Miss Muis writes "After reading once again that Moore's Law will become obsolete, I amused myself thinking back to all the predictions, absolutes and impossibles in computing that have been surpassed with ease. In the late 80s I remember it being a well regarded popular 'fact' that 100MHz was the absolute limit for the speed of a CPU. Not too many years later I remember much discussion about hard drives for personal computers being physically unable to go much higher than 1GB. Let's not forget "I think there is a world market for maybe five computers" from the chairman of IBM in 1943, and of course 'Apple is dying...' (for the past 25 years). What are your favorite beliefs-turned-on-their-heads in the history of computing?"
  • by nanolith ( 58246 ) * on Thursday December 04, 2003 @03:45PM (#7631732)
    *BSD is Dying...

    Totally untrue. *BSD rules. :-P
  • This site... (Score:1, Insightful)

    by markjrubin ( 88076 ) on Thursday December 04, 2003 @03:47PM (#7631756) Homepage
    Slashdot: News for Nerds. Stuff that matters.

    hehe
  • Home Computer (Score:5, Insightful)

    by southpolesammy ( 150094 ) on Thursday December 04, 2003 @03:48PM (#7631792) Journal
    "There is no reason for any individual to have a computer in their home." -- Kenneth Olson, 1977, founder of Digital
  • Al Gore (Score:3, Insightful)

    by -Grover ( 105474 ) on Thursday December 04, 2003 @03:49PM (#7631795)
    Not technically "computing," but this is my all-time favorite thus far.

    "GORE: Well, I will be offering -- I'll be offering my vision when my campaign begins. And it will be comprehensive and sweeping. And I hope that it will be compelling enough to draw people toward it. I feel that it will be.

    But it will emerge from my dialogue with the American people. I've traveled to every part of this country during the last six years. During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system.

    Shamelessly pulled from here [sethf.com]
  • by peterprior ( 319967 ) on Thursday December 04, 2003 @03:49PM (#7631800)
    Nope, he never said that... sorry... popular myth.

    Here's a Wired.com article [wired.com] with some more details
  • Re:My favorite lie (Score:4, Insightful)

    by GoofyBoy ( 44399 ) on Thursday December 04, 2003 @03:52PM (#7631863) Journal
    Conversely:

    "Linux is good enough right now for the desktop."

    is being laughed at right now.
  • My favorite... (Score:5, Insightful)

    by Wakko Warner ( 324 ) * on Thursday December 04, 2003 @03:52PM (#7631870) Homepage Journal
    That idiot Bob Metcalfe loves trotting this one out every few years:

    THE INTERNET IS GROWING TOO FAST, AND WILL COLLAPSE UPON ITSELF PRESENTLY.

    I think he just wants everyone to know that he invented Ethernet, and needs to throw this story out there every couple years so people don't forget he actually did accomplish something at some point in time. Like 20 years ago.

    - A.P.
  • by Otter ( 3800 ) on Thursday December 04, 2003 @03:54PM (#7631909) Journal
    Ironically, what's being demonstrated here is that the most widely believed incorrect notion is that Bill Gates ever said "We'll never need more than 640K of RAM!".

    Back to the original topic, I'd point to the idea that sticking children in front of computers somehow magically benefits them.

  • by rcastro0 ( 241450 ) on Thursday December 04, 2003 @03:59PM (#7632017) Homepage
    Working as a consultant I am faced everyday with what I think is the biggest failed promise:
    That computers would bring about the "paperless office".

    Not only did they not, they made people consume more paper than ever before. On top of all the paper spent, the cost of printing pages increased, as the industry made us believe that inkjets were better and B&W laser printers passé.

    For more discussion see an article in Newsday about it [newsday.com]. There's even a full book dedicated to the question of why the paperless office never came to be [amazon.com].
  • Re:Al Gore (Score:2, Insightful)

    by Otter ( 3800 ) on Thursday December 04, 2003 @04:00PM (#7632043) Journal
    Like the 640K myth, there are multiple levels of "incorrect" related to that quote. On the one hand, you have people who believe "Al Gore claimed to have 'invented the Internet'." On the other hand, you have people who know that isn't true, but who don't realize that what he did say is also entirely false.
  • by kidlinux ( 2550 ) <<duke> <at> <spacebox.net>> on Thursday December 04, 2003 @04:01PM (#7632054) Homepage
    How about the need for computers in the classroom? That's total BS as far as I'm concerned.

    I've been into computers for quite some time, and am enrolled in Computer Science at university. It's been obvious to me for years that computers in the classroom are a waste of time, energy, and resources for everyone involved.

    I try to tell people this, and they wonder why I say that, given my experience with computers. No doubt it's because the people making the decisions have no clue.

    Most adults on /. likely went through school without computers in the classroom. Did our educations suffer as a result? No. As far as I'm concerned, I was better off in school without a computer.

    Of course, we did have computers at school. Good ol' ICONs, and IBM 8086s. We had typing class a couple times a week, and learned to use a word processor, which is about as far as it needs to go. Leave computers for their own courses in high school (Computer Science and maybe some kind of class for basics.)

    Is it not obvious that more harm is being done than good when it comes to computers in class? There are just so many things wrong with the whole idea. Perhaps one day when computers become more appliance-like, they'll be more beneficial in class, and will be put to use in such a fashion as to not create dependencies.

    What do you think?
  • Re:Al Gore (Score:5, Insightful)

    by grasshoppa ( 657393 ) * on Thursday December 04, 2003 @04:03PM (#7632077) Homepage
    While it might have seemed a bit of a boast, it is, technically, accurate.

    Links:
    http://www.firstmonday.dk/issues/issue5_10/wiggins/
    http://www.firstmonday.dk/issues/issue5_10/wiggins/#w4
  • SAP (Score:5, Insightful)

    by HexaDex ( 687539 ) on Thursday December 04, 2003 @04:06PM (#7632111)
    My fav is when our CFO asserted that when we migrated to SAP "we'd no longer need programmers". The sound you hear is dozens of ABAPers laughing all the way to the bank...
  • by diesel_jackass ( 534880 ) <travis...hardiman@@@gmail...com> on Thursday December 04, 2003 @04:07PM (#7632121) Homepage Journal
    That whole Y2K thing was pretty annoying. I could go on at great length, but didn't anyone else just set the date on their computer to a date in 2000(+) to see what would happen?

    (proof that fear is the best marketing tool)
  • by TygerFish ( 176957 ) on Thursday December 04, 2003 @04:07PM (#7632122)
    Moore's law is interesting, and the imminent demise of Apple certainly so. However, the most interesting thing for me is how curiosity and greed work together to expand the frontiers of computing, and what that has brought about.

    True, right now, the yearly, 'we'll-be-helpless-without-faster-computers!' cycle appears to have stopped or slowed down. Big IT buyers seem to have realized that you don't need a machine that could run a weather model to replace a typewriter and that's a real good thing.

    But what about software? I could be wrong. I don't do that much with my computer except surfing and writing, but much of what I see makes me wonder where all the really miraculous power of my computer is going.

    I've got an operating system that takes up non-trivial space on my hard drive, and aside from a constant need to keep up with the virus writers, or dealing with stuff to make Microsoft happy, I'm not seeing the bennies.

    You'd think that with all this godawful power, there'd be a little more substance.

  • The Y2k lie... (Score:2, Insightful)

    by ivanmarsh ( 634711 ) on Thursday December 04, 2003 @04:09PM (#7632148)
    "A tiny pebble sends ripples across the entire pond"

    Things like this were said by MANY computer industry "experts" before Y2K.

    There are a lot of people that work very hard to make computers exchange information. It doesn't just happen.
  • MHz Myth (Score:5, Insightful)

    by Alizarin Erythrosin ( 457981 ) on Thursday December 04, 2003 @04:13PM (#7632188)
    That a higher clock speed means a faster processor.
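
    A rough way to see why clock speed alone doesn't settle it: throughput is roughly work-per-cycle times cycles-per-second. The sketch below uses made-up CPU figures purely to illustrate the instructions-per-cycle (IPC) point, not to benchmark any real chips.

        # A minimal sketch, assuming a simple throughput = clock x IPC model.
        # Both CPUs below are hypothetical; the numbers only illustrate the idea.
        def throughput(clock_hz: float, ipc: float) -> float:
            """Rough instructions-per-second estimate: clock rate times IPC."""
            return clock_hz * ipc

        cpu_a = throughput(clock_hz=3.0e9, ipc=1.0)  # high clock, low IPC
        cpu_b = throughput(clock_hz=2.0e9, ipc=2.0)  # lower clock, higher IPC

        print(f"CPU A: {cpu_a:.2e} instructions/s")  # 3.00e+09
        print(f"CPU B: {cpu_b:.2e} instructions/s")  # 4.00e+09 -- faster despite the lower clock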
  • by Wakko Warner ( 324 ) * on Thursday December 04, 2003 @04:13PM (#7632196) Homepage Journal
    The truth of the matter is that Al Gore, while he was a member of Congress, did indeed sponsor several initiatives which led to the popularization and commercialization of the Internet. Did it exist before he showed up? Sure, as an underutilized academic research network. Would most of the planet know about it today without his help? Doubtful.

    Personally, while I may dislike the man, I'm tired of hearing the same tired, stupid jokes repeated over and over again.

    - A.P.
  • Re:one button (Score:2, Insightful)

    by zumbojo ( 615389 ) on Thursday December 04, 2003 @04:14PM (#7632205) Homepage
    My mouse has eight buttons. I wish it had more. If you think Macs are efficient with one button, think again. Combo keys are a pain in the ass when used on keyboards alone (like the cut/copy/paste functions), but when you have to use two hands to execute them (click + button) things really start to suck.
  • by Doc Squidly ( 720087 ) on Thursday December 04, 2003 @04:18PM (#7632258)
    Apple wouldn't be doing as well as they are now had it not been for a CPU transplant from IBM and an OS transfusion from FreeBSD.
  • by Anonymous Coward on Thursday December 04, 2003 @04:21PM (#7632296)
    No matter how he tries to spin it, he did claim to have taken the lead in "creating" the Internet.

    And no matter how the 'witty' people who post the 'Al Gore invented the internet' posts try to spin it, Al Gore never said anything even close to implying that he invented the internet.

    Honestly, I think people who post that are just making a joke now, although it's one of the most worn-out jokes ever and not really very funny, because of the deliberate misunderstanding that caused it to arise.

    It was clear from the context that he meant that he took the initiative in supporting the development of the internet. Of course, political opponents of a politician (regardless of what side of the aisle they are on) will always latch onto interpretations that make what the other guy said look more ridiculous than what was really meant, even when the actual meaning was obvious (not always the same as literal).
  • by pixelgeek ( 676892 ) on Thursday December 04, 2003 @04:22PM (#7632317)
    Frankly, Gates' denial leaves me a little unsatisfied.

    If he didn't say it, then who did? And how did the quote get attributed to him?

    Or who wrote the original article attributing this to Gates?

    Currently, AFAICT, there is only Gates' comment that he didn't say anything that moronic as "proof" that he never made the quote in the first place.

    Hardly a compelling rebuttal.
  • by acroyear ( 5882 ) <jws-slashdot@javaclientcookbook.net> on Thursday December 04, 2003 @04:24PM (#7632350) Homepage Journal
    "Nobody needs an imbilicul cord to their company." on why the Mac didn't have networking built-in.

    Now, of course, the Mac has no perifs and can only exist by connecting to a network unless you plug in external devices...
  • Re:Al Gore (Score:2, Insightful)

    by Otter ( 3800 ) on Thursday December 04, 2003 @04:26PM (#7632373) Journal
    It sure looks like he is claiming to have created the internet.

    Absolutely, and it's false. But myth has it that he claimed to have _invented_ the Internet (implying technical creation, not legislative creation), which is not an accurate characterization of what he said.

  • by JK Master-Slave ( 727990 ) on Thursday December 04, 2003 @04:29PM (#7632412)
    So, you're saying since there's no definitive proof that a man didn't say something, but there's also no formal 'cite' proving that man did say it, that the safest assumption is that he did??
  • Disagree (Score:5, Insightful)

    by 2nd Post! ( 213333 ) <gundbear@pacbe l l .net> on Thursday December 04, 2003 @04:30PM (#7632419) Homepage
    I wonder if we can have a useful discussion without flames and insults? It is Slashdot after all!

    I think the problem is that computers aren't being used to their strengths: as long as you use computers as fancy notepads and chalkboards, computers are useless in a classroom.

    However, if you cater to their strengths and capabilities, I think computers are invaluable:
    1) Their ability to network and connect classrooms with other locations, such as other classrooms, or servers with data such as photographs, maps, and things you can't store in a classroom.

    2) Their ability to virtualize. See things you can't afford to go see, do things you can't afford to go do, teach things you can't afford to otherwise teach! Books, encyclopedias, and videos offer a very static virtual representation, where a computer can be interactive! Not only can you 'see' different animals at various depths of the ocean with a computer (which a video can do just as well), you can *explore* too! Find out what happens at various pressures to your ship, to your body, see how snowflakes form, how ants find food; and then fiddle with a few settings, and see *different* snowflakes, see the ants starve, and see your ship crumple! You can design airplanes, and see if they fly or fall, you can create space stations, and see if your astronauts starve, overheat, or get bored to death!

    3) Interactivity. Very tied to virtualization and networking, you can interact with a computer in a way that you cannot with a video or a book. You can change things, simulate things, watch things, and then go back and change more things. You can have a classroom that happens to have access to a freshwater lake do experiments and research, connected to a classroom that happens to have a database, some programming kids, and a good grasp of math, and at the end of each day each classroom can learn things that before networking neither could!

    4) Data manipulation and storage. You can store lots of photographs, keep tremendous databases, perform tedious analysis, and create pictures out of raw numbers that a child, or even an adult, cannot. Measure the temperature, humidity, rainfall, pressure, cloud cover/sunlight, and wind at 400 locations 10 times a day across a city, and have the kids create programs to access, correlate, and manipulate that data and see if they can spot trends, correlations, and causations! (A toy version of that exercise is sketched below.)

    So yes, there are reasons to have computers in the classroom. No, right now no one does it properly.
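
    A toy version of the point-4 exercise, with invented readings rather than real sensor data; statistics.correlation needs Python 3.10 or newer.

        # Synthetic weather readings for a handful of locations, then a simple
        # temperature/humidity correlation -- the kind of script kids could write
        # against real classroom sensor data. All numbers here are made up.
        import random
        import statistics

        random.seed(0)
        readings = []
        for location in range(5):        # stand-in for the "400 locations"
            for sample in range(10):     # "10 times a day"
                temp = 15 + 10 * random.random()
                humidity = 80 - 2 * temp + 5 * random.random()  # loosely tied to temp
                readings.append((temp, humidity))

        temps = [t for t, _ in readings]
        hums = [h for _, h in readings]
        print("mean temperature:", round(statistics.mean(temps), 1))
        print("temp/humidity correlation:", round(statistics.correlation(temps, hums), 2))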
  • by letxa2000 ( 215841 ) on Thursday December 04, 2003 @04:33PM (#7632469)
    And no matter how the 'witty' people who post the 'Al Gore invented the internet' posts try to spin it, Al Gore never said anything even close to implying that he invented the internet.

    Hmm, according to Snopes [snopes.com], you are both right and wrong.

    I say that because Snopes classifies it as "False" but the explanation itself seems to be a spin. The quote itself on Snopes was:

    • "During my service in the United States Congress, I took the initiative in creating the Internet. I took the initiative in moving forward a whole range of initiatives that have proven to be important to our country's economic growth and environmental protection, improvements in our educational system."

    Snopes then goes on to say that it is ridiculous to believe that Al Gore believed he created the Internet. That's not in question. No one believes Al Gore created the Internet, and I doubt Al Gore himself believes it, but the fact of the matter is he said: "During my service in the United States Congress, I took the initiative in creating the Internet."

    Perhaps, as Snopes concludes, it was simply a clumsy and self-serving phrasing that Gore used, but he did say it. I figure at worst it was self-serving and at best it was just stupid on Gore's part to say whatever it is he meant to say in that manner. But to say that others are spinning what Gore said is inaccurate. Many people jokingly mention it but, in the end, Gore DID say it in the above context--regardless of how much you wish he hadn't.

  • my list: (Score:3, Insightful)

    by Tumbleweed ( 3706 ) on Thursday December 04, 2003 @04:41PM (#7632582)
    1) Tablet PCs are the wave of the future!

    2) Blogs will amount to nothing.

    3) The MHz Myth
  • by caudron ( 466327 ) on Thursday December 04, 2003 @04:42PM (#7632600) Homepage
    Then what have I been using exclusively for the last 2.5 years?

    -Tom
  • by LSD-OBS ( 183415 ) on Thursday December 04, 2003 @04:42PM (#7632603)
    What do you think?
    I think you need to put forward at least a reason if you want anybody to listen.
  • by kalidasa ( 577403 ) * on Thursday December 04, 2003 @04:43PM (#7632623) Journal
    Xerox never commercialized the Alto, did it? Yes, the real invention was PARC's. Apple's innovation was in making computer innovation a product; Microsoft's was in making it a commodity.
  • Re:640K--not true (Score:5, Insightful)

    by mjh ( 57755 ) <mark@ho[ ]lan.com ['rnc' in gap]> on Thursday December 04, 2003 @04:49PM (#7632709) Homepage Journal
    First things first: the original poster didn't attribute the quote to Bill Gates, so a denial from Bill Gates doesn't mean that someone didn't actually say it. Second, someone had to have come to that conclusion, whether they said it or not, because that was in fact the limit (the arithmetic behind the 640K figure is sketched below). Third, if I were Bill Gates, and I *had* said that incredibly stupid thing, the chances are pretty high that (a) I'd lie about it later on, or (b) I'd forget that I said it.

    My point is that Bill Gates is denying it. Bill Gates also says that Microsoft is not a monopoly. Bill Gates saying something does not necessarily make it true.

    $.02
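
    For what it's worth, the 640K figure wasn't arbitrary: on the original IBM PC, the region from address 0xA0000 upward was reserved for video memory and ROM, so conventional memory topped out just below it. A quick check of the arithmetic:

        # The IBM PC memory map reserves addresses from 0xA0000 upward for video
        # and ROM; everything below that was "conventional" memory for DOS programs.
        reserved_start = 0xA0000             # first address past conventional memory
        conventional_bytes = reserved_start  # usable range is 0x00000 .. 0x9FFFF
        print(conventional_bytes)            # 655360
        print(conventional_bytes // 1024)    # 640 (KB)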
  • Re:Machrone's Law (Score:4, Insightful)

    by Patik ( 584959 ) * <cpatik AT gmail DOT com> on Thursday December 04, 2003 @04:54PM (#7632777) Homepage Journal
    "The computer you want to buy will always cost $5000"

    Now you could get 10 PCs for that.

    I don't want a $500 computer. I wish I could buy a dual-2GHz G5 with an Apple Cinema monitor, which brings you right around $5000. Looks like he was right.

    Another one: The things you want to download always require leaving your computer downloading overnight. In 1995 I had to leave my 14.4kbps modem running overnight to get MP3s, and now DVD-R images take about the same amount of time.
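
    The back-of-the-envelope numbers roughly bear that out, assuming an album's worth of MP3s (~50 MB) over 14.4 kbps then and a ~4.7 GB DVD image over ~1 Mbps broadband now; the file sizes and link speeds are assumptions, not measurements.

        # Rough transfer-time arithmetic. File sizes and link speeds are assumed
        # round numbers, not measurements of any particular setup.
        def hours(size_bytes: float, bits_per_second: float) -> float:
            return size_bytes * 8 / bits_per_second / 3600

        mp3s_1995 = hours(50 * 1024**2, 14_400)      # an album of MP3s over a 14.4 kbps modem
        dvd_2003 = hours(4.7 * 1000**3, 1_000_000)   # a DVD-R image over ~1 Mbps broadband

        print(f"MP3s over 14.4k:   {mp3s_1995:.1f} h")  # ~8 h -- overnight
        print(f"DVD-R over 1 Mbps: {dvd_2003:.1f} h")   # ~10 h -- still overnight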

  • by Frequency Domain ( 601421 ) on Thursday December 04, 2003 @04:55PM (#7632781)
    ...to say that others are spinning what Gore said is inaccurate.
    I can't agree with this statement. Check out this article [somewhere.com] for a fairly thorough discussion of the topic. It shows the evolution from what was actually said to the distortions that became widely accepted and mocked.
  • So what?.. (Score:2, Insightful)

    by Roadkills-R-Us ( 122219 ) on Thursday December 04, 2003 @04:58PM (#7632825) Homepage
    The fact that he denies ever saying it doesn't disprove that he said it, any more than the fact that he's quoted as saying it proves that he did.

    Regardless, I nominate Dell for building a 640MB limit into their X200 laptops. They'll take two memory chips, and one can even be a 512MB chip. But the system maxes out at 640MB.

    But that's OK. It makes it easier for me to push the less expensive but slightly larger Latitudes for the engineers - who *always* want more memory. Not that I blame them.
  • by nullard ( 541520 ) <nullprogram@voic ... d.cc minus punct> on Thursday December 04, 2003 @04:58PM (#7632830) Journal
    Gore was talking about his work getting key funding bills passed that had an impact on the growth of the internet. That's it.
  • Re:DAMN IT. (Score:5, Insightful)

    by bigjocker ( 113512 ) * on Thursday December 04, 2003 @05:03PM (#7632892) Homepage
    A normal, sane person would understand it.

    Yes, as a normal, sane person, I understand it: he is 100% correct.

    Before Congress pushed for its opening to the world, there was no such thing known as the 'Internet'; there was a closed network of universities and military computers (ever wondered what DARPA means?).

    He, as a congressman, was one of the main players in opening that network to the world, so he played a very important role (if not the most important) in the creation of the 'Internet'.

    It seems to me that the un-normal, un-sane person in this thread is you.
  • by DavidinAla ( 639952 ) on Thursday December 04, 2003 @05:06PM (#7632920)
    I know this is difficult when you're in the middle of a mindless rant, but you might want to try to get the facts before you embarrass yourself next time (even if you ARE posting as an AC).

    Apple PAID for the rights to the stuff from Xerox. The facts are covered in numerous places if you'd like to trouble yourself to get a clue about this incident in computer history.

    Much of the rest of what you have to say is too self-contradictory to be worth responding to.
  • by jtdubs ( 61885 ) on Thursday December 04, 2003 @05:28PM (#7633018)
    "That business wouldn't be where it is today if they didn't do stuff to stay in business."

    Duh. What, do you think every other company exists in a little bubble where they survive solely on the work of internal employees?

    Justin Dubs
  • by ViolentGreen ( 704134 ) on Thursday December 04, 2003 @05:31PM (#7633038)
    Well, I don't get that from that statement. The part about it being "attributed" to him, and to a lesser extent that he considers it a "silly quotation," leads me to believe that either he did not say it, or at least that he is saying he did not say it.
  • by Ryosen ( 234440 ) on Thursday December 04, 2003 @05:37PM (#7633085)
    "File trading is killing the Entertainment industry."
  • by Anonymous Coward on Thursday December 04, 2003 @05:45PM (#7633165)
    For each of these incorrect assumptions, what was it that allowed people to demonstrate that these assumptions are wrong?

    I've noticed that the examples cited above are mostly assumptions based on numerical limits (100MHz, 1GB, and 5). The question then becomes: what was it that allowed us to work around these "hard" limits?

    -cmh
  • by OwnedByTwoCats ( 124103 ) on Thursday December 04, 2003 @05:48PM (#7633186)
    Every sentence you wrote is false.

    Are you a Republican?
  • by russellh ( 547685 ) on Thursday December 04, 2003 @05:49PM (#7633210) Homepage
    the worst assumption many of us are making is that humans are not themselves computers.

    It's an interesting intellectual exercise, but the idea that we are merely computers is nothing more than the continued novelty of the computer, just as we once thought of ourselves and the world as clockwork. Wishful thinking, or perhaps professional myopia. Everyone thinks their field is the key to the universe. But this is not theory, so until someone can actually create complex life, I see no reason to believe people like Ray. Show me the money.

  • by Anonymous Coward on Thursday December 04, 2003 @05:58PM (#7633294)
    Actually, the previous AC is correct. Ancient Hebrew had a word for "murder" which was distinct from the more general word "kill," and the more specific term was used.

    "Do not murder people" is probably the most accurate modern English translation. Various people who have only heard it as "Thou shalt not kill" have often used it to trumpet that the Jewish/Christian/Islamic God is opposed to war, the death penalty, abortion, shooting intruders, or eating meat.

    The anti-abortionists might have a case, if they can demonstrate that a fetus is a person to the satisfaction of all concerned, but the rest are definitely examples of misguided people whose justification for their position is based on a common misquote (or rather, a mistranslation).

  • by Anonymous Coward on Thursday December 04, 2003 @06:02PM (#7633330)
    Actually, they paid in stock for their 'theft.' Xerox couldn't market their inventions, Apple could (and made many improvements along the way). An (informal) deal was struck and everyone profited. So what's the problem?

    Oh wait, you were giving an example of an incorrect assumption. I misunderstood.
  • by Junks Jerzey ( 54586 ) on Thursday December 04, 2003 @06:04PM (#7633360)
    Yep, CPUs keep getting faster, but high-end x86 processor speeds haven't come anywhere near doubling in the last 18 months (and, yes, I know that Moore's law isn't really about speed). A year ago 2.4GHz was a common speed. And guess what... it still is. There was a jump to 2.8GHz--a 16% increase--but beyond that has been trouble. The few percent that got us up to 3GHz was more than balanced by a greater increase in power consumption. Ditto for 3.2GHz. And the 3.4GHz P4 has been delayed for just those reasons. So now we're going up a very steep slope, getting piddling gains for expensive tradeoffs. (A quick comparison with the "doubling" pace is sketched below.)

    Moore's law *will* continue, but the advances need to come from a different direction than the one we've been following. It's already hitting the point where you just don't *want* a high-end processor in your laptop, because you have to keep it running much slower anyway just to get some acceptable battery life. The 3.4GHz Prescott is arguably something you don't want in your *desktop* as it is.

    Bottom line: Moore's law is no longer the most important concern in computing technology.
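
    For comparison, here is what "doubling every 18 months" would imply for clock speed if it translated directly into GHz (it doesn't; it describes transistor counts), using the post's own figures:

        # What "doubling every 18 months" would imply for clock speed if it applied
        # to GHz directly -- it doesn't; it's about transistor counts. The starting
        # point and the 2.8GHz figure come from the post above.
        start_ghz = 2.4                      # "a year ago 2.4GHz was a common speed"
        months = 12
        implied_ghz = start_ghz * 2 ** (months / 18)
        actual_ghz = 2.8                     # the roughly 16% bump that actually shipped

        print(f"implied after {months} months: {implied_ghz:.2f} GHz")  # ~3.81 GHz
        print(f"actual high end:          {actual_ghz:.1f} GHz")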
  • Re:my favorites (Score:4, Insightful)

    by timjdot ( 638909 ) on Thursday December 04, 2003 @06:25PM (#7633568) Homepage
    Good ones.

    I also liked these business-progression predictions:
    1) people would migrate off the mainframe (never did)
    2) people would not move from Netware (80% market share) to NT (did)
    3) home users will not use Linux (d'oh!)
    4) now, alongside "Apple is dead," we have "Sun is dead" and others
    5) I remember in '97-'98 we all thought 64-bit would be mainstream by '99-'00. Boy, did that not happen. Save Nintendo, I guess.

    Oh yeah, what about distributed computing... DCOM will be, Web services will be, ...

    Surely nobody ever predicted that computer technology would head straight for whatever is the slowest way:
    Why program sockets when HTTP is 100 times slower?
    Why program to a relational database or object system when XML text is 100s of times slower?
    Why compile when we can interpret?
    Why run software on your computer when you can connect to a terminal, web server, or host and do 100 times less?
    Why not create about 20 layers between the application and the video card?
    Why hire experienced programmers when you can hire some with no experience halfway across the world and get the project done 100 times slower?

    Oh yeah, and then there's commodity computers. Everyone predicted that in the early '90s, but the corporations have successfully kept prices high. Of course, with inflation we are starting to approach commodity computers.

    Finally, the one about re-usable objects. Maybe SourceForge and open source projects like Apache are as close as we can get. In '94 I remember everyone figured there'd be online libraries where one could download whatever component was needed. Hah!
  • by theedge318 ( 622114 ) on Thursday December 04, 2003 @06:28PM (#7633601)
    This isn't actually as incredible as you all might think. Sure, simple statements like "*BSD is dying" are always going to get you in trouble. I mean, look at the number of Commodore emulators. Some things just have niches, and won't go away (like Apple).

    But more importantly, if you read the article carefully (which is a lot to ask, I know; this is /. after all), you will notice that Intel is saying that Moore's law can't continue to hold out with the CURRENT binary logic technology. Not that Moore's law will become false, just that there will need to be new advances in technology to overcome the limits. This is what has happened in the past as well (L2 cache, silicon -> germanium, 3D chip pathways).

    Intel pays researchers lots of money to think outside of the box, but they need to write papers like this to keep the establishment recognizing them as the miracle workers that they are.
  • Y2K (Score:2, Insightful)

    by T9D ( 727450 ) on Thursday December 04, 2003 @06:59PM (#7633900)
    My favorite was that Y2K was going to be the end of civilization as we knew it, causing a major collapse in infrastructure. Whoops.
  • by Anonymous Coward on Thursday December 04, 2003 @07:10PM (#7634043)
    Yeah, right Zealots.
  • one word: Itanium (Score:3, Insightful)

    by JudeanPeople'sFront ( 729601 ) on Thursday December 04, 2003 @07:32PM (#7634268)
    One of the greatest shots-in-the-foot in IT business history was Intel's decision to stop developing the 32-bit Pentium processor line and design 64-bit processors without native 32-bit support.

    AMD couldn't have hoped for a better present from its greatest rival. They have started building a new factory in Dresden (Fab 36) in anticipation of the increased demand for Opterons and Athlon64s.

    The desktops we will be buying in 2005 (2004?) will be 64-bit, and it seems they won't be "intel inside".

  • Re:the list (Score:4, Insightful)

    by You're All Wrong ( 573825 ) on Thursday December 04, 2003 @07:36PM (#7634298)
    That is the single funniest post _ever_ in the history of slashdot.

    I stand in awe.
  • by Anonymous Coward on Thursday December 04, 2003 @08:33PM (#7634723)
    I'm sure you know the concept of "building in one's own image".

    Think about a big modern building - it has a plumbing and electrical system to bring in the "nutrients" it needs to operate and a sewer and HVAC system to exhaust waste products. These days we even have telecommunications systems, fire and security systems, etc. You can certainly make an analogy to the circulatory system and the nervous system in the human body.

    But what about computers? Is there some analogy to be made to systems in the human body? Maybe you should read this:

    http://americanhistory.si.edu/csr/comphist/montic/cray.htm

    So, if we can say modern structures are a crude attempt to build in our own image at the basic plumbing level, then maybe computers are a similar attempt to build in our own image at the cellular level. And what is going on at the high end in modern computing systems? Massively parallel supercomputers, for one! So you can make an analogy of many processors working in parallel to the cells in the human body doing much the same thing.

    Anyway - something to think about... I certainly find myself in the "humans are just fancy biological machines" camp
  • PacMan (Score:2, Insightful)

    by Symb ( 182813 ) on Thursday December 04, 2003 @09:00PM (#7634900) Homepage
    Computer games don't affect kids; I mean if Pac-Man affected us as kids, we'd all be running around in darkened rooms, munching magic pills and listening to repetitive electronic music.

    - Kristin Wilson, Nintendo, Inc., 1989.

  • by IBitOBear ( 410965 ) on Thursday December 04, 2003 @09:55PM (#7635256) Homepage Journal
    Eh, let's see:

    -- In five years, everybody will be using fourth-generation languages (our 4GL, etc) for everything except the lowest level of hardware support.

    -- You won't need programmers at all if you use our programmerless rule development interface. (See, NetExpert 8-)

    Basically, the any-idiot "enabling" technologies that were supposed to do away with all forms of having to know how a computer works.

    [Includes "death" of C and C++, Java, Perl, etc in favor of Power-Builder-esque symbolic/graphical program construction systems.]

    yea, sure... 8-)
  • by Grizzlysmit ( 580824 ) on Thursday December 04, 2003 @11:13PM (#7635684)
    That whole Y2K thing was pretty annoying. I could go on at great length, but didn't anyone else just set the date on their computer to a date in 2000(+) to see what would happen?
    Oh, there was a real problem all right, with a small subset of programs etc., but there were also a lot of shysters and a lot of hype and nonsense, and hardly anyone would believe us geeks when we tried to inject a little balance. Now of course they blame us for their hysteria :-D. Humans...
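
    The "small subset of programs" were typically the ones storing years as two digits and doing arithmetic on them. A minimal sketch of the failure mode, not modeled on any particular real system:

        # The classic two-digit-year bug -- the failure mode Y2K remediation work
        # was hunting for. Purely illustrative; no real system is being modeled.
        def age_in_years(birth_yy: int, current_yy: int) -> int:
            """Age computed from two-digit years, as many old record systems did."""
            return current_yy - birth_yy

        print(age_in_years(70, 99))  # born 1970, checked in 1999: 29, still fine
        print(age_in_years(70, 0))   # same person in 2000, stored as "00": -70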
  • We are the Enemy! (Score:2, Insightful)

    by dreadlord76 ( 562584 ) on Friday December 05, 2003 @01:42AM (#7636397)
    Ever taken a piece of early-'90s software and run it on today's machine? The software back then was just as functional, runs screamingly fast on low-end machines today, and ran in 1MB.
    The fact that your browser takes 33MB to run is a problem. Every piece of software wants to include the kitchen sink; that's a problem.
    Maybe if we had fewer abstraction layers, fewer dynamic invocations, less runtime discovery, and more focus on building something that works, we really would not need 4GB of RAM. Maybe, just maybe, the programs would run faster as well.
  • Re:Multitasking (Score:2, Insightful)

    by JK Master-Slave ( 727990 ) on Friday December 05, 2003 @03:13AM (#7636819)
    Windows before version 3.0 wasn't claimed to be a multitasking environment, except for the special case of 'Windows 386 2.1' which had some rudimentary protected mode support.

    Many of us ran Windows on 8088 and '286 machines for quite a while before we could afford a machine with the '386 'virtual 8086' functionality, so we didn't have 'pre-emptive multitasking' either.

    And I know that Microsoft made no unfounded claims about 'multitasking' on their early versions of Windows.

    What's your basis for your claims?
  • by firewrought ( 36952 ) on Friday December 05, 2003 @03:30AM (#7636881)
    People dismiss Y2K as a non-event. Something over-hyped and "mostly nonsense".

    I work in the power industry, and this attitude really pisses off some of my coworkers who spent thousands of man-hours remediating software and firmware systems one by one.

    Yes... Y2K did feature a lot of hype, but the response to the hype saved our ass. Engineers, managers, developers, even politicians... the human race came together on this one.

  • Re:640K--not true (Score:4, Insightful)

    by armb ( 5151 ) on Friday December 05, 2003 @11:03AM (#7638509) Homepage
    > If 1 were prime, you would have to add one prime to ever count of prime factors.

    Counting one as prime would mean there was no longer a unique prime factorization, because you could add one as many times as you liked to the list of prime factors.

    At the moment, the prime factorization of 12 is 3*2*2. If you allow one, it could be 3*2*2*1 or 3*2*2*1*1*1*1.

    Why this matters, I forget, but apparently if you are a mathematician it does.
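
    For the record, the result being gestured at is the Fundamental Theorem of Arithmetic, which only works if 1 is excluded from the primes. Stated roughly:

        % Fundamental Theorem of Arithmetic: every integer n > 1 factors uniquely
        % into primes (up to the order of the factors).
        \[
          n = p_1^{a_1} p_2^{a_2} \cdots p_k^{a_k},
          \qquad p_1 < p_2 < \cdots < p_k \ \text{prime}, \quad a_i \ge 1 ,
        \]
        % and this representation is unique. Admitting 1 as a prime would destroy
        % uniqueness, since 12 = 2^2 \cdot 3 = 1 \cdot 2^2 \cdot 3 = 1^2 \cdot 2^2 \cdot 3, and so on.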
