Technology

Ask Slashdot: How Powerful is Your Computer? 237

Kurt submitted this interesting question: "All of us have, at one time or another, played the 'I remember when' game. For me it was 'I remember when the first acoustic coupler modems came out for the Apple II'. My dad remembers when programming meant soldering wires. I found myself recently sitting in front of my K6-233 wondering things like: 'At what time was the entire world's computing power equal to my 233?' and 'How soon will I be able to buy a machine more powerful than the world's computing power when I was born?' What I would like to see is a graph of the entire world's (desktop to mainframe) computing power (machines times MIPS [or other power rating]), preferably with some "giants of the day" plotted as a reference. Does anyone know of such a thing? I'm just looking for ballpark numbers, here."
This discussion has been archived. No new comments can be posted.

  • by Trep ( 366 )
    hehe, I like this. Took me a second to realize :).
  • Making such a graph for the very early years of computing should be relatively simple for those with the right data, since very few companies were actually producing computers. You'd have to get some records from IBM and a few others on how many computers (and of what types) they sold in what years, and that's it. No microcomputers to deal with, just the comparatively easy-to-count mainframes.
  • Posted by Trvln:

    Last year I was playing a game of Trivial Pursuit.
    I answered a question before it was read. I got the question right. The answer was 5.
    It is possible to do this for any question; the odds are just extremely small.

    It's the 1,000 monkeys at 1,000 typewriters type of scenario.

    So it is possible for a computer to answer a question before it is asked, and get it correct... every time.
  • Posted by Imru al-Qays:

    Kurzweil's newest, The Age of Spiritual Machines, includes comparison tables for things like the total computer power available in 1950 and at present, plus some wacky predictions ...
  • Comment removed based on user account deletion
    A CCII and you're calling him a young pup!

    Weenies, one and all. My first true home computer (as opposed to the many development systems that my Dad brought home from work) was a TRS-80 Model I... no colour, nothing... sheesh.

    My first computer memory was of playing with a teletype when I was 3... which was only 22 years ago... so perhaps I'm a weenie too.

    For the uninitiated, a teletype was what we used before VDUs (printer + keyboard + paper tape reader/writer). I only remember the fast 300 baud ones... 75 baud and lower was before my time...
  • Well, I set up Linux RH 5.1 on a Celeron 350 + 256MB RAM and it's VERY nice. It hardly breaks a sweat serving on-the-fly generated webpages from an Oracle 8 database.
    You can find it here [delta.is], if you are interested.

    Jón
  • I took a Comp Sci class called "The History of Computing" a few years back. One of the textbooks was "Computer" by Martin Campbell-Kelly and William Aspray. It's not too heavy into benchmarking or graphing performance, but it is a good read, and a nice historical reference on the prehistory and "early" history of computers.
  • Whoops, that was dumb! Corrected link below :(

    Here is a link to that book on Amazon: Computer: A History of the Information Machine [amazon.com].

    It doesn't spend much time past the "PC Revolution"; read: no Windows 95 or Linux.

  • You'd *never* have had cheap desktop kit without Microsoft driving the market for machines to run its products. It would be terminals and mainframes only. In a very real way, Bill Gates is the father of Linux.
  • The brain is just biology, which is just an abstraction layer on top of chemistry, which is similarly abstracted above physics. Nothing non-computable there.
  • I manage a site where today we topped 110Mb/s outbound. We use FreeBSD on about 40 boxes. Most of them are doing banner counting and SQL database tracking for over 10,000 sites with about 90M hits per day. One box that does just plain web serving (Apache), a single PII-400/512MB with UW-SCSI drives, maintains a constant bandwidth of 20Mb/s with about 700 spawns of httpd. Load is about 2.5 on this box.

    When I came to this company I was a die-hard Linux geek. Now I see that FreeBSD makes a better production server. I still use SuSE for my desktop as it's so easy to configure. I don't know if you can get the above numbers with Linux. If you have, please let me know.


  • What's the Travelling Salesman problem?

    dylan_-


    --


  • Wasn't that the same as a ZX81 over here (UK)? If so, there was a 16K RAM pack that you could slot in, though it tended to wobble a bit, so you should really tape it in. Then you could play games like Jet Set Willy, and I think Hungry Horace was a 16K game. Games used to say on the tape cover whether they were 16K or 48K (I have a ZX Spectrum 48K)....I remember when you could walk into John Menzies or Woolworths and see the shelves filled with games for Spectrum, Commodore64 and Amstrad.

    If Slashdot had been around then, you would have had ACs posting "Commie64 sux" followed up with "Go play with your rubber keyed beer mat while we use a *real* computer"...the only thing everyone agreed on was that Amstrad was crap :-)

    dylan_-


    --


  • Oops....well, I never had a ZX81....I thought with the 16K pack it could run the same games the Speccy could....well, you live and learn :-)

    dylan_-


    --

  • But the fact is that the lack of ease-of-setup and ease-of-use on a Linux box, for non-technical people, has helped make Windows the dominant OS.

    Umm, that suggests that Linux was actively competing with Windows for users. When Win3.1 came out I think Linux had what, a thousand users? Even until the last few years Linux wasn't even a blip on Microsoft's radar. Linux never lost the battle... it hasn't even started yet.

    Daniel
  • A 486 running apache can saturate a T1. You only need bogomips if you're serving a lot of dynamic stuff. Like, if you're slashdot, for example.

    I wouldn't overclock a server, not even with one of those rock-solid 450a jobs. Overclocking is for gaming (and that's what I use my celeron for :).
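For reference, the arithmetic behind the "saturate a T1" claim is simple; here it is as a quick sketch, with the average page size as an assumed figure:

```python
# Pages per second needed to fill a T1, assuming 10 KB average static pages.
t1_bps = 1_544_000          # T1 line rate, bits per second
avg_page_bytes = 10_000     # ASSUMED average static page size

pages_per_sec = t1_bps / (avg_page_bytes * 8)
print(f"~{pages_per_sec:.0f} pages/s, ~{pages_per_sec * 86_400:,.0f} pages/day")
# Roughly 19 static pages a second: easily within an old box's reach.
```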
  • You have been reading too much Penrose... :)

    Frankly, I still don't know what consciousness has got to do with quantum mechanics. AFAIK there is no evidence for quantum effects being relevant to the performance of the human brain.
  • Well, I'm gonna try to not toot my own horn too much, but how about a palm-sized 386:

    http://www.rim.net

    Complete with wireless networking.
  • There was very little in Win95 that wasn't in the Amiga or some other OS at least 5 years before. Long filenames, 32-bit, preemptive multitasking, desktop-based file manager, auto-configuring hardware, ... All old news. The only innovations of Win95 were in the areas of marketing and licensing.

    Yes, M$ made a big leap from CP/M, er, PC-DOS, er, MS-DOS to Win95, but when you're starting from that far in the hole, even someone else's 5-year-old technology is going to look good.



  • At least get your history right. That title goes not to MS, but to Xerox, who invented the concept of the GUI, and to Apple, who perfected the concept of the GUI. Not to M$, who rather wrecked the concept of the GUI, yet whom most other interfaces copy for some totally unknown reason (though most manage to undo a few of the mistakes).

    Let's talk about the old computers, shall we? Did they have Windows? No. But better interfaces existed before then, most notably in MacOS. Did they have Word? No, but before Word came along other graphical word processors existed. I might add something here: the first version of Word in the form we know it today was written for the Mac.

    Get your history right next time you shoot your mouth off.
  • The Lisa was Apple's first GUI-based computer, not Xerox's.

    Xerox did develop the GUI (along with Smalltalk, ethernet, the mouse and a number of other things) at their Palo Alto Research Centre (PARC). It was a lot earlier than 1980, though. Actually, around 1974, I believe.

    And it was called the Alto.

    Although they did in 1981 release the Star, which was a commercial version of the Alto (which wasn't much more than a prototype), it never really sold much.

    But you were heading in the right direction :-)

    - Sean
  • I've run several medium-traffic (1000 or so hits/day) database-driven dynamic sites off an old 486/25 with an overdrive to P63 and 24 megs of RAM. Runs just fine.
  • This is a table I got from a textbook:
    Computer Organization & Design, by Hennessy & Patterson

    http://www.cis.ohio-state.edu/~mahrt/table.html
  • Not delivered yet, but I only paid $475 for it (Mobo and chip, both new).

    I hope I can find memory for the bastard.


    --
  • www.dcginc.com

    It's only an SX, and they told me I got one of only 2 remaining. They have LX's and up as well, but for a little more money.


    --
  • While I will agree that the Commodore manuals (yes, even through the Amiga years) were excellent for their technical resources, I MUST mention the manuals which came with the Apple ][ and Apple ][+ computers. Three, to be exact.

    • General user's manual - says it all. Nothing much for techies.
    • Applesoft Programmer's Manual - originally the Integer BASIC manual, but this is what I learned programming with from 1980-1983.
    • Apple ][ Technical Reference Manual - Now this is what I'm talking about. Not only did it give you pinouts and enough information to build a card for the darn thing, but it gave you a fold-out SCHEMATIC of the motherboard! I still have this on my bookshelf at home. Nothing will ever come close to this out of a home computer box.

    My two cents.

  • Amen to that. I remember my first computer from 5 years ago: an old TRS-80 Color Computer (Model 1, if you could say that) which was $5 from a yard sale. The computer came with nothing but a BASIC manual and another "user's manual". It taught me everything I never wanted to know about how the machine's ports work. Moreover, the BASIC manual was excellent. In the back it even explained how to do low-level graphics programming with the machine (since that's all you could do... well, not really, since that computer had Extended Color BASIC instead of just Color BASIC, but anyway), plus a few small pointers on assembly language (although I never did get into ASM, and still haven't).
  • I really think we all have too much time on our hands... and we also must remember... it's not the size of our process0r... but what we do with it...

  • Oh, DUH!@#! I forgot that when an anonymous person tells me to die I'm supposed to... tsk tsk, you Anonymous Coward.

  • I've got to agree. The 89 is very nice. As a freshman engineering student this year, I thought about buying a 92. Then someone showed me a TI89. It does everything a TI92 does, and nearly all a 92Plus does. It's also smaller, and about $30 cheaper than a TI92. Very good calculator. Oh, it's running a 68000 chip.
  • My first crystal set actually used galena, not germanium. ;-)
  • Yep. The C-64 documentation rocked. I remember three books -- the manual that came with the computer, the Reference Manual (which actually included a schematic of the computer), and a book titled "Inside Commodore DOS" which detailed everything about the 1541 disk drive, including a disassembly of the operating system. I still have them around somewhere. They were a hacker's dream.

    One could literally teach oneself BASIC, Assembly and all kinds of programming tricks with those things.

    If documentation of that quality were available for today's PC's, I would cry for joy.

    Kythe
    (Remove "x"'s from

  • Um, no.

    Apple, Power Computing, IBM and Motorola are largely responsible for the computer I use today. Microsoft is responsible for little more than ensuring that I have trouble finding software to run on it.

    Microsoft hasn't done a damn thing for the computer industry, other than keep the hopelessly archaic X86 platform afloat. Yay?

    - Darchmare
    - Axis Mutatis, http://www.axismutatis.net
  • "Windows 95.. Brought to you by the creators of EDIT."

    Actually, I think that the proper quote would be ``Windows 95, brought to you by the creators of edlin.''

    Ever had to use edlin for anything serious? Not good for your health.
  • You're talking about the great leap in price, right?
  • A computer will damn well know the answer before it gets the question. It just cannot respond before being asked. Just because I haven't asked my computer the result of 2+2 yet doesn't mean it doesn't know it.
  • I second that!

    Management was ready to buy new servers this year for our offices (we have a Pentium Pro 180).
    Trying to be the "conscientious employee", I told them they barely get any utilization as it is (they're primarily file/mail/print servers); if anything, they should buy some more storage and maybe RAM.

    Federico
  • Actually, there have been 5 forms of falsehood:

    1) Lies
    2) Damned Lies
    3) Statistics
    4) Benchmarks
    5) MicroSoft Product (schedule) Announcements

    Add to this a 6th:

    6) Presidential Testimony.


    (As for the last- Some of the Democratic Senators wanted Pres. Clinton to testify in front of the Senate but it took over an hour for the Republican Senators to stop laughing and explain to them that they had no way to swear him in.)

    -soup
    (No man can learn about impotence the hard way.)
  • Yeah. That great leap from DOS to Windows, to Windows 2.0, to Windows 3.0, to Windows 3.1, to Windows 3.11, to Windows for Workgroups 3.11, to Windows 95. That was a MAJOR hurdle :)

    This is not to say that M$ is evil (though they may be) or that linux is good (though it is), but really -- after 10+ years of development, you'd kind of *expect* some leaps, wouldn't you?
  • My ballpark answer to your first question would be: around the mid-to-late 1950s. This guesstimate is based on several factors. Univac started delivering commercially available computers in the early 50s. IBM followed a few years later and sold mainframes in volume (hundreds per year) around the mid-50s. Around this time, the first all-transistor computers were developed and started appearing by 1960.

    Now these computers were extremely slow, but the sheer number of them would make up for this deficiency.

    Another question would be, "When was the fastest commercially available computer as powerful as a 233MHz Pentium?" The answer to that one would be around the late 60s (I think). BTW, the Cray I (which appeared around 1977???) had a "speed" of about 150 megaflops.
  • Yeah, they seem pretty nice. It's just that when you need to get real work done on an HP/UX (HP 9000) system, their own OS gets in the way. Take, for example, a processor going bad on an HP 9000. You have to power the whole system down and remove the processor. With a Sun box you just power the processor down and replace it, power the new processor up, and you're back in full business with no downtime.

    What HP really excels at is printers, calculators and test equipment. Their servers are a pain to deal with; at best you get "ohh, a problem, let's reboot."
  • When I did a back-of-the-envelope calculation a few years ago, the figure that I came up with was about 2020 for the computational power of a human brain in one computer (measured in operations/second vs. neurons firing/second). Given differences in architecture, software problems, etc., I would assume that a general-purpose computer roughly capable of the same intellectual tasks as a human brain should be a reality by 2030 or so.

    And then things get interesting.

    AI so far has completely flopped. Well so what? What is the computational power and memory of a brain? Of a computer? Some problems are only addressable in hardware, and I honestly believe AI to be one of them. So when the AI folks finally have the necessary computational power, I think that AI will prove to be like chess-playing, like speech recognition, like lots of problems. Quite solvable with the right equipment.

    But then we have a very nasty situation. For $70,000/year you can hire this human, or for $1000 you can buy a computer capable of the same job, but it works 24 hours a day. Do you want to hire or buy?

    This dilemma has arisen in the past and the human almost always loses. However, historically humans are more flexible, and so there have always been plenty of other things that talking monkeys can easily do and machines can't yet. So with the machines handling the repetitive stuff, and the monkeys doing more interesting stuff, we all wind up generally better off.

    But what do the monkeys do when there is *nothing* that the machine can't do better? Sure at that point it is theoretically possible for us all to just be provided with the necessities of life - but does anything in the history of our economic system indicate that that will happen? Not that I can see!

    I am not a techno-phobe, but I can tell you that I am preparing to have enough saved that I won't economically need to work after that point...

    Ben Tilly
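Ben's back-of-the-envelope can be redone in a few lines. The biological figures below are the commonly quoted order-of-magnitude folklore rather than measurements, and the Moore's-law doubling period is an assumption:

```python
import math

# Rough brain-equivalence estimate, in the spirit of the post above.
neurons = 1e11             # ~100 billion neurons (folklore figure)
synapses_per_neuron = 1e3  # ~1,000 synapses each (often quoted as 1e3-1e4)
firing_rate_hz = 100       # ~100 updates per second

brain_ops = neurons * synapses_per_neuron * firing_rate_hz   # ~1e16 ops/s

pc_ops_1999 = 1e9          # ASSUMED: a ~1 GFLOPS-class desktop today
doubling_years = 1.5       # ASSUMED Moore's-law doubling period

years_to_parity = doubling_years * math.log2(brain_ops / pc_ops_1999)
print(f"brain ~{brain_ops:.0e} ops/s; parity around {1999 + years_to_parity:.0f}")
# With these inputs: ~1e16 ops/s, parity in the mid-2030s, the same
# ballpark as the 2020-2030 guess above.
```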
  • I think Microsoft's competitors get the real credit for advances in PC computing (assuming, of course, and without any evidence yet, that another OS or platform wouldn't be on top if Billy G. hadn't bought QDOS to begin with).

    Raise a glass to:

    The various incarnations of (now) caldera's OpenDOS, OS/2, MacOS, AmigaOS, etc.

    All the third party vendors driven under by Microsoft bundling.

    Thank the folks who wrote Winsock.

    Thank the original author of QDOS, too.

    And, of course, CP/M

    These are the people Microsoft has copied, and the reason why MS-DOS 1.0 isn't still the default OS installed on the PCs of the world.
  • Why not just find out how many of each processor were sold and add it all up after you multiply the number of processors by the power of each? Of course the answer is doomed to be inaccurate, but it's all in fun, isn't it?

    matguy
    Net. Admin.
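That multiply-and-sum method fits in a few lines. A minimal sketch follows; every shipment count and MIPS rating in it is a made-up placeholder chosen to show the shape of the calculation, not real data:

```python
# Ballpark "world computing power": units shipped x typical MIPS, accumulated.
# Every number in `shipments` is a PLACEHOLDER, not a real figure.
shipments = [
    # (year, units shipped that year, typical MIPS per machine)
    (1955,      1_000,   0.05),
    (1965,     20_000,   0.5),
    (1975,    200_000,   1.0),
    (1985,  5_000_000,   1.0),
    (1995, 50_000_000, 100.0),
]

total_mips = 0.0
for year, units, mips in shipments:
    total_mips += units * mips  # crude: assumes every machine stays in service
    print(f"{year}: cumulative ~{total_mips:,.0f} MIPS")

# To answer Kurt's question, find the year where this cumulative curve
# crosses your own machine's rating (a K6-233 is a few hundred MIPS).
```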
  • When I joined Digital in 1984 I was hired into the UNIX group at Merrimack, NH. On Day One, they gave me an email account: decvax!ccb. They also gave me a cube and a brand-new VT100. I took it out of the box and connected it to the serial cable in my office. At the other end of this cable was "Abyss", a DEC VAX-11/780. A VAX-11/780 had all of the computing power of perhaps a 386DX25. We probably had about 500MB of disk space and I think we had 4MB RAM. My terminal was one of 40 or so. We'd all be on there awking and grepping and running vi and using the compiler and all having a great time. Abyss was running a new operating system called "ULTRIX", a newly commercialized version of 4.2c BSD UNIX.

    Within a year we introduced the uVAX-II/GPX at roughly the same horsepower level and the world's first commercial release of the MIT X Window System. This was a machine that topped out at 9MB RAM and was designed for a single user...

    In those days a UNIX kernel weighed in at about 300kb (we didn't compress kernels in those days) and you could actually install a fully functional UNIX system in under 27MB of disk space with plenty of room to spare.

    My current computer (k6/2-300, 4.5GB, 128MB RAM) was pretty close to inconceivable as a home computer six years ago when I bought my first PC (486dx/33, 340MB, 8MB RAM).

    All that being said, unless we start to go nutty with voice processing or mondo 3d, I think the current generation hardware is a mature plateau for the desktop platform.
  • Yeah,
    it started shipping with the Commodore 128, I think. It's too bad Commodore got the commercial rights for the Amiga in the States; much better design than today's computers...

  • Well, all I can say is: when are we going to have the computing power of some of the world's fastest computers in a laptop? Take a look at this on top500.org [top500.org]; they're amazing. If those are the fastest computers, imagine how many Pentiums and clones exist, and what if, just what if, we installed pvmd on every computer in the world and each had a 1000BaseTX connection to the others, and higher with the faster computers; what kind of computing power would we have? One can only dream.

    Well, that's a thought I've had a lot. Once I understand how to program for PVM I'll take all my computers and all my roommates' computers and link them :). I also had an "I remember when" conversation with a friend who was telling me that when he was in the army he was on the first team to use digital equipment :), and I sit here and play with old AND and NOR gates, and think of how they even made a processor out of them. Just my 2 cents.
  • In what year will the world's computational power be enough to simulate a neural network of human brain complexity in real time?

    Then... In what year will the power to simulate a human brain be contained in a single supercomputer?

    Then... In what year will the power to simulate a human brain be contained in a computer the size of a human head?

    I read a book by Tipler that was really good and he talked about it. I think his conclusion was about 2060 to get the simulated brain into a desktop computer. He's probably way off, because projecting the future is difficult, but it's still interesting to think about.
  • You can't argue that computers will never be intelligent because they can't do the Travelling Salesman problem.

    I don't know a single person who has the ability to do it either.
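For anyone who, like a poster upthread, is wondering what the Travelling Salesman problem actually is: given distances between cities, find the shortest tour that visits each city once and returns home. A brute-force sketch (with a made-up distance table) shows why neither humans nor computers "do" it at scale:

```python
from itertools import permutations

# Travelling Salesman by brute force: try every tour, keep the shortest.
# dist[i][j] is the distance between cities i and j (a small made-up map).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
n = len(dist)

def tour_length(order):
    # Sum the legs of the tour, wrapping back to the starting city.
    return sum(dist[order[i]][order[(i + 1) % n]] for i in range(n))

best = min(permutations(range(n)), key=tour_length)
print(best, tour_length(best))

# There are (n-1)!/2 distinct tours, so this blows up fast; that
# exponential growth is why "just compute it" stops working.
```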
  • by PD ( 9577 )
    The Cray I did 100 MFLOPS in 1976 when it was introduced.

    My Pentium 133 does 21 MFLOPS according to benchmarks.

    All you people with 450 MHz processors must be somewhere in the ballpark of a Cray I.
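A crude way to put your own number next to those: time a loop of multiply-adds. A sketch in interpreted Python mostly measures interpreter overhead, so treat the result as a floor rather than a LINPACK-style figure:

```python
import time

# Crude MFLOPS estimate: time N floating-point multiply-adds.
N = 1_000_000
x = 1.0000001
acc = 0.0

t0 = time.perf_counter()
for _ in range(N):
    acc = acc * x + 1.0     # one multiply + one add = 2 flops
elapsed = time.perf_counter() - t0

print(f"~{2 * N / elapsed / 1e6:.1f} MFLOPS (acc={acc:.3f})")
```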
  • CricketGraph rules! Saved me much frustration during many chemistry labs, and even a few physics ones...

  • My first computer (well, not mine really, but...) was an SDS-920 (paper-tape reader, paper-tape punch), and then an SDS-930 which had a PRINTER (oops, I meant a printer, but it was LOUD...). Then came a PDP-11 and an HP2107, and they actually had some decent debuggers. Then I had a Compaq 386 with MS Windows something-or-other on top of DOS 3.31 (it was a Compaq, remember?). I was so visually shaken by the inferiority of the thing, I got rid of it after six months. Then came Windows 2.0, at which time I finally decided to spring $5500 for a refurbished SparcStation 1, in 1991. In 1992 we added a SparcStation 1+, and in 1993 we got a Mac 840AV. I have to admit that we have a Sony 300MHz Pentium II running Windows at this time; however, the computers that do the real work (read: anything other than Quicken) are a SparcStation 20 and an UltraSparc 2. I am typing this on a Toshiba Tecra 740CDT running Solaris x86 (being used as an X terminal), with Netscape running on the Ultra, going through ipfilter as NAT on the cable modem. If and when Quicken comes out on Solaris or Linux, that will be the day I say good-bye to ALL MS products. I wish that day were here today.
    BTW, I did load Windows '98 on this machine (the Toshiba, that is) last week, and I was so disgusted with its egocentric behaviour that I took it out on Tuesday and loaded Solaris 7 x86 yesterday. Never again......

    Sinan
  • So what? We're grateful that they built the PC market. Now they can get the hell out of the way for a really good OS.

    At the time, MS-DOS was the best. Its recent iterations are not the best, not even close. Their time is over, and I think they've been more than compensated for their efforts.

    My god, Slashdot is full of knee-jerk flamers. You people are all nuts! All of you who started out your computing experience with ENIAC and used to warm your buttocks with vacuum tubes, and therefore just know that MS is Satan's spawn, can't even comprehend a simple post. The above poster meant that MS is partially responsible for extending the popularity of computers to the masses and thus providing an economic incentive for the rapid increases in computing power that we all adore.

    Even if he did mean that MS was responsible for creating the PC who gives a shit?
  • May I point you over to the newsgroup specifically for flaming?
    You know, guys, I think the supposed-Microserfs we get here are really just spambots or something. Trying to build mailing lists.
    On a side note, I'm beginning to seriously dislike the large number of Anonymous Coward posts. I wonder if there's a better way, that maintains privacy. Okay, that's enough rambling.
  • My first computer was a Timex Sinclair 1000.

    This beast of a supercomputer had 1K of RAM, and you had to code all your programs yourself. It basically was a BASIC interpreter. Hell, it had BASIC commands mapped to the keys to make programming easier.

    Supposedly, there was a RAM add-on, and there were programs it could read from cassette tapes (it had ear and mic jacks), but I have never seen either.
  • Someone once told me that a Cray was a multi-million dollar memory system with a CPU thrown in for free. The advantage of the Cray was that it could deal with very large datasets at high speed. It didn't have cache or virtual memory to slow things down.

    Your typical PC or workstation runs like sludge when you are dealing with huge datasets that blow out the cache.

    Take a look at http://www.cs.virginia.edu/stream/standard/Bandwidth.html for some memory bandwidth benchmarks that illustrate the huge difference between a PC and a "real computer".
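The STREAM benchmark behind that link times a handful of tiny kernels; a sketch of its "triad" kernel using NumPy (assumed available) shows what "memory bandwidth" means here. The array size is chosen to blow out any cache:

```python
import time
import numpy as np

# STREAM-style "triad" kernel: a[i] = b[i] + q * c[i].
# Bandwidth = bytes moved / elapsed time; large arrays defeat the cache.
n = 10_000_000
b = np.random.rand(n)
c = np.random.rand(n)
q = 3.0

t0 = time.perf_counter()
a = b + q * c
elapsed = time.perf_counter() - t0

bytes_moved = 3 * n * 8   # read b, read c, write a (8-byte doubles)
print(f"~{bytes_moved / elapsed / 1e9:.2f} GB/s")
```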

  • [Your question is off topic, but anyway.]


    First of all, running a commercial web server on an overclocked system is probably the most stupid thing you could do.

    The primary objective of a server - any server - is to run _long_ (for months without turning it off) and to run _stable_ (for months without needing to reboot).

    Overclocked systems however are the exact opposite. People who use overclocked systems willingly sacrifice stability and reduce their CPU's lifetime to experience a bit more speed, usually for gaming.


    But anyway, any old 486 running Linux can do fine as a web server - it depends on what kind of web server you need.

    If your site only serves static web pages, even that old 486 can serve a small commercial site and probably won't have too much system load.

    But modern web sites usually serve dynamic content. If not used wisely, CGI scripts, database queries, PHP pages, Java servlets and similar things need quite a bit of computing power and can hog a server.

    Still, I haven't seen many commercial servers that couldn't be run by a machine with computing power similar to a Pentium/133. Of course, a totally different story is a web server that is extremely popular, such as online publications, search engines or slashdot. But how many commercial servers are that popular and have that many hits per minute, anyway?

    Just like with anything else in computing, it's not just the CPU you put into your machine. E.g., the more RAM for your web server, the better.

    Greetings,

    Hanno

  • And it runs on AA batteries too!
  • MS isn't evil; they are a necessity of modern computing: they give us a very good example of what NOT to do.

    Things about "old" computers that did matter: the Apple ][ actually got me started in this industry, then I got my Amiga, and it STILL multitasks better than my Windoze PC (and it hasn't even got a hard drive!).

    As for Evil and Good, that's all opinion. I think that M$ makes a cruddy OS and Linux is a nice solution; personally, though, BeOS looks better and is MUCH easier to install! Game support is starting to creep into the alternate-OS scene... that means soon I can NUKE the space that the BSOD Generator now occupies and get on with some REAL gaming!
  • #ABSOLUTE ROLLOCKS!

    Are you *TRYING* to start a flame war?

    Go $: cd /home !
  • Mwaaahahahaha, I nearly sh*t my pants reading that. You make HORRIBLE jokes.
  • Neither MS, nor Apple, nor even Xerox invented the WIMP interface we know and love. Overlapping windows were thought up in the '40s, the mouse in the '50s, and WYSIWYG in the '60s, before PARC existed.
  • Since you know so much, you're going to tell me when these ideas really were first published, aren't you? Didn't think so.
  • In my original post to Slashdot [slashdot.org], boldly titled I Got News for All of You, I made the following rash, unsubstantiated claim:

    Overlapping windows were thought up in the '40s, the mouse in the '50s, and WYSIWYG in the '60s, before PARC existed.

    A clever Anonymous Coward noted that I was a dumbass and provided no references to back up my statements. Some might argue that merely saying, "You didn't document your sources so what you say is shit!" fails to constitute stimulating intellectual discourse. It's nothing more than small-minded heckling.

    Some might even suggest that you can provide a counter proposition of your own, and if you then "up the ante" and back your own position with documented sources, you've pretty effectively proven your point and made your opponent look like a hothead besides.

    I would like to thank my anonymous benefactor for not doing that to me, because I made several mistakes. Then again, within the context of the discussion, I believe the A.C. was implicitly defending the position that the whole WIMP (Windows, Icons, Menus, Pointers; a shorthand for describing the essential ingredients of a modern GUI) shebang was invented at Xerox PARC, which would be even more wrong than I was.

    My primary source of information is the book (please forgive me) Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything, by Steven Levy. Sure, it's about the Mac, but really, how can you have any kind of meaningful discussion of GUI based computing without mentioning the Mac?

    Yes, I was wrong. It was not multiple windows that were invented in the 1940s, it was information surfing. Vannevar Bush, in his July 1945 Atlantic Monthly article As We May Think [theatlantic.com] describes the sort of ad-hoc, stream-of-consciousness, associative method that characterizes the way we access information on the Web. Bush envisioned a work station with multiple screens, not multiple windows.

    I was also wrong about the mouse being invented in the 1950s. Douglas Engelbart [mit.edu] didn't invent the mouse until the mid 1960s, when he was at SRI. Here's an interesting Smithsonian Institution interview [si.edu] with Douglas Engelbart.

    Sometime after 1966, Alan Kay at the University of Utah (later to join PARC) designed a "personal" computer called Flex that featured high-resolution graphics, icons and multiple windows. However, Kay himself admits (in Insanely Great) its interface was "repellent to users." Kay went on to work on the Alto and Macintosh.

    In his own words [starway.org], Jeff Raskin developed an idea for a graphical, multi-font WYSIWYG computer interface based on a bitmapped display in the mid-1960s, which is described in his 1967 Penn State thesis, A Hardware-Independent Computer Drawing System Using List-Structured Modeling: The Quick-Draw Graphics System. I couldn't find a link to the thesis itself, but it is referenced in the database of the Software Patent Institute [spi.org]. Raskin started the Macintosh project at Apple.

    Xerox PARC was founded [xerox.com] in the year 1970. According to Levy, the Alto prototype was built at the end of 1972. Here's a nice article [microsoft.com] about the Alto.

    Here is another interesting site with a number of links to articles on History of Computing [mediahistory.com]

    So, in the end, I was wrong about multiple windows, wrong about the mouse, right about WYSIWYG, and right about all of these existing before the creation of PARC. I apologize for not checking my facts before posting.

    Finally, to my "small-minded heckler", thank you.

  • Isaac Asimov wrote more than one story about a "Multivac"-type computer that controlled society. The idea was that if you modelled the social and psychological behavior of every person on earth on the computer, the computer would predict events like murders up to a week before they actually happened. This would allow for the dispatch of police to keep the would-be murderer in custody until well after the date of the crime he was to commit.

    On an interesting side note, the story I have in mind ended with the computer trying to kill itself. Knowing everything kind of sucks.
  • I have a C compiler that I can run on my HP 95LX. I think it's called p(t)gcc (puny or tiny) and is somewhat free. It's not entirely ANSI C though =(
  • Remember GBBS? ACOS/MACOS and METAL were very nice. Remember METAL? I spent 4 years rewriting GBBS. Then I got a Mac LC, then a 7100/80 which is running MkLinux right now. I still have a //e with a 9MHz prototype Zip Chip, a 10-meg Xebec HD, and a 20-meg Corvus drive. I also have MAD other Apple // stuff. I never did get a GS. Wish I had. I used to have mad PSE/ASCII art skillz, did BBS ads, etc. Wanna know something funny? Until a few years ago, the pool switch (I think that's what it did) at ABC News in DC was an Apple ][+ running a custom program. I know this cuz my dad wrote that program and used to work there until last year. It also had Midnight Commander pinball (I think that was the name of the game; I still think that was one of the best pinball games ever). Someday (when I get a cable modem) I'll plug into my //e's SSC serial port and have a telnetable ACOS/MACOS GBBS-like BBS. After 4 years of coding, that damn BBS is gonna do something. Back in the day, I didn't have a second line and only one //e, so I couldn't run the BBS.
  • I have an old Casio Zoomer based on an 8088-compatible NEC processor (4MHz! WHEW!). It is roughly the size of the original Apple Newton and hit the market in 1992-1993. It uses the GEOS "operating system" (the PC flavor of the old GUI for the C=64). It actually runs on a version of DOS, and GEOS is only an operating system in the sense that Windows 3.x or Win9x is. For pics, check out www.grot.com
  • My question is how much power does a server need now? Let's say a web server... many of those Sun servers that are in IT room closets are far slower than anything I'd realistically use today, but they did pretty damn well.

    What do you think? Celeron 300a overclocked to 450 running Linux as a web server?
  • I use Linux at work, for FTP and DNS. I use NT for webserving (not my choice).
    I put RH Linux on a partition of my harddrive at home (shared w/ win98) but I never really use it.

    I love Linux at work. The box sits in the corner and never gives me any grief. No complaints, no problems. Just work work work all day long.

    NT I have to re-boot and generally be more "hands-on"

    But at home, I wanna do two things: play games (Quake II) and watch the porn I download from newsgroups. Linux doesn't support many games, and though it runs Quake II, it's not with GL support (not with my card, anyway), and Xanim doesn't support half the codecs that Windows Media Player does. Also, I can't seem to get any ICQ programs to work. So I have to re-boot into Win98 once or twice a day? I don't work on it, I PLAY on it.

    Linux is a great OS. But it's gonna be AT LEAST a year, probably more, until it becomes a decent RECREATIONAL machine as well as the solid workhorse it already is.

    And let's face it, Microsoft has brought computing to the masses. If it wasn't for the fact that hundreds of thousands of people bought their first computer this year, and over half of American homes now have one, how many of us would have jobs? I wouldn't. A website needs an audience.

    My first computer was a TRS-80 Model II. It had 4K of RAM and a regular cassette deck to load the programs in OFF A CASSETTE TAPE! We had to UPGRADE to 16K (K!!) to get a floppy drive. That was 1980.
    Microsoft BASIC courtesy of Bill Gates, built right in. If I hadn't burned programming concepts into my little 10-year-old head back in 1980 with BASIC, would I be as good with computers today? As EMPLOYABLE?

    Bill Gates was a geek in the basement once, too, ya know. Before he turned to the dark side...

    :-)

    -geekd
  • They're on a special "wither-up-and-blow-away" diet. :-)
  • by Wee ( 17189 )
    Yeah, AMD might suck, but they're hella cheap and you get pretty decent performance. And using an AMD CPU also gives me a nice warm fuzzy: my K6-2/333 at home uses no Intel hardware and the hard drive contains zero MS software.

    I run Linux with AMD, Via, Samsung, S3, and 3Dfx chips. There may be an Intel something lurking somewhere, but it's in the minority.

    It was a bit of work trying to pull that off, but if you look around, you can find a decent setup (esp. for home use) that avoids both monopolies. If any CPU can be said to "go with Linux", it's an AMD, and my system runs very well. And it cost me very little money.

    It's very refreshing being able to choose what I want to run. At least we can thank MS and Intel for that.

    -B

  • At work we've got a couple of Sun Ultra 5s (233MHz w/ 256M RAM). Each of the Suns serves about 125k+ dynamic pages a day, running 8 copies of NS Enterprise, Informix, and Cold Fusion, and they've still got room to grow. They're not as rock solid as I'd like (we have to reboot about every 2 weeks) but a lot better than our NT boxes (2-3 times a week).

    We just gathered together some Linux boxes last week, P200s, and ran some UNIX benchmarks on them and the Suns. On overall system performance, the Suns only came out slightly better, but on FPU they were about double the Linux boxes. But what do you expect when you compare new high-end systems to an old box? Buy a decent PII or AMD, put 256M of RAM and a fast SCSI drive in it, and you'll be hard pressed to find enough traffic to max it out.

    Jeremy Phillips
  • Why are these questions of yours "better"? I think the original question is quite interesting. It is difficult to answer accurately, but thoughtful attempts to answer it could be quite interesting.

    Unfortunately, no one seems inclined to even try to answer it. Rather than rising to the occasion, most of the posters to this topic are nitpicking.

    So rather than nitpicking, I am going to try to start an attempt to answer the question. I don't have information readily available to me right now, so I will start by trying to better frame the question by defining some simplifying assumptions.

    1. For the purpose of coming up with an aggregate value for the "total computing power" available in the world over a given time frame, we should assume that the measurement for power scales linearly across the number of systems (or processors) we are measuring.

    2. For the purpose of simplifying the "power" of any single system, we should assume that any two systems whose performance differs by no more than two-fold have equivalent performance. This allows us to group microprocessor-based systems of different architectures into similar performance categories based on processor generation. Unfortunately, I don't know quite how to apply this to vector supercomputers and mainframe systems; hopefully someone else can. This assumption introduces inaccuracy, but only results in an error of less than two years or so.
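Assumption 2 can be applied mechanically: drop each system into a power-of-two performance bin and sum units times a representative MIPS rating per bin. A sketch, with a purely hypothetical inventory:

```python
import math
from collections import defaultdict

# Group systems into 2x-wide performance bins (assumption 2), then sum.
# The inventory is HYPOTHETICAL, just to show the bookkeeping.
inventory = [
    # (name, units in service, approximate MIPS each)
    ("486DX-33",    5_000_000,  27),
    ("Pentium-133", 2_000_000, 200),
    ("K6-233",      1_000_000, 400),
]

bins = defaultdict(float)
for name, units, mips in inventory:
    bins[int(math.log2(mips))] += units * mips  # one bin per 2x generation

for b in sorted(bins):
    print(f"bin 2^{b}-2^{b + 1} MIPS: ~{bins[b]:,.0f} total MIPS")
print(f"world total: ~{sum(bins.values()):,.0f} MIPS")
```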
  • Given that, for good or evil, M$ had a hand in spreading PCs and desktop computing to the level it is today, the real question is what is M$'s future value?

    What can they do to advance/spread the future of computing? What will they do, for better or worse, that will influence how you use your computing power in the next 10 years? How long will M$ last? How will they need to restructure/change to adapt to the future? Keep in mind IBM and their corporate makeover(s), and what it took to change from a market leader to a market staple. It really isn't that difficult to imagine M$ getting pushed off the desktop in the near future; Apple's OSX/OS10 utilizes BSD and NeXTStep, both Intel-compatible, and thus code-wise a viable Windows replacement. Linux is making headway, and in the near future will become a viable desktop consumer OS. There also stands in the shadows BeOS, and even alternative chipsets/OSes as well, like Apple/PowerPC, Linux/Alpha, or Linux/PowerPC.

    Any comments?
    AS
  • Unfortunately, I don't have any useful numbers for people to play with/banter around. My response here is more about usefulness.

    In the dark ages before PC enlightenment -- i.e., machines the masses could afford to own themselves -- the mainframes and minicomputers were handling 40-100 terminals, batch jobs, etc. I remember a professor of mine complaining how the cost of CPU time could be quantitatively valued in DOLLARS per MICROSECOND.

    A batch project run in 1980 using a high end 1977 IBM 370 (with all other users locked out) involving about 100 megabytes of data (cross referencing text, etc.) took about three days to run. Interestingly, the IS department who generally operated the machine expected the data run to take TWO WEEKS.

    I can compile a 10M C++ code library from scratch on my K6/266-equipped machine, including all of the added error checking required for code compilation, linking, etc., in about 15 minutes. I would suspect that other PC developers have larger projects which have been compiled even faster.

    If all of the other mathematical elements were held constant, this would mean that my K6/266 would have taken about 2-1/2 hrs. to process the 1980 text run, about 40X faster than the high end 370.

    Because I built it (last year), my total cost of components (not counting the monitor, which I already owned) was about $580.

    The system 370 cost about $3,000,000.

    Funny thing. The K6 is idle about 75% of the time now and will be retired later this year; the 370 wasn't fully retired until around 1991.

    My how things change...




  • I don't remember the name of the program, but there was a C compiler for the Apple II series that didn't require more than 64k (possibly 48k) to run. 1MB is huge!
  • He said "ballpark numbers".. Basically, it would be a very very rough estimate, but you could then have another bragging right. :-)
  • I literally learned to program on a C64 in '83. Everything I have learned since then I can trace back to my Commodore. It made it easy..

    The manual was all I needed to learn BASIC, and the reference manual (which I got from a friend and photocopied) taught me pretty much everything I needed to learn assembler.

    Plus, there were the magazines... oh man, I remember when I was first learning, someone gave me a C64 mag. It had the BASIC program to let you type in the assembler programs it had inside: long lines of hex numbers to type in, unless you got the version with the disk... but hey, I was a poor kid. :-) I typed those in 'til my hands hurt. Had some great stuff in those... Remember SpeedScript? Great word processor for the C64. Had a bunch of add-ons I used too, like the one to let you display 80 columns on the screen for a print preview. Very cool.

    Ahh, those were the days. :-)
  • We had a Tandy 1000 PC in our house... still being used, until last year. Now that is sad! And my grandmother still uses her Apple II for accounting... she still believes it is the fastest thing on the market..... Hey... really, we're not related!!!
  • I think in a way they do know the answer before the question is asked, with branch prediction and all that nice kind of stuff I don't really understand well enough to talk about.
  • I used to play Pong. I loved the original NES. The Commodore 64 always had the best games. Thanks to this article, I feel really old, but I too wonder about the same stuff. As for Kurt, there's a poster about the growth of the internet that may be of some use to you. I'll see if I can track down the publishers and get you in touch with them.
  • If you're just serving out static Web pages, it doesn't matter what kind of machine you have, so long as it's reliable.

    I took my PowerPC 601/75MHz (LinuxPPC) with 16MB of RAM and hit it with a few million httpd requests. Note that this computer is worth about $50 on the open market. It could saturate more than five T1s and serve out many millions of pages per day.

    The only place a Web Server would get bogged down is on CGIs and database accesses. To what extent it gets bogged down is entirely dependent on the CGI or database; the Web serving software is not a significant part of the equation.

    CGIs don't tend to be _that_ big of a deal, though. Really the rough spots are around database access. That's the only place where real money should be spent.

    Cartman
    twerges@hotmail.com
  • AMD sucks... why would you want to buy cheap chips that don't perform nearly as well as Intel? On the server side the Xeon will kick any K6 straight across the ass. As will the Pentium Pro 200.

    I would never run my Linux machines on an AMD chip.

  • Yeah... I have to agree with this... the only advantage to the Pentium III thus far is the 3D enhancements... ooohhh... browse the web in 3D... ahhhh... play 3D like never before... hmmm... I'll just stick with my two K6-2/400s with TNT video cards... I think I'll be OK for a while...
  • Microsoft is responsible for you being able to use a computer today????? Wa Ha Ha Ha Ha Ho Ho He He. Yeah, right. People were using computers just fine before M$ came around. My first three computers had nothing to do with M$. And let's be honest here: if you are talking about a GUI, M$ did not even know what one was until a Mac prototype found its way to Redmond in late 1983 so that M$ could write software for it. Even today, 15 years after the Mac was produced, I still cannot enter all of the characters I want into file names on the NT machines we have here. And just because M$ publishes a lot of the software out there does not mean that it's good and we have to be satisfied with it. Most of it is bloated, with inefficient code, and is not the best way to go. NT 5 is apparently up around 30 million lines of code??? Let's say they only have a 5% bug rate on release. That's around 1.5 million bugs, folks. Oh, excuse me, issues. M$ programmers should have to learn how to program on the PalmOS before being able to program for Windows. The code there is small, tight, and efficient. There is no way we are getting another NT server for this lab. Perhaps a Qube, or if Apple can really make the UNIX-like OSX work, I'll think about a Mac server.
  • by BWJones ( 18351 )
    Absolutely, this is why we would consider the Qube. This machine has Linux as the standard OS. For a server this would be a fine machine, and it supports Windows and AppleShare networking as well. Right now we have NT running on a DEC Alpha, and this is starting to show its age. So the system upgrade would be either a Mac running OSX or a Qube running Linux. The advantage of the Mac would be that I could run other applications as well, such as Photoshop etc.
  • This is a list of the top 500 supercomputers in the world. I don't know where you'd find a list of the progression over the years though.

    http://www.netlib.org/benchmark/top500/top500.list.html
  • It's hard to imagine how powerful computers are going to become... but it would be neat to compare my Pentium with ENIAC, for instance. Obviously a PC today ties legacy supercomputers from somewhere in the 1960s.
  • The Cray 1, which sold for $7M USD around 1977, had a peak performance of 133 megaflops, as the Cray site (until its recent integration with the SGI site) claimed. Their 1985 machine (don't remember the name, though) had 512MB of RAM and 16 gigaflops top performance. Their 1993 machine overcame the teraflop limit. To compare, my AMD K6 200MHz has reached a peak performance of around 40 megaflops... or maybe it was less; I don't know if I understood the values right.
  • Excuse me? Lest we forget that Paul Allen PORTED BASIC from the mainframe to the Altair (and probably took this code verbatim and sold it to CBM). And DOS was 86-DOS (or something like that) that Bill BOUGHT from someone else and his gaggle of hacker friends developed. Windows has code which is LICENSED from Apple, but one wonders why Bill worked on OS/2 before splitting all ties with IBM. Bill Gates is a shrewd businessman and otherwise is an incompetent MORON. Go read his responses in direct testimony at the USDOJ's site [usdoj.gov] and see how dumb this moron is.
  • My first computer memories are of when I was 4 years old and my dad sat me down in front of his TRS-80 Model 4... the rest is history...
  • "What do you think? Celeron 300a overclocked to 450 running Linux as a web server?"
    ---
    I've got exactly that running and it works great. I don't get a huge number of hits, around 18k/day, but half of those are SQL queries on a 250k-entry table. That's the only hit I notice, and those are pretty brief. Other than that, it doesn't faze it whatsoever. And stability hasn't been a problem either (I guess I'm another one of the lucky ones): a month of uptime with rc5 and no extra cooling is standard.
  • You can start with a whole lot of MIPS on this top 500 supercomputers site: www.top500.org

    How would the #1 compare to an average PC?

    #1 Intel ASCI Red (9152 processors)
    #2 SGI T3E1200 (1084 processors)
    #3 SGI T3E900 (1324 processors)

    And don't forget those Beowulfen!
