Ask Slashdot: How Powerful is Your Computer? 237
Kurt submitted this interesting question:
"All of us have, at one time or another,
played the 'I remember when' game. For me
it was 'I remember when the first acoustic
coupler modems came out for the Apple II'.
My dad remembers when programming
meant soldering wires. I found myself recently
sitting in front of my K6-233 wondering
things like: 'At what time was the entire
world's computing power equal to my 233?'
and 'How soon will I be able to buy a machine
more powerful than the world's computing power
when I was born?' What I would like to see is
a graph of the entire world's (desktop-to-mainframe)
computing power (machines times MIPS
[or other power rating]) - preferably with some
"giants of the day" plotted as a reference.
Does anyone know of such a thing? I'm just looking
for ballpark numbers, here."
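One way to attack this is a back-of-the-envelope extrapolation. Assuming exponential growth in both per-machine power and the installed base (every constant below is an invented placeholder, not survey data), the shape of the calculation looks like this:

```python
# Back-of-envelope sketch of the question above: in what year does a
# single desktop machine exceed the world's total computing power as
# of your birth year?  Every constant here is an invented placeholder
# (doubling periods, installed base, MIPS ratings), not survey data.

DOUBLING_YEARS = 1.5           # assumed Moore's-law doubling period
DESKTOP_MIPS_1999 = 233.0      # rough rating for a K6-233-class desktop

def desktop_mips(year):
    """Assumed power of one new desktop, extrapolated from 1999."""
    return DESKTOP_MIPS_1999 * 2 ** ((year - 1999) / DOUBLING_YEARS)

def world_mips(year, machines_1999=4e8):
    """Crude world total: assumed installed base times per-machine MIPS."""
    # Assume the installed base itself doubles every 4 years.
    machines = machines_1999 * 2 ** ((year - 1999) / 4.0)
    return machines * desktop_mips(year)

def crossover_year(birth_year):
    """First year one desktop beats the world's total power at birth_year."""
    target = world_mips(birth_year)
    year = birth_year
    while desktop_mips(year) < target:
        year += 1
    return year
```

With these toy constants, someone born in 1970 hits the crossover in the early 2000s; the point is the shape of the calculation, not the particular numbers.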
Hehe (Score:1)
Early computers (Score:1)
Not gonna happen (Score:1)
Last year I was playing a game of Trivial Pursuit.
I answered a question before it was read. I got the question right. The answer was 5.
It is possible to do this for any question; it's just that the odds are extremely small.
It's the 1,000 monkeys at 1,000 typewriters type of scenario.
So it is possible for a computer to answer a question before it is asked, and get it correct... every time.
Check out the new Ray Kurzweil book (Score:1)
Kurzweil's newest, The Age of Spiritual Machines, includes comparison tables for things like the total computer power available in 1950 and at present, plus some wacky predictions
MAC vs PC: processor arm-wrestling (Score:1)
Re: (Score:1)
Long live TRS-80!!!! (Young Pup) (Score:1)
Weenies, one and all. My first true home computer (as opposed to the many development systems that my Dad brought home from work) was a TRS-80 Model I... no colour, nothing... sheesh.
My first computer memory is of playing with a teletype when I was 3... which was only 22 years ago... so perhaps I'm a weenie too.
For the uninitiated, a teletype was what we used before VDUs (printer + keyboard + paper tape reader/writer). I only remember the fast 300 baud ones... 75 baud and lower was before my time...
Power, more power (Score:1)
you can find it here [delta.is], if you are interested
Jón
Computer History 101 (Score:1)
Computer History 101 (Score:1)
Whoops, that was dumb! Corrected link below :(
Here is a link to that book on Amazon: Computer: A History of the Information Machine [amazon.com].
It doesn't spend much time past the "PC Revolution" (read: no Windows 95 or Linux).
M$ 4ever (Score:1)
mind is not algorithmic (Score:1)
Power to serve (Score:1)
When I came to this company I was a die-hard Linux geek. Now I see that FreeBSD makes a better production server. I still use SuSE for my desktop, as it configures so easily. I don't know if you can get the above numbers with Linux. If you have, please let me know.
More basic Q - Travelling Salesman? (Score:1)
What's the Travelling Salesman problem?
dylan_-
--
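Short answer: given a list of cities and pairwise distances, find the shortest round trip that visits each city exactly once and returns home. It's famous because the obvious approach checks (n-1)! tours, which blows up fast. A tiny brute-force sketch, with an invented distance matrix:

```python
# Brute-force Travelling Salesman: enumerate every ordering of the
# cities and keep the shortest round trip.  With n cities there are
# (n-1)! tours to check, which is why this only works for tiny n.
from itertools import permutations

def tour_length(tour, dist):
    """Length of the round trip visiting `tour` in order, returning home."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def shortest_tour(dist):
    """Exhaustive search over all tours, fixing city 0 as the start."""
    best_len, best_tour = None, None
    for perm in permutations(range(1, len(dist))):
        tour = (0,) + perm
        length = tour_length(tour, dist)
        if best_len is None or length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

# Invented symmetric distance matrix for four cities.
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
```

No known algorithm solves the general case in polynomial time, which is why it comes up in these "what could we do with all that computing power" discussions.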
NOTHING was slower than my old computer... (Score:1)
Wasn't that the same as a ZX81 over here (UK)? If so, there was a 16K RAM pack that you could slot in, though it tended to wobble a bit, so you should really tape it in. Then you could play games like Jet Set Willy, and I think Hungry Horace was a 16K game. Games used to say on the tape cover whether they were 16K or 48K (I have a ZX Spectrum 48K)....I remember when you could walk into John Menzies or Woolworths and see the shelves filled with games for Spectrum, Commodore64 and Amstrad.
If Slashdot had been around then, you would have had ACs posting "Commie64 sux" followed up with "Go play with your rubber keyed beer mat while we use a *real* computer"...the only thing everyone agreed on was that Amstrad was crap :-)
dylan_-
--
NOTHING was slower than my old computer... (Score:1)
Oops....well, I never had a ZX81....I thought with the 16K pack it could run the same games the Speccy could....well, you live and learn
dylan_-
--
I agree (Score:1)
Umm, that suggests that Linux was actively competing with Windows for users. When Win3.1 came out, I think Linux had what, a thousand users? Even until the last few years Linux wasn't even a blip on Microsoft's radar. Linux never lost the battle... it hasn't even started yet.
Daniel
Don't bother with the overclock. (Score:1)
I wouldn't overclock a server, not even with one of those rock-solid 450a jobs. Overclocking is for gaming (and that's what I use my Celeron for).
Not gonna happen... (Score:1)
Frankly, I still don't know what consciousness has got to do with quantum mechanics. AFAIK there is no evidence for quantum effects being relevant to the performance of the human brain.
an 8088 palm top? - how could it be palm sized.. (Score:1)
http://www.rim.net
Complete with wireless networking.
Win95 = Amiga90 (Score:1)
Yes, M$ made a big leap from CP/M, er, PC-DOS, er, MS-DOS to Win95, but when you're starting from that far in the hole, even someone else's 5-year-old technology is going to look good.
Oh, please... (Score:1)
Let's talk about the old computers, shall we? Did they have Windows? No. But better interfaces existed before then, most notably in MacOS. Did they have Word? No, but before Word came along other graphical word processors existed. I might add something here: the first version of Word in the form we know it today was written for the Mac.
Get your history right next time you shoot your mouth off.
Mixed metaphors. (Score:1)
Xerox did develop the GUI (along with Smalltalk, ethernet, the mouse and a number of other things) at their Palo Alto Research Center (PARC). It was a lot earlier than 1980, though. Actually, around 1974, I believe.
And it was called the Alto.
Although they did in 1981 release the Star, which was a commercial version of the Alto (which wasn't much more than a prototype), it never really sold much.
But you were heading in the right direction
- Sean
Power, more power (Score:1)
Heres something I found (Score:1)
Computer Organization & Design, Hennessy & Patterson
http://www.cis.ohio-state.edu/~mahrt/table.html
I bought an Alpha 533 this morning. (Score:1)
I hope I can find memory for the bastard.
--
Where? (Score:1)
It's only an SX, and they told me I got one of only 2 remaining. They have LX's and up as well, but for a little more money.
--
Apple ][ Manuals (Score:1)
My two cents.
Computer manuals have gone WAY downhill... (Score:1)
an old TRS-80 Color Computer (Model 1, if you could call it that), which was $5 from a yard sale.
The computer came with nothing but a BASIC manual and another "user's manual". It taught me everything I never
wanted to know about how the machine's ports work. Moreover, the BASIC manual was excellent.
In the back it even explained how to do low-level graphics programming with the machine (since that was
all you could do; well, not really, since that computer had Extended Color BASIC instead of just Color BASIC, but anyway),
plus a few small pointers on assembly language (although I never did get into ASM, and still haven't).
Too much time on our hands.. (Score:1)
oh duh!@#! It's all about the anon. insults! (Score:1)
HP calculators (Score:1)
Historical Power (Score:1)
Computer manuals have gone WAY downhill... (Score:1)
One could literally teach oneself BASIC, Assembly and all kinds of programming tricks with those things.
If documentation of that quality were available for today's PC's, I would cry for joy.
Kythe
(Remove "x"'s from
M$ 4ever (Score:1)
Apple, Power Computing, IBM and Motorola are largely responsible for the computer I use today. Microsoft is responsible for little more than ensuring that I have trouble finding software to run on it.
Microsoft hasn't done a damn thing for the computer industry, other than keep the hopelessly archaic X86 platform afloat. Yay?
- Darchmare
- Axis Mutatis, http://www.axismutatis.net
Idiot! MS has _never_ invented anything (Score:1)
Actually, I think that the proper quote would be ``Windows 95, brought to you by the creators of edlin.''
Ever had to use edlin for anything serious? Not good for your health.
Great Leap? (Score:1)
Answer before question? (Score:1)
Webservers (Score:1)
Management was ready to buy new servers this year for our offices (we have a Pentium Pro 180).
Trying to be the "conscientious employee", I told them the servers barely get any utilization as it is (they're primarily file/mail/print servers); if anything, they should buy some more storage and maybe RAM.
Federico
Old != impotent (Score:1)
1) Lies
2) Damned Lies
3) Statistics
4) Benchmarks
5) MicroSoft Product (schedule) Announcements
Add to this a 6th:
6) Presidential Testimony.
(As for the last- Some of the Democratic Senators wanted Pres. Clinton to testify in front of the Senate but it took over an hour for the Republican Senators to stop laughing and explain to them that they had no way to swear him in.)
-soup
(No man can learn about impotence the hard way.)
I agree -- NOT! (Score:1)
This is not to say that M$ is evil (though they may be) or that linux is good (though it is), but really -- after 10+ years of development, you'd kind of *expect* some leaps, wouldn't you?
Just a guess (Score:1)
Now these computers were extremely slow, but the sheer number of them would make up for this deficiency.
Another question would be, "When was the time when the fastest commercially available computer had the same processing power as a 233MHz Pentium?" The answer to that one would be around the late 60's (I think). BTW, the Cray I (which appeared around 1977???) had a "speed" of about 150 megaflops.
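Taking the thread's figures at face value (the ~150 megaflop Cray claim here, and the 21 MFLOPS Pentium/133 benchmark reported in another post), and with the big assumption that FLOPS scale linearly with clock speed, the ballpark works out like so:

```python
# Ballpark comparison using the figures from this thread: ~150 MFLOPS
# claimed for the Cray-1, and a 21 MFLOPS benchmark reported for a
# Pentium/133 in another post.  Scaling FLOPS linearly with clock
# speed is a crude assumption, but fine for an order-of-magnitude check.
CRAY_1_MFLOPS = 150.0
PENTIUM_133_MFLOPS = 21.0

# Naive estimate for a 450 MHz part of the same family.
p450_estimate = PENTIUM_133_MFLOPS * (450 / 133)
ratio_to_cray = p450_estimate / CRAY_1_MFLOPS
```

So a 450MHz chip lands around 70 MFLOPS by this crude scaling: the same order of magnitude as the Cray-1, though only roughly half of it.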
hp computers (Score:1)
What HP really excels at is printers, calculators and test equipment. Their servers are a pain to deal with at best beyond, "ohh, a problem, let's reboot."
And then what? (Score:1)
And then things get interesting.
AI so far has completely flopped. Well so what? What is the computational power and memory of a brain? Of a computer? Some problems are only addressable in hardware, and I honestly believe AI to be one of them. So when the AI folks finally have the necessary computational power, I think that AI will prove to be like chess-playing, like speech recognition, like lots of problems. Quite solvable with the right equipment.
But then we have a very nasty situation. For $70,000/year you can hire this human, or for $1000 you can buy a computer capable of the same job, but it works 24 hours a day. Do you want to hire or buy?
This dilemma has arisen in the past and the human almost always loses. However historically humans are more flexible and so there have always been plenty of other things that talking monkeys can easily do and machines can't yet. So with the machines handling the repetitive stuff, and the monkeys doing more interesting stuff, we all wind up generally better.
But what do the monkeys do when there is *nothing* that the machine can't do better? Sure at that point it is theoretically possible for us all to just be provided with the necessities of life - but does anything in the history of our economic system indicate that that will happen? Not that I can see!
I am not a techno-phobe, but I can tell you that I am preparing to have enough saved that I won't economically need to work after that point...
Ben Tilly
M$ 4ever (Score:1)
Raise a glass to:
The various incarnations of (now) caldera's OpenDOS, OS/2, MacOS, AmigaOS, etc.
All the third party vendors driven under by Microsoft bundling.
Thank the folks who wrote Winsock.
Thank the original author of QDOS, too.
And, of course, CP/M
These are the people Microsoft has copied, and the reason why MS-DOS 1.0 isn't still the default OS installed on the PCs of the world.
Do it yourself... (Score:1)
matguy
Net. Admin.
Some Perspective... (Score:1)
Within a year we introduced the uVAX-II/GPX at roughly the same horsepower level and the world's first commercial release of the MIT X Window System. This was a machine that topped out at 9MB RAM and was designed for a single user...
In those days a UNIX kernel weighed in at about 300kb (we didn't compress kernels in those days) and you could actually install a fully functional UNIX system in under 27MB of disk space with plenty of room to spare.
My current computer (k6/2-300, 4.5GB, 128MB RAM) was pretty close to inconceivable as a home computer six years ago when I bought my first PC (486dx/33, 340MB, 8MB RAM).
All that being said, unless we start to go nutty with voice processing or mondo 3d, I think the current generation hardware is a mature plateau for the desktop platform.
Oh, please... (Score:1)
It started shipping with the Commodore 128, I think. It's too bad Commodore got the commercial rights for Amiga in the States; it was a much better design than today's computers...
dreaming of the future (Score:1)
Well, that's a thought I've had a lot. Once I understand how to program for PVM, I'll take all my computers and all my roommates' computers and link them.
A better question would be (Score:1)
Then... In what year will the power to simulate a human brain be contained in a single supercomputer?
Then... In what year will the power to simulate a human brain be contained in a computer the size of a human head?
I read a book by Tipler that was really good and he talked about it. I think his conclusion was about 2060 to get the simulated brain in a desktop computer. He's probably way off, because projecting the future is difficult, but it's still interesting to think about.
More basic Q (Score:1)
I don't know a single person who has the ability to do it either.
Cray 1 (Score:1)
My Pentium 133 does 21 MFLOPS according to benchmarks.
All you people with 450 MHZ processors must be somewhere in the ballpark of a Cray I.
True Motivation of Home Machines! (Score:1)
M$ 4ever (Score:1)
was an SDS-920 (paper-tape reader, paper-tape punch), and then an SDS-930 which had a PRINTER
(ooops, I meant a printer, but it was loUD...)
Then came the PDP-11 and HP2107. And they actually
had some decent debuggers. Then I had a Compaq
386 with MS windows something or another on top
of DOS 3.31 (it was a Compaq remember?). I was
so visibly shaken by the inferiority of the thing, I got rid of it after six months. Then came Windows 2.0, at which time I finally decided to spring $5500 for a refurbished SparcStation 1, in
1991. In 1992 we added a Sparcstation 1+, and in 1993
we got a Mac 840AV. I have to admit that we have a Sony 300MHz Pentium II running Windows at this time; however, the computers that do the real work (read: anything other than Quicken) are a Sparcstation 20 and an UltraSparc 2. I am typing this on a Toshiba Tecra740CDT running Solaris x86 (being used as an X-terminal), with Netscape running on the Ultra, going through ipfilter as NAT on the cable modem. If and when Quicken comes out on Solaris or Linux, that will be the day I say good-bye to ALL MS products. I wish that day were here today.
BTW, I did load Windows '98 on this machine (the Toshiba, that is) last week, and I was so disgusted with its egocentric behaviour that I took it out on Tuesday and loaded Solaris 7 x86 yesterday. Never again......
Sinan
M$ 4ever (Score:1)
At the time MS-DOS was the best. Its recent iterations are not the best, not even close. Their time is over, and I think they've been more than compensated for their efforts.
People posting against MS are idiots (Score:1)
Even if he did mean that MS was responsible for creating the PC who gives a shit?
Fishin' with flame bait on /. (Score:1)
You know, guys, I think the supposed-Microserfs we get here are really just spambots or something. Trying to build mailing lists.
On a side note, I'm beginning to seriously dislike the large number of Anonymous Coward posts. I wonder if there's a better way, that maintains privacy. Okay, that's enough rambling.
NOTHING was slower than my old computer... (Score:1)
This beast of a super computer had 1K of RAM, and you had to code all your programs yourself. It basically was a BASIC interpreter. Hell, it had BASIC commands mapped out to the keys to make programming easier.
Supposedly, there was a RAM add-on, and programs it could read from cassette tapes (it had ear and mic jacks), but I have never seen either.
Cray performance (Score:1)
Your typical PC or workstation runs like sludge when you are dealing with huge datasets that blow out the cache.
Take a look at http://www.cs.virginia.edu/stream/standard/Bandwi
Webservers (Score:1)
First of all, running a commercial web server on an overclocked system is probably the most stupid thing you could do.
The primary objective of a server - any server - is to run _long_ (for months without turning it off) and to run _stable_ (for months without needing to reboot).
Overclocked systems however are the exact opposite. People who use overclocked systems willingly sacrifice stability and reduce their CPU's lifetime to experience a bit more speed, usually for gaming.
But anyway, any old 486 running Linux can do fine as a web server - it depends on what kind of web server you need.
If your site only serves static web pages, even that old 486 can serve a small commercial site and probably won't have too much system load.
But modern web sites usually serve dynamic content. If not used wisely, CGI scripts, database queries, PHP pages, Java servlets and similar things need quite a bit of computing power and can hog a server.
Still, I haven't seen many commercial servers that couldn't be run by a machine with computing power similar to a Pentium/133. Of course, a totally different story is a web server that is extremely popular, such as online publications, search engines or slashdot. But how many commercial servers are that popular and have that many hits per minute, anyway?
Just like with anything else in computing, it's not just the CPU you put into your machine. E.g., the more RAM for your web server, the better.
Greetings,
Hanno
Performance vs Usefulness (Score:1)
M$ 4Never (Score:1)
Things about "old" computers that did matter: the Apple ][ actually got me started in this industry, then I got my Amiga, and it STILL multitasks better than my Windoze PC (and it hasn't even got a hard drive!).
As for Evil and Good, that's all opinion. I think that M$ makes a cruddy OS and Linux is a nice solution; personally, though, BeOS looks better and is MUCH easier to install! Game support is starting to creep into the alternate OS scene... that means soon I can NUKE the space that the BSOD Generator now occupies and get on with some REAL gaming!
M$ 4ever (Score:1)
Are you *TRYING* to start a flame war?
Go $: cd
M$ 4ever (Score:1)
I got news for all of you (Score:1)
You got news for me (Score:1)
I got news for all of you: CORRECTION (Score:1)
I got news for all of you: CORRECTION (Score:1)
In my original post to Slashdot [slashdot.org], boldly titled I Got News for All of You, I made the following rash, unsubstantiated claim:
A clever Anonymous Coward noted that I was a dumbass and provided no references to back up my statements. Some might argue that merely saying, "You didn't document your sources, so what you say is shit!" fails to constitute stimulating intellectual discourse. It's nothing more than small-minded heckling.
Some might even suggest that you can provide a counter proposition of your own, and if you then "up the ante" and back your own position with documented sources, you've pretty effectively proven your point and made your opponent look like a hothead besides.
I would like to thank my anonymous benefactor for not doing that to me, because I made several mistakes. Then again, within the context of the discussion, I believe the A.C. was implicitly defending the position that the whole WIMP (Windows, Icons, Menus, Pointers; a shorthand for describing the essential ingredients of a modern GUI) shebang was invented at Xerox PARC, which would be even more wrong than I was.
My primary source of information is the book (please forgive me) Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything, by Steven Levy. Sure, it's about the Mac, but really, how can you have any kind of meaningful discussion of GUI based computing without mentioning the Mac?
Yes, I was wrong. It was not multiple windows that were invented in the 1940s, it was information surfing. Vannevar Bush, in his July 1945 Atlantic Monthly article As We May Think [theatlantic.com] describes the sort of ad-hoc, stream-of-consciousness, associative method that characterizes the way we access information on the Web. Bush envisioned a work station with multiple screens, not multiple windows.
I was also wrong about the mouse being invented in the 1950s. Douglas Engelbart [mit.edu] didn't invent the mouse until the mid 1960s, when he was at SRI. Here's an interesting Smithsonian Institution interview [si.edu] with Douglas Engelbart.
Sometime after 1966, Alan Kay at the University of Utah (later to join PARC) designed a "personal" computer called Flex that featured high-resolution graphics, icons and multiple windows. However, Kay himself admits (in Insanely Great) its interface was "repellent to users." Kay went on to work on the Alto and Macintosh.
In his own words [starway.org], Jeff Raskin developed an idea for a graphical, multi-font WYSIWYG computer interface based on a bitmapped display in the mid-1960s, which is described in his 1967 Penn State thesis, A Hardware-Independent Computer Drawing System Using List-Structured Modeling: The Quick-Draw Graphics System. I couldn't find a link to the thesis itself, but it is referenced in the database of the Software Patent Institute [spi.org]. Raskin started the Macintosh project at Apple.
Xerox PARC was founded [xerox.com] in the year 1970. According to Levy, the Alto prototype was built at the end of 1972. Here's a nice article about the Alto [research.microsoft.com].
Here is another interesting site with a number of links to articles on History of Computing [mediahistory.com]
So, in the end, I was wrong about multiple windows, wrong about the mouse, right about WYSIWYG, and right about all of these existing before the creation of PARC. I apologize for not checking my facts before posting.
Finally, to my "small-minded heckler", thank you.
Re: Guessing before it happens (Score:1)
On an interesting side note, the story I have in mind ended with the computer trying to kill itself. Knowing everything kind of sucks.
C on really small machines (Score:1)
Apple ][ forever! (Score:1)
and only one
an 8088 palm top? - how could it be palm sized.. (Score:1)
Power, more power (Score:1)
What do you think? Celeron 300a overclocked to 450 running Linux as a web server?
linux 4 work, Win98 4 play (Score:1)
I put RH Linux on a partition of my harddrive at home (shared w/ win98) but I never really use it.
I love Linux at work. The box sits in the corner and never gives me any grief. No complaints, no problems. Just work work work all day long.
NT I have to re-boot and generally be more "hands-on"
But at home, I wanna do two things: play games (Quake II) and watch the porn I download from newsgroups. Linux doesn't support many games, and though it runs Quake II, not with GL support (not w/ my card, anyway), and Xanim doesn't support 1/2 the codecs that Windows Media Player does. Also, I can't seem to get any ICQ programs to work. So what if I have to re-boot Win98 once or twice a day? I don't work on it, I PLAY on it.
Linux is a great OS. But it's gonna be AT LEAST a year, probably more, until it becomes a decent RECREATIONAL machine as well as the solid workhorse it already is.
And let's face it, Microsoft has brought computing to the masses. If it wasn't for the fact that hundreds of thousands of people bought their first computer this year, and over 1/2 of American homes now have one, how many of us would have jobs? I wouldn't. A website needs an audience.
My first computer was a TRS-80 Model II. It had 4K of RAM and a regular cassette deck to load the programs in OFF A CASSETTE TAPE! We had to UPGRADE to 16K (K!!) to get a floppy drive. That was 1980.
MS-DOS courtesy of Bill Gates. BASIC language built in. If I hadn't burned programming concepts into my little 10-year-old head back in 1980 with BASIC, would I be as good w/ computers today? As EMPLOYABLE?
Bill Gates was a geek in the basement once, too, ya know. Before he turned to the dark side...
:-)
-geekd
Enough chatter, why you not find this? (Score:1)
Please do NOT feed the Trolls! (Score:1)
M$ 4ever (Score:1)
I run Linux with AMD, Via, Samsung, S3, and 3Dfx chips. There may be an Intel something lurking somewhere, but it's in the minority.
It was a bit of work trying to pull that off, but if you look around, you can find a decent setup (esp. for home use) that avoids both monopolies. If any CPU can be said to "go with Linux", it's an AMD, and my system runs very well. And cost me very little money.
It's very refreshing being able to choose what I want to run. At least we can thank MS and Intel for that.
-B
Power, more power (Score:1)
We just gathered together some Linux boxes last week (P200s) and did some UNIX benchmarks on them and the Suns. On overall system performance, the Suns only came out slightly better, but on FPU they were about double the Linux boxes. But what do you expect when you compare new high-end systems to an old box? Buy a decent PII or AMD, put 256M of RAM and a fast SCSI drive in it, and you'll be hard pressed to find enough traffic to max it out.
Jeremy Phillips
A better answer would be: (Score:1)
Unfortunately, no one seems inclined to even try to answer it. Rather than rising to the occasion, most of the posters to this topic are nitpicking.
So rather than nitpicking, I am going to try to start an attempt to answer the question. I don't have information readily available to me right now, so I will start by trying to better frame the question by defining some simplifying assumptions.
1. For the purpose of coming up with an aggregate value for the "total computing power" available in the world over a given time frame, we should assume that the measurement of power scales linearly with the number of systems (and processors) we are measuring.
2. For the purpose of simplifying the "power" of any single system, we should assume that any two systems whose performance differs by no more than two fold have equivalent performance. This allows us to group microprocessor-based systems of different architectures into similar performance categories based on processor generation. Unfortunately, I don't know quite how to apply this to vector supercomputers and mainframe systems. Hopefully someone else can. This assumption introduces inaccuracy, but this inaccuracy only results in an error of less than two years or so.
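To show the shape of the calculation those two assumptions imply (every count and MIPS rating below is invented purely for illustration):

```python
# Shape of the aggregation the two assumptions above describe: bucket
# systems into coarse performance generations (assumption 2), then sum
# units times a representative rating (assumption 1).  Every count and
# MIPS figure below is invented purely to illustrate the calculation.
generations = [
    # (label, installed_units, representative_MIPS)
    ("8-bit micros",    5_000_000,    0.5),
    ("16-bit micros",  20_000_000,    2.0),
    ("early 32-bit",   50_000_000,   20.0),
    ("pentium-class", 100_000_000,  200.0),
]

def total_mips(gens):
    """Aggregate 'world computing power' as a linear sum over generations."""
    return sum(units * mips for _label, units, mips in gens)
```

One nice property of the two-fold bucketing: even with made-up numbers, the newest generation dominates the sum, so errors in the old-machine counts barely matter.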
Nothing is 4ever dude: M$'s lifespan =) (Score:1)
What can they do to advance/spread the future of computing? What will they do, for better or worse, that will influence how you use your computing power in the next 10 years? How long will M$ last? How will they need to restructure/change to adapt to the future? Keep in mind IBM and their corporate makeover(s), and what it took to change from a market leader to a market staple. It really isn't that difficult to imagine M$ getting pushed off the desktop in the near future; Apple's OSX/OS10 utilizes BSD and NeXTStep, both Intel compatible, and thus code-wise a viable Windows replacement. Linux is making headway, and in the near future will become a viable desktop consumer OS. There also stands in the shadows BeOS, and even alternative chipsets/OSes as well, like Apple/PowerPC and Linux/Alpha, or Linux/PowerPC.
Any comments?
AS
Computer Power (Score:1)
In the dark ages before PC enlightenment -- i.e., machines the masses could afford to own themselves -- the mainframes and minicomputers were handling 40-100 terminals, batch jobs, etc. I remember a professor of mine complaining how the cost of CPU time could be quantitatively valued in DOLLARS per MICROSECOND.
A batch project run in 1980 using a high end 1977 IBM 370 (with all other users locked out) involving about 100 megabytes of data (cross referencing text, etc.) took about three days to run. Interestingly, the IS department who generally operated the machine expected the data run to take TWO WEEKS.
I can compile a 10M C++ code library from scratch on my K6/266-equipped machine, including all of the added error checking required for code compilation, linking, etc., in about 15 minutes. I would suspect that other PC developers have larger projects which have been compiled even faster.
If all of the other mathematical elements were held constant, this would mean that my K6/266 would have taken about 2-1/2 hrs. to process the 1980 text run, roughly 30X faster than the high end 370.
Because I built it (last year), my total cost of components (not counting the monitor, which I already owned) was about $580.
The system 370 cost about $3,000,000.
Funny thing. The K6 is idle about 75% of the time now and will be retired later this year; the 370 wasn't fully retired until around 1991.
My how things change...
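Taking the quoted times and prices at face value, the arithmetic is simply:

```python
# The comparison above, taking the quoted times and prices at face value.
ibm_370_hours = 3 * 24     # the 1980 batch run: about three days
k6_hours = 2.5             # estimated time for the same run on the K6/266

speedup = ibm_370_hours / k6_hours      # how many times faster the K6 is
cost_ratio = 3_000_000 / 580            # 370 price vs. home-built K6 box
```

which works out to roughly a 30-fold speedup at about 1/5000th of the hardware cost.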
Performance vs Usefulness (Score:1)
I think everyone is missing the point (Score:1)
C64 was possibly the best learning computer ever (Score:1)
The manual was all I needed to learn BASIC, and the reference manual (which I got from a friend and photocopied) taught me pretty much everything I needed to learn assembler.
Plus, there were the magazines... Oh man, I remember when I was first learning, someone gave me a C64 mag. It had the BASIC program to let you type in the assembler programs it had inside: long lines of hex numbers to type in, unless you got the version with the disk... but hey, I was a poor kid.
Ahh, those were the days.
i remember when.... (Score:1)
Not gonna happen (Score:1)
I know the feeling. (Score:1)
The Commodore 64 always had the best games. Thanks to this article, I feel really old, but I too wonder about the same stuff. As for Kurt, there's a poster about the growth of the internet that may be of some use to you. I'll see if I can
track down the publishers and get you in touch with them.
Power, more power (Score:1)
I took my PowerPC 601/75MHz (LinuxPPC) with 16M of RAM and hit it with a few million httpd requests. Note that this computer is worth about $50 on the open market. It could saturate more than five T1s and serve out many millions of pages per day.
The only place a Web Server would get bogged down is on CGIs and database accesses. To what extent it gets bogged down is entirely dependent on the CGI or database; the Web serving software is not a significant part of the equation.
CGIs don't tend to be _that_ big of a deal, though. Really the rough spots are around database access. That's the only place where real money should be spent.
Cartman
twerges@hotmail.com
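A quick bandwidth check on that claim (the T1 rate is the standard 1.544 Mbps; the average page size is an assumption):

```python
# Sanity check on saturating five T1s with page traffic.  The T1 rate
# is the standard 1.544 Mbps; the average page size is an assumption.
T1_BITS_PER_SEC = 1_544_000
AVG_PAGE_BYTES = 10_000        # assumed ~10 KB per page, HTML + headers

bits_per_day = 5 * T1_BITS_PER_SEC * 86_400
pages_per_day = bits_per_day / (AVG_PAGE_BYTES * 8)
```

That comes to about eight million pages a day at full saturation, so "many millions of pages per day" is consistent with the bandwidth, provided the CGI/database work doesn't become the bottleneck first, as the post goes on to say.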
M$ 4ever (Score:1)
I would never run my Linux machines on an AMD chip.
Some Perspective... (Score:1)
M$ 4ever (Score:1)
Um. (Score:1)
Try this URL... (Score:1)
http://www.netlib.org/benchmark/top500/top500.l
How Powerful? (Score:1)
Cray performance (Score:1)
M$ 4ever (Score:1)
Long live TRS-80!!!! (Score:1)
Power, more power (Score:1)
---
I've got exactly that running and it works great. I don't get a huge amount of hits (around 18k/day), but half of those are SQL queries on a 250k-entry table. That's the only hit I notice, and those are pretty brief. Other than that it doesn't faze it whatsoever. And stability hasn't been a problem either (I guess I'm another one of the lucky ones): a month of uptime with rc5 and no extra cooling is standard.
supercomputer top 500 (Score:1)
How would the #1 compare to an average PC ?
#1 Intel ASCI Red (9152 processors)
#2 SGI T3E1200 (1084 processors)
#3 SGI T3E900 (1324 processors)
And don't forget those Beowulfen!