
What Applications Will Drive System Performance?

Foredecker asks: "Companies like AMD, Intel, NVIDIA, ATI and others are continuing to drive silicon performance to new levels. Of course, every day computing (basic web browsing, email, word processing, spreadsheets, personal finance, and the like) don't require a Intel 3.2Ghz P4 with Hyperthreading or a AMD Athlon 64 FX and their associated platforms. Of course, there are apps that will leverage today's high performance platforms. Games are an obvious category, as is video editing. I'm looking for apps that will be widely adopted and will drive volume hardware shipments. Things that come to mind are: effective, speaker independent voice recognition, accurate repeatable object recognition in digital photos and videos (or from live feeds such as web cams). What other application categories are there that will drive the need for bigger-faster-better hardware platforms?"
  • by Mr. Darl McBride ( 704524 ) on Friday December 26, 2003 @07:34PM (#7815005)
    Have no fear: High-end systems have a long and brilliant future [princessfixit.com] as penis extensions for spreadsheet-jockeying corporate accountants. These tiny men will continue to demand 3.2GHz CPUs and ATI FireGL 9600s to make Clippy the fastest little bugger possible. (In this case, we can only assume that he wants to give Clippy, his only friend, his own monitor to make Excel that much more of a magical happy friend land.)

    I mean, what the hell? This isn't the first time I've seen this kind of thing. Why are the guys who are hired to pinch pennies in corporate always the ones who aren't happy until their toilets flush with freshly imported springwater?

  • OS Bloat (Score:2, Interesting)

    by PeteyG ( 203921 )
    Anyone seen the system requirements for Longhorn/OSXI/KDE4? They're probably gargantuan.
    • by Glonoinha ( 587375 ) on Friday December 26, 2003 @09:08PM (#7815306) Journal
      Processing power becomes very important at the two extremes of the Hertz chart:
      Things measured in units per second (e.g. frames per second, transactions per second, connections per second) will always benefit from a faster machine.
      Things measured in tens of minutes or more (e.g. an hour or more to process one transaction) will also benefit from a faster box. This would be cryptography, video compression codecs, and physics models.

      When the transaction time is more than 1 minute but less than 10 minutes, you really do not gain anything by increasing the performance of the machine (unless you can increase it to the point where it runs in less than a minute). If compiling your code takes 7 minutes, buying a new computer that is twice as fast still has you compiling for 3.5 minutes. No real difference between the two, really, from a user's perspective.

      When the transactions are measured per second, the difference between 15fps and 30fps is the difference between unusable and usable - particularly when we are talking about first-person shooters. The difference between processing 150 visitors a second and 300 visitors a second is the difference between getting slashdotted and not.

      When the transactions are measured in hours, being able to double the performance makes the difference in whether or not a particular transaction is even possible. Nightly backups are not particularly effective if they take 28 hours to process. Nightly runs of an accounting system ... ditto. Decrypting a message saying that Pearl Harbor is going to get bombed in 12 hours doesn't do us much good if it takes 14 hours to decrypt.

      As long as we have applications that take more than an hour to run, and as long as we are measuring applications in X per second (frames per second in the range of 1 to 100, transactions per second of more than 1,000 and less than 50,000) - we will benefit from having faster computers.
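      A quick Python sketch of the arithmetic above, reusing the parent's own example numbers (a 7-minute compile and a 2x speedup); seconds-per-transaction and transactions-per-second are just reciprocals of each other:

          def time_after_speedup(seconds, speedup):
              # twice the machine speed simply halves the wall-clock time
              return seconds / speedup

          compile_seconds = 7 * 60
          print(time_after_speedup(compile_seconds, 2) / 60)  # 3.5 minutes: still "a few minutes"

          fps = 15
          print(1 / fps, 1 / (2 * fps))  # ~0.067s vs ~0.033s per frame: unusable vs usable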
      • by Glonoinha ( 587375 ) on Friday December 26, 2003 @09:19PM (#7815333) Journal
        Almost forgot.

        Any difference in performance that requires a stopwatch or a special timing demo application to measure - isn't a difference.

        183fps = 200fps in Quake.
        pc3200 RAM = pc2700 RAM = pc3500 RAM
        28fps in UT2003 = 30fps in UT2003
        specINT 93158 = specINT 96452

        Pentium4 3.06GHz = Pentium4 3.2GHz = Pentium4 2.8GHz.

        Until you are talking about performance gains of roughly 300%, or one machine being 3x as fast as another machine - it isn't worth replacing the machine. It would be silly to replace a 486DX2-50 with a 486DX2-66, even though you would get a 30% boost in processor speed. You wouldn't replace a PII/300 with a PII/366 even though you would get a 20% boost in processing speed - they are effectively the same speed and you probably wouldn't notice the difference. Replace that PII/300 with a PIII/900 though, go 3x as fast, and all of a sudden you can see big differences and a major improvement. Same thing with replacing the PIII/900 with a P4/2.8GHz.
        • Until you are talking about performance gains of roughly 300%, or one machine being 3x as fast as another machine - it isn't worth replacing the machine.

          That's only true if you are talking about games. If you're talking about video encoding, or simulation work, or other real number-crunching stuff, then a 50% speed increase might see your run times cut from 24 hours to 16 hours. That may well be worth the money, depending on how often you're performing those runs.

          That said, you appear to be talking specifi
          • It looks like you have been upgrading about every 24 months to 30 months .. maybe 3 years on the far stretch between complete machine upgrades. I am guessing your current system is less than a year old, and suits your current needs real well (I have two 2.4GHz machines, one is only about 2 months old and I am happy with both of them.) Flash forward 24 months from now and Moore's law says that the pimp machines are going to be running 3.2GHz (current fastest machine) times 2 (doubles every 18 months) to th
        • You left out the equation that Intel Centrino1600 = AMD Duron550.
          As for "What Applications Will Drive System Performance?"...

          I'd say 2003 software suites for the latest notebooks requiring
          2005 mobile hardware to run like 2001 desktop systems!

      • Dear Glonoinha,

        You can convert between units per second and seconds per unit using this special formula:

        1 / unitspersecond = secondsperunit

        I hope this helps.
  • Windows? (Score:2, Funny)

    by Anonymous Coward
    It's quite the resource hog.
    • Re:Windows? (Score:3, Funny)

      by Anonymous Coward
      I see you haven't had the opportunity to experience Gnome....
      • by Anonymous Coward
        I can't run Gnome, which is why I mentioned Windows. I do not yet have a 6 GHz Colosseon (AMD chip) system with 4 terabytes of RAM and the 500 GB of HD space required to run Gnome adequately.
  • Visualization (Rendering)
    Distributed computing
    Power to scale (1u capable of what only a current 2u can do)
    Science
    Encryption (Quantum, brute force cracking, etc.)
    Databases (often overlooked)
    Doom 3 ;-)
  • by Anonymous Coward
    Porn will lead us.

    Massive amounts of storage and fantastic amounts of cheap processing power will lead to a generation of smart, capable web spiders capable of autonomously downloading and indexing porn while avoiding banners, advertisements, pop-ups, P2P spam or crap-floods, and duplicates.

    The ubiquitous nature of regular porn will make it tame and uninteresting through sheer availability. This will lead a secret cabal of Japanese scientists (and school girls) to create ultra-porn which will require co
  • .NET and DRM (Score:5, Interesting)

    by Mr. Darl McBride ( 704524 ) on Friday December 26, 2003 @07:44PM (#7815048)
    A lot of the push for faster computers is going to come from applications becoming substantially less efficient. Anyone who's used the Visual Studio .NET GUI for a major project or any other code written with .NET can tell you that it's nowhere near as lean as native Windows code, yet MS is pushing to migrate applications to .NET as it offers the company much more control of the platform.

    Similarly, the applications running in curtained memory are going to stack up at an alarming rate once Longhorn and other platforms start to see pervasive digital rights management. As every bit of data being generated or passed from application to application is being tested against dozens of different filters, CPU time is going to go up in smoke, and it will be illegal to stop these activities from taking place in most countries.

    • This seems to be an excellent point echoed (less succinctly) in some other posts. Some questions come to mind:

      a) How much of an edge are we losing as we evolve software and hardware? Are bloat and inefficiency cancelling out CPU performance increases? Or are we making net gains?

      Thought: We are making net gains, so the worry is really about efficiency. But we still seem to be inside the 80/20 criteria i.e. the extra effort to increase the efficiency of software is offset by the relatively small perfor
        b) The remedy to pervasive DRM seems to be in OSS. Can we get to the point where OSS OSes, word processors, spreadsheets, and other office apps are "good enough", i.e. ubiquitous format and sufficient features to satisfy all but the most esoteric requirements?

        I suspect that the hurdles for OSS to avoid DRM will be legislative, not technical. I seem to remember that one senator already has pushed to enact a law that no computer without DRM be allowed to connect to the internet past some date -- take that wit

  • by skinfitz ( 564041 ) on Friday December 26, 2003 @07:52PM (#7815076) Journal
    Games and pr0n.

    And pr0n games.
  • Pornography, military, and gamers.

    Those are the early adopters of nearly all technology, and they drive the prices to points where normal, financially sane people can afford them. Half of all technology ever invented was driven by one of those three groups.

    For me personally, I use my desktop for lots, and lots of compiling. I'd like my desktop to be more responsive under heavy I/O load. I'd like it to do more things in the background. Personally, I've probably got enough CPU for my desktop. I found fin

    • > Pornography, military, and gamers.

      I wish I knew where this idea came from so I could debunk it properly. The
      military I'll grant; war has always been a driving force of technology. The
      other two I question. Games are driving new technology now, and have been
      for thirty years or so, but historically that's a blip on the radar.

      I would propose a different three things (well, two of them different): war,
      communications, and entertainment. (If you like, you can group porn and games
      under the umbrella of e
      • Oh, some also would say religion. The press, they would say, was developed to
        print the Bible. (This is at least partly true.) They would also point to the
        crusades as a major cause of the development of a lot of technology in Europe.

        I personally disagree with this assessment, though it has some validity.
        However, I believe that the Bible wasn't the *only* thing Gutenberg wanted to
        print. He printed it first because it was the best-known and most-revered book,
        but he wanted to print books in general, not _o
      • Go look at the early adopters of a ton of technology; you'll find gamers, the military, and pornography junkies.

        Lots of early movies were in fact pornographic. A lot of early money in the film industry was made off pornographic movies. I'd cite it, but I learned that on the "History Channel".

        VHS tapes... Know what drove down the prices on VHS players and tapes? Pornography.

        Who were the early adopters of DVDs? Pornographers.

        Who drives high-end video quality on things like HDTV? Uhh, yep, that's them Po

    • For me personally, I use my desktop for lots, and lots of compiling. I'd like my desktop to be more responsive under heavy I/O load. I'd like it to do more things in the background.

      Compiling will certainly soak your I/O channels, which is why ramdisks are so useful. The downside to using a ramdisk to eliminate I/O as a bottleneck is that it puts the CPU front and center as your bottleneck.

  • Re: (Score:2, Insightful)

    Comment removed based on user account deletion
    • --Red Faction lets you blow up walls... And people. Lots and lots of walls. And people. Did I mention walls?

      "It's an 88 Magnum. It shoots through schools."

      http://www.fredcorp.com/vortex/reviews/j/jdangerGR.htm
  • MS Office (Score:2, Funny)

    by stinkyelf ( 558533 )
    subject says it all
  • IDE, Bus speed (Score:4, Insightful)

    by BrookHarty ( 9119 ) on Friday December 26, 2003 @08:17PM (#7815143) Journal
    Anyone else notice how a system will pause when you put in a CD-ROM or format a floppy? This goes away when you have an SMP system. HT-enabled P4s also remove this to some degree.

    I'd like to see some typical performance numbers on these types of activities, things that can "pause" a system for a couple of seconds.

    Loading websites with tons of thumbnails, searching hard drives with/without indexes (search pauses Explorer). Programs that can spike the CPU, use up all the buffer on a device, peg out virtual memory, or freeze programs so you can't switch between them.

    More multitasking benchmarks with responsiveness as the goal. All the benchmarks I see are geared around one app - how fast it can go, not how smooth it can go. This is why everyone is so interested in the Linux kernel 2.4 MM patches or 2.6 low-latency patches: to make the system smooth and responsive. People notice these "lags" in Windows and Linux.
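    A rough sketch, in Python, of the kind of responsiveness benchmark being asked for here: spin a timer loop and record any gap much longer than the requested tick, which is what a system "pause" looks like from inside a process (the tick and threshold values are arbitrary):

        import time

        def measure_stalls(duration_s=10.0, tick_s=0.01, threshold_s=0.1):
            # Sleep in small ticks; any gap far longer than the tick means the
            # process was starved (heavy I/O, CPU spike, paging, etc.).
            stalls = []
            last = time.monotonic()
            end = last + duration_s
            while last < end:
                time.sleep(tick_s)
                now = time.monotonic()
                if now - last > threshold_s:
                    stalls.append(now - last)
                last = now
            return stalls

        stalls = measure_stalls()
        if stalls:
            print(f"{len(stalls)} stalls over 100 ms; worst was {max(stalls) * 1000:.0f} ms")
        else:
            print("no stalls detected")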

    • Re:IDE, Bus speed (Score:4, Informative)

      by Jeremiah Cornelius ( 137 ) on Friday December 26, 2003 @08:23PM (#7815172) Homepage Journal
      This is a function of I/O design more than CPU horsepower. With all the fancy northbridge extensions and DMA and whatnot, the PC is still a kludge architecture. A brilliant, category-changing kludge, but still no great shakes in the I/O dept. when compared against real workstations and mid-range boxes.

      Just be glad you don't use an OS w/ hooks into BIOS routines for peripheral access!

      • We include memory, CPU, sound, and gfx in benchmarks - why not basic I/O?

        It seems like one of the most important areas, yet it gets left out.
        • A big deal w/ I/O is not just speed, but concurrency of operation.

          I can bring a MySQL server to its knees on a PC server, and remote shell to the box is impossible. On an old SPARC Ultra 80, ssh into the box isn't FAST, but it is possible for remote management.

          • On an old SPARC Ultra 80, ssh into the box isn't FAST, but it is possible for remote management.

            I've had similar issues with very slow ssh on old Sun hardware - much slower than similar speed PC hardware, and I suspect that it's the lack of a good random number generator.

            I haven't done any real analysis though, because I don't need to ssh in often enough to care.
      • Many years ago, maybe the mid-nineties, I was using an Alpha system. The CPU was pedestrian by modern standards but the I/O was fast. Already Digital had their fibre channels, but what was really interesting was that they supported a standard PCI bus. I think around 16 of them. Yep, that is 16 separate PCI busses.

        I know this cost a lot of $ in those days, but couldn't this be done a lot cheaper now by having multiple busses?

    • No amount of hardware can make up for lousy software. But they still keep trying.

      Martin Tilsted
    • Re:IDE, Bus speed (Score:3, Interesting)

      by swankypimp ( 542486 )
      Although I'm typing this on a 3 GHz P4, the... delay... when loading a program or accessing a drive still bugs me. Back when I was in college (G3 days, four or five years ago), the Macs in our computer lab impressed the heck out of me with how responsive they were. (Probably because Apple designed the OS for a specific motherboard I/O layout.) Photoshop and other apps were objectively faster on the Intel machines, but psychologically they always seemed speedier on the Mac.
      • There is a difference between latency and bandwidth, and even on a slower machine, having lower UI latency (quicker response of SOMETHING on the screen) can make the program feel faster. Multithreading can help a great deal with this. Try BeOS one day to really feel the effect.
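        A tiny Python illustration of that latency point: pushing the slow work onto a worker thread keeps the foreground loop answering immediately, even though the total work takes just as long (the sleeps are stand-ins for real work):

            import threading
            import time

            def slow_task():
                time.sleep(2)  # stand-in for loading a big file, rendering, etc.
                print("background task finished")

            threading.Thread(target=slow_task, daemon=True).start()
            for i in range(5):  # the "UI" keeps responding while the work runs
                print("still responsive...", i)
                time.sleep(0.5)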
    • Anyone else notice how a system will pause when you put in a CD-ROM or format a floppy?

      What OS are you using? I just formatted a floppy today on a Mandrake 9.1 system and there was no impact to any other processes. Ditto for inserting a CD.
  • You need at least an Athlon XP or a Xeon (P4EE) to keep all your httpd processes running when you get Slashdotted for posting that your computer can survive a Slashdotting.
  • by cabalamat2 ( 227849 ) on Friday December 26, 2003 @08:21PM (#7815159) Homepage Journal
    Animated 3-dimensional paperclips, obviously.
  • Amateur filmmaking (Score:5, Interesting)

    by GuyMannDude ( 574364 ) on Friday December 26, 2003 @08:29PM (#7815190) Journal

    Right now CGI is still expensive enough that most independents and hobbyists don't include it in their films. Affordable, rapid CGI could be a possible killer app for high-performance hardware. Currently, professional moviemakers must agonize over the creation of any CGI effect. It's a tedious process that involves wire-frame animation, rendering and so on. If this process could be sped up and simplified, it might encourage more widespread adoption of CGI effects among hobbyists, giving them the ability to make movies they never could have before with their limited budgets.

    Imagine being able to 'direct' a VR character almost as easily as you would direct a real-life actor. When the technology gets to that level, we could see an explosion of new movies by people outside the Hollywood cookie cutter. Filmmakers with radically new ideas who are too young to have developed a 'rep' in Hollywood could be creating some very professional-looking films. Think of it this way: right now there are lots of people who write fanfics of their favorite movies or TV shows. But actually creating an episode of Star Trek, for example, is just not possible right now. With improved technology, perhaps these creative individuals might very well be able to make their own episodes, largely using CGI. Imagine taking a sci-fi movie that you like for the most part but hate the ending of. You load your CGI software with images of the main characters and battleships from the DVD and create CGI models of them. Now you can create a new ending that's more to your liking. Better yet, you can burn the new version of the movie with your ending (forget the "Director's Cut", this is "Mike's Cut") onto DVD and trade with your friends.

    Right now we are all still pretty much at the mercy of Hollywood to make films that we like. Very soon, the balance of power will shift and creative individuals who have lots of ideas but budgets nowhere near those of studios will be able to create some very impressive looking films. And then Hollywood will have to get their ass in gear and show us something that we couldn't do ourselves in our own living room.

    GMD

    P.S.: Several people have mentioned that pornography has historically been a big driver of technology. Can you imagine the boom that the adult market will get when people can make their own adult films using CGI characters? Think plots of porn flicks are stupid? Wish for something better? Hell, just load your CGI software with images of Jenna Jameson and make your own film with her as the star.

    • by Sheepy ( 78169 )

      GuyMannDude: But actually creating an episode of Star Trek, for example, is just not possible right now.

      You might want to reconsider that ;-)

      Fan-Made Star Trek Episode Available for Download [slashdot.org]

      Starship Exeter [mac.com]

      'Star Trek' reborn in online episode [twincities.com]

      The fresh episode is a digital product of a personal-computing revolution that has allowed amateur moviemakers to duplicate once-pricey television- and movie-production techniques on shoestring budgets.

      But classic "Star Trek" special effects added an eerie ai

    • by crisco ( 4669 ) on Friday December 26, 2003 @09:29PM (#7815361) Homepage
      The bottleneck in CGI is human time, not computer time. It takes much longer to create realistic effects and CGI than it does to render them. Simplification (and therefore a reduction in CGI costs) is going to come through software improvements, some of which will take additional CPU and most of which will involve long hours of tedious programming and tons of IP locked up by the producers of high-end 3D software.
    • Right now we are all still pretty much at the mercy of Hollywood to make films that we like. Very soon, the balance of power will shift and creative individuals who have lots of ideas but budgets nowhere near those of studios will be able to create some very impressive looking films. And then Hollywood will have to get their ass in gear and show us something that we couldn't do ourselves in our own living room.

      While I won't defend the "quality" of most of Hollywood's mainstream products, I must say you ha
  • Lazy coding (Score:3, Insightful)

    by jbrandon ( 603700 ) on Friday December 26, 2003 @08:31PM (#7815198)
    What applications will drive the market? The same ones we have now, but written in python using bubblesort.

    Well, maybe not bubblesort. But non-optimal algorithms, slow languages, and copying memory instead of passing a reference/pointer will increase.

    That's not even a totally awful thing, because it means that the performance of already efficient applications/languages/coders goes up as well. It also means that it's easier to start writing code without mastering TAOCP first.

    So encourage your friends to write game engines in perl.
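    For scale, a rough Python timing of the joke above - a hand-rolled bubble sort against the built-in sort on the same 5,000 random numbers (exact timings will vary by machine):

        import random
        import time

        def bubble_sort(items):
            # textbook O(n^2) bubble sort - the kind of code being joked about
            a = list(items)
            n = len(a)
            for i in range(n):
                for j in range(n - 1 - i):
                    if a[j] > a[j + 1]:
                        a[j], a[j + 1] = a[j + 1], a[j]
            return a

        data = [random.random() for _ in range(5000)]

        t0 = time.perf_counter()
        bubble_sort(data)
        t1 = time.perf_counter()
        sorted(data)  # C-implemented O(n log n) sort
        t2 = time.perf_counter()

        print(f"bubble sort: {t1 - t0:.3f}s   built-in sorted(): {t2 - t1:.4f}s")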
    • How about the extension of lazy coding, coding by automated consultant or AI coding?
      Theoretically, code gets bigger as the interface gets easier (practically, it seems to just get bigger no matter the interface). As the coding interface is progressively automated, the code can theoretically get progressively larger.

      Vision: in 10 years Google has given rise to Hackle, where you can write a simple request for any program and have it ready to download in 5 seconds. But the result is Huge.

      Yow - I've got to
    • This seems like a fine time to mention Slow Sort [c2.com], which makes bubble sort look downright efficient and clever. From this paper [nec.com] (PDF [nec.com]).

      (The paper is fun if you know computer science, the c2.com link is in "normal English". Try the paper if you think you'd enjoy it; it has a dry wit and pursues its task of sorting as slowly as possible with great gusto.)
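      If memory serves, the "multiply and surrender" slowsort from the linked paper looks roughly like this in Python (treat it as a sketch, not a faithful transcription):

          def slowsort(a, i=0, j=None):
              # Recursively find the maximum of each half as slowly as possible,
              # place it at the end, then "sort" everything except that element.
              if j is None:
                  j = len(a) - 1
              if i >= j:
                  return
              m = (i + j) // 2
              slowsort(a, i, m)
              slowsort(a, m + 1, j)
              if a[m] > a[j]:
                  a[m], a[j] = a[j], a[m]
              slowsort(a, i, j - 1)

          nums = [5, 1, 4, 2, 3]
          slowsort(nums)
          print(nums)  # [1, 2, 3, 4, 5] -- eventually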
    • There are many apps that people love to use today that simply couldn't be done in assembly language, without increasing time and cost significantly.

      So, we wind up with, say, an app using Swing running on AWT, running on Java in a JVM running on Windows, running on a bunch of DLLs running in a C++ runtime (compiled to assembly), on a BIOS, on a RISC x86 emulator, which runs on microcode eventually, all so we can have an IM app with tons of features.

      Now, could somebody implement all that in straight assembl
  • Movies. That's the ticket. iPods that hold 4000 movies/TV shows, and desktop machines that handle them with the same ease that today's machines handle MP3's.
  • Software synthesis... [virus.info]

    I can think of a few others, but this one is the most fun.
  • Video editing is the most likely application to drive the mass adoption of 64-bit platforms.

    If Apple has any brains they would be busy porting their multimedia software to run on a full 64-bit version of OS X.

    Another application that could always use more performance is finite element analysis - but that's more of a niche than video editing.

  • Business Perspective (Score:2, Interesting)

    by saden1 ( 581102 )
    In my line of work, Image, PDF, SGML and XML processing in our production environment takes way too much time on a single machine, so we spread the task of processing these files between several machines using RPC. The bottleneck, however, is no longer with the CPU but rather with I/O read/write operations. Of course, multi-core CPUs would reduce our overall cost of having to purchase multiple machines, but unless the I/O bottleneck is removed we really don't care about improvements in CPU speed.
  • ... such as 3D modeling, DNA analysis, protein analysis, or anything that has to do with biomedical engineering.

    I know that the leasing package that I work on would gladly take advantage of this. Not just our end of period stuff, but also our interactive stuff. In fact this new hardware is making for great servers these days and replacing much of the old HP / Sun servers.

  • Spam,
    W*nd*ws,
    Home entertainment platform format PCs,
    De-fragging 500 GB HDDs,
    Decision-making s/w for window lickers,
    Cybersex suits (plenty of scope for worms and viruses there),
    3-D holographic projection displays,
    Biofeedback or retina-controlled MMI (Man Machine Interface),
    Replicator units,
    /.-proofing websites
  • Application startup (Score:3, Informative)

    by _iris ( 92554 ) on Friday December 26, 2003 @09:13PM (#7815313) Homepage
    Most applications have a good chunk of code they execute on startup (generating lookup tables, building graphs, etc). As processors get faster, applications will slowly move toward performing these functions in an on-demand manner. This will decrease application startup time, which seems to be what most MS Office users and Internet Explorer users want to eliminate when they say "My computer is _so_ slow."
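    A minimal Python sketch of that on-demand approach; the class name and the million-entry table are purely illustrative:

        import functools

        class OfficeApp:
            # Illustrative only: defer an expensive startup step until first use.

            @functools.cached_property  # Python 3.8+
            def lookup_table(self):
                # built on first access instead of at program launch
                return {n: n * n for n in range(1_000_000)}

        app = OfficeApp()            # constructing the app is instant; no table yet
        print(app.lookup_table[42])  # the table gets built here, on demand -> 1764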
  • As the amount of data available to individuals increases, visualizations that allow people to actually understand what is going on (and the number crunching to get those visualizations) will drive - and already are driving - system performance.

    In the pharmaceutical world, for most scientists, there isn't nearly enough computing power for what they would like to do on a daily basis. Grid computing is making major inroads because of that. Still, day-to-day work could make better use of things, if it didn't take 1hr to get the picture
  • by Babbster ( 107076 ) <aaronbabb@NoSPaM.gmail.com> on Friday December 26, 2003 @09:40PM (#7815384) Homepage
    I don't see important new applications for the desktop causing people to care about ever-increasing processor power again - outside of the examples cited in the question, of course. What I do see as being important is smaller, cheaper and more powerful devices of other stripes.

    A good example would be Palm-type devices. As big-processor speed increases, there is also an increase in small-processor speed and efficiency (limited more by heat than anything else). This has given people a smaller, more powerful Palm-type device today than they could have bought five years ago. Another example is the DVR/PVR. The new two-tuner satellite HDTV receiver/recorders can handle the receipt and recording of two high-definition streams while decoding and playing back a third - my ancient Showstopper (ReplayTV), on the other hand, starts to chug when encoding/saving an NTSC transmission (at highest quality) while watching another (I paid $700 for my 20-GB Showstopper back "in the day" while the new 250-GB HDTV units will go for $1,000 and come down from there).

    It will be interesting to see how long non-PC devices take to catch up to current top PC speeds and what applications (especially portable, non-notebook apps) spring from that.

  • by UserChrisCanter4 ( 464072 ) on Friday December 26, 2003 @09:46PM (#7815390)
    Language translations.

    I recall seeing an algorithm that partially ignored traditional dictionary-type translations and relied more on a relational database. For example, rather than work word by word through a given sentence, it attempted to relate that sentence to other sentences and solve it that way. If it sounds confusing, it's mostly because I read about it quite a while back, and really can't recall most of the details. A sentence such as "Comment allez-vous?" would literally translate as "how are you going", or something to that effect (allez is the second-person plural of 'to go' in French), but is obviously more colloquially translated as "How's it going?" Rather than concern itself with the meanings of the individual words, this algorithm would know the meaning of that phrase and use it as a sort of guideline for how an unknown phrase would translate. And I'm sure doing that properly, in real time, with no errors would require a ludicrous amount of processor power and be ridiculously useful. Go ahead and couple that with the above-mentioned truly accurate voice recognition and you've got the legitimate workings of a device most would consider to be science fiction.
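    A toy Python sketch of that phrase-first idea (the tables and the whole approach here are hypothetical simplifications; a real system scoring millions of candidate phrase matches is where the processor power would go):

        # whole phrases map to idiomatic translations; fall back to word-by-word
        phrase_table = {"comment allez-vous": "how's it going?"}
        word_table = {"comment": "how", "allez": "go", "vous": "you"}

        def translate(sentence):
            key = sentence.lower().strip("?!. ")
            if key in phrase_table:  # a phrase-level match wins outright
                return phrase_table[key]
            return " ".join(word_table.get(w, w) for w in key.split())  # naive word-by-word fallback

        print(translate("Comment allez-vous?"))  # how's it going?
        print(translate("Allez vous?"))          # go you  (literal fallback)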

  • I can already peg a processor for several hours running reports off of about 1/3 of the databases we have in my current company. If I wanted to process the PeopleSoft databases, well, I could run it all weekend.
  • There will be no grammar ...

    Foredecker : "Of course, every day computing don't require a Intel 3.2Ghz P4 with Hyperthreading."

    "Computing" is an abstract singular noun. "Don't require" is a contraction of "do not require", and "do require" is a plural verb. I assume you were going for "doesn't."

  • What's driving faster CPUs? Distributed.net [slashdot.org] statsmongers!

    The G4/G5 is the best chip for RC5, by the way. :)

  • Gentoo.

    (or any from-source system)

    I find applications still need a lot more speed too. And it's not just inefficient coding I think. Detecting and removing most of the spam from my mailbox takes my mail filters about 2 minutes a day, seemingly regardless of the mail client I use.

    Virus scanners (and possibly other security methods) are still a big slow down. And I can't see things improving on that front.

    - Muggins the Mad
    • For Linux at least, I think binary packages are on their way out. Too much hassle trying to manage dependencies across multiple versions. Everything will be compiled from source when you install it. I think many more applications will be written in Java as well.
  • This is the only real reason that I need more power than what I have (P4 2.4 GHz, 1 GB RAM).

    PC Virtual machine technology has completely changed the way that I design systems.

    More power? Bring it on!
  • Crazy new next-generation desktop environments will require a hefty speed boost to keep things snappy - for example, the 3D Java desktop they're trying to create, with lots of graphical bells and whistles.
  • Take a hint. Voice recognition has been vaporware since the 60s. Every futuristic movie shows a pretty lady talking to her house, turning on lights and making coffee. We've got way too much CPU power in our houses for that, so where's voice recognition?

    The truth is, we don't need it. As impressive as voice recognition sounds, its application is limited to archival and military use.

    Next you'll say NVIDIA and AMD will design their systems around Duke Nukem Forever.
    • The truth is, we don't need it. As impressive as voice recognition sounds, its application is limited to archival and military use.

      I disagree. Though I haven't used it recently, back in the days of MacOS 9, I used Apple's speakable items quite often.

      Being able to lay in bed and ask the computer what time it is, if I have new emails, who's currently online, etc. is useful. (being able to control things with X10 even more so)

      Retail would be another application where voice recognition may be useful. Cu

  • Better, as in faster, cluster computing for one. Maybe we can get accurate weather predictions in the Pacific Northwest. Say, 99% chance of rain instead of 100%.
  • SETI@Home stats, that's how I choose my new PCs ;)
  • A certain portion of mathematical research will require computers. The algorithms are hard, run for very long times, and use amazing amounts of memory. When computers are used as research tools to discover counterexamples or to evaluate cases in proofs, future mathematicians will use all the horsepower we can generate. Making a computer a useful tool for mathematical research is still an open problem.
  • SETI and consumer stupidity (I know it's not an app). When I used to do tech support for a PC manufacturer, most people would ask "Hey, I want the biggest, fastest PC that you have... What is it?" I like how the poster has mentioned that we don't need a 3.2 GHz machine to run Word, check mail, or view sites, but that's what most consumers do.

    I wonder what would happen if chip makers stopped making faster chips. How long would it take for most apps to catch up and for people to notice?
  • As our population ages, there's going to be a need for technology that lies somewhere between a human home healthcare aide and "the clapper" / Roomba / "I've fallen and can't get up" panic buttons.
  • I want an app where you can write a screenplay and choose from a list of predetermined actions and actors, or create your own, and have the whole thing rendered out in at least pretty good CG. The actors and actresses could be thinly veiled versions of famous actors.

    You'd choose from drop-downs:

    Scene: Exterior; City Street; Bad neighborhood; Crowded; Night; Drizzling rain

    Actors: #1: 40s Italian American wise guy; #2: 30s African American smart aleck (here, you'd flip through characters and choose associa
  • Software Defined Radios (SDR) have a huge appetite for processor cycles. I use radio in the general sense, to include audio, visual and data transmission. Imagine plugging an antenna or cable into an ADC (analog-to-digital converter) board on your PC. You could then run applications to listen to AM, FM or digital radio, watch analog or digital television, get the current time from GPS, or receive IP packets from a wireless LAN or microwave link. Almost anything you now do with dedicated hardware could be repla
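    A bare-bones numpy sketch of the idea - one AM "station" demodulated entirely in software (the sample rate, carrier frequency and filtering choices are arbitrary):

        import numpy as np

        fs = 48_000                               # sample rate in Hz (assumed)
        t = np.arange(0, 0.1, 1 / fs)
        audio = np.sin(2 * np.pi * 440 * t)       # the "program": a 440 Hz tone
        carrier = np.cos(2 * np.pi * 10_000 * t)  # 10 kHz carrier
        rf = (1 + 0.5 * audio) * carrier          # AM samples, as if read from an ADC

        # Envelope detection in software: rectify, then smooth away the carrier ripple.
        envelope = np.convolve(np.abs(rf), np.ones(25) / 25, mode="same")
        recovered = envelope - envelope.mean()    # roughly proportional to the original audio

        print("correlation with the original audio:",
              round(float(np.corrcoef(audio, recovered)[0, 1]), 3))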
  • Soon enough, filtering spam with 100% accuracy is going to rival emulation of the human brain.

    It will be the Turing test of the 21st century.

  • So that I can get my Maya/Rhino/StudioTools renderings done ASAP. If I can shave but an hour off of a 35-hour rendering, it's a blessing.

    There will never, NEVER be enough CPU horsepower for raytracing. I will always want (today's high end)^2 power in my boxes. And strangely enough, I don't see my render times coming down at all. (Mmmmmmm... soft shadow precision. Mmmmmm... higher-poly radiosity meshes. Mmmmmm... photons. Etc. etc.)

  • CFD (Score:3, Interesting)

    by vogon jeltz ( 257131 ) on Saturday December 27, 2003 @06:40AM (#7816629)
    To answer the question, "What Applications Will Drive System Performance?": Scientists and engineers need *raw* computing power. They need lots of RAM (and then even more), and they need GHz, lots of them. They need fast GPUs. Consider Computational Fluid Dynamics. Most problems today are calculated on off-the-shelf PC hardware; I know this for a fact. Today, you don't buy Indys or Sparcs anymore - they're much too expensive bang-for-buck-wise. They're all but dead. People use dual Athlons with 3 gigs of RAM to run their jobs, mostly one job per CPU, because parallel processing in most cases is not there yet (solving the Navier-Stokes equations in parallel is still a major PITA).
    So in short, those people will always buy the fastest PC stuff available, because for them it makes a huge difference whether a solution converges in two or in four days.
  • For me, the performance of a particular emulator was the key issue when I last replaced my processor. I know a lot of people running Gentoo and Slackware, though, and for them, compilation times are a key issue.
  • At my company, we do a lot of engineering design with SolidWorks and Pro/E. For our needs, we don't need more raw power. Some of the engineers have graphics cards in the multi-kilobuck range, so an increase in power/$ would mean a lot to us. We also do a lot of science apps that involve analysis (= the science version of Photoshop) of TIFF files that can be 200-500 MB each. I can certainly envision needing to work with 50+ of these files at a time, say do a diff between file 1 vs. files 2-50, and plot out the
  • You may have heard that these days, banks are no longer able to fly blind. They need to know the risk exposure of their financial positions in near real time. This gobbles CPU - particularly because some of it is coded in Excel or VBA!
