
Energy Efficient Graphics Processors?

An anonymous reader asks: "The trend in graphics hardware these days seems to be to draw more power and create more heat in order to get faster processors and push more polygons. Yet in the CPU arena, chips like the Via C3 and Epia, Transmeta Crusoe and Astro, Intel Pentium M, and IBM/Motorola PowerPC (G3-G5) seem to favor more performance per watt and cooler running without significant performance loss. Is this just down to the nature of the CPU versus the GPU? I understand a GPU die is almost entirely reserved for calculation, while on a CPU only 20% or so is. Or are the graphics chip makers merely refusing to innovate and take routes that would rein in out-of-control energy consumption because of the race for more polygons? What kind of architectural changes could be implemented to alleviate graphics card power gluttony?"
  • Unfair comparison (Score:4, Interesting)

    by complete loony ( 663508 ) <Jeremy@Lakeman.gmail@com> on Tuesday May 04, 2004 @12:32AM (#9048426)
    The latest Pentiums are power-hungry hogs too; if you want the latest and greatest, it's going to be less efficient than it could be. Low power consumption, heatsink size and fan noise are less of a design constraint than the raw power of the chip.
  • Heatsinks for GPUs (Score:2, Interesting)

    by Jmechy ( 656973 )
    Sure, we can put a full foot of copper on our CPUs, but everybody screams and moans when Nvidia builds a cooler that takes up the adjoining PCI slot. Graphics cards are limited in the space they can take up for their cooling solutions, and they certainly pay for that in heat.
    • by gumbi west ( 610122 ) on Tuesday May 04, 2004 @01:32AM (#9048666) Journal
      Um, your argument makes no sense.

      Theoretically, the more physical cooling you can give a chip, the more energy it can suck up (i.e. the more heat it can dissipate). If anything, having less room for cooling should force energy efficiency (so that they don't have to dissipate as much heat).
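
      To put rough numbers on that (a back-of-the-envelope sketch in Python; the thermal resistances and temperatures below are assumed values for illustration, not any particular cooler's spec):

      # The cooler's junction-to-ambient thermal resistance caps how much power
      # a chip can dissipate before it exceeds its temperature limit.
      T_JUNCTION_MAX = 100.0  # deg C, assumed silicon limit
      T_AMBIENT = 40.0        # deg C, assumed in-case air temperature

      def max_power(theta_ja):
          """Maximum dissipatable power (W) for a given thermal resistance (deg C/W)."""
          return (T_JUNCTION_MAX - T_AMBIENT) / theta_ja

      print(max_power(0.3))  # roomy tower-style CPU cooler  -> 200 W of headroom
      print(max_power(1.5))  # cramped single-slot GPU cooler -> 40 W of headroom

      Less room for cooling means a smaller power budget, which is exactly the pressure toward efficiency the parent describes.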

      • But in the real world the case is a bit different. Remember that GlobalWin CAK38 with the Delta 6800 RPM fan? It was nearly as loud as a helicopter. CPU coolers are starting to get quieter, with the help of wider heatsinks and wider fans (80-120 mm instead of 60 mm), but video cards are getting louder and louder because they are getting hotter and hotter.

        Sadly, it is easier and cheaper to throw more raw power at the problem instead of designing more efficiently.
    • by Myself ( 57572 ) on Tuesday May 04, 2004 @02:48AM (#9048951) Journal
      One of the problems is that PCI and AGP boards are "upside down" compared to ISA boards. Think component-side versus solder-side. In the case of ISA and PCI boards, it's important not to exceed a certain width because of adjacent slots, but since the AGP slot is always the first one, an AGP board could extend pretty far in the other direction.

      Why don't they simply mount the GPU to the other side of the board to allow a much larger heatsink? I think this is either a design tradition or a limitation of the pick-and-place assembly machines, because there's no technical reason not to. I suppose if taken to an extreme, it could lead to physical fit problems in certain cases, but let's not go that far.
      • > GPU to the other side of the board to allow a much larger heatsink?

        This will also help with the watercooling hose layout. It's pretty hard to hook up a waterblock between two cards and keep the hoses from collapsing.
      • by shepd ( 155729 )
        >Why don't they simply mount the GPU to the other side of the board to allow a much larger heatsink?

        Filter capacitors. I've often found them mounted just behind the video card. A combination of a heatsink on the wrong side, along with these, would be a big problem, even if they didn't touch (most capacitors don't appreciate heat).
      • by Anonymous Coward
        You don't need to take it to an extreme. The fact of the matter is it WON'T fit in many cases. There's stuff above the AGP card, depending on the motherboard and case layout. Not every case is ATX, you know, and even some ATX layouts don't have room for it. It's pretty obvious it would be stupid for Nvidia or anyone else to design a graphics card that only 10% of users could fit in their case. You can still fit the newer GeForce cards in cases so long as you don't have a million PCI cards and it's not a small form factor
        • That never stopped VLB! Or for that matter, long ISA cards. The original carved-from-stone PC/XT case won't let a full-length ISA card mount in the rightmost slot. Many 8-bit ISA cards from before the 16-bit slot was common actually have an overhang that prevents them from fitting in 16-bit slots. There are also "tall" ISA boards that require a certain amount of clearance above the top of the slot bracket, preventing their use in some cases.

          Yes, portable and compact cases present an additional challenge. M
  • GFFX 5200 (Score:4, Interesting)

    by Hythlodaeus ( 411441 ) on Tuesday May 04, 2004 @12:47AM (#9048489)
    Most versions of the Geforce FX 5200 (non-Ultra) run fanless, which should speak to its relative energy efficiency. It also runs about as fast as a Geforce 3, unfortunately.
    • Say what? My Chaintech FX5200 non-Ultra came with a fucking whining little fan, which drove me mad. I installed a regular heatsink and I could have fried an egg on it. So I took the fucking AMD boxed cooler from an Athlon 1700 and bolted it to the mounting holes in the board. The fan is running at 5 VDC instead of 12 V, so it spins slowly and it's just warm to the touch. I have now lost 3 valuable PCI slots because of the size of the fan. I don't know how I'm gonna fit the SCSI controller, the Audigy2 and the
  • G5? (Score:3, Insightful)

    by gumbi west ( 610122 ) on Tuesday May 04, 2004 @12:49AM (#9048495) Journal
    Since when is the G5 efficient? It is as hot as any of the others. The G4 is a cool processor, but it runs at speeds like 1 GHz. Not that it doesn't make an awesome laptop (I've never had a complaint about mine) but it isn't exactly a model of efficiency.
    • Re:G5? (Score:5, Informative)

      by bhima ( 46039 ) <(Bhima.Pandava) (at) (gmail.com)> on Tuesday May 04, 2004 @01:51AM (#9048749) Journal
      The IBM PPC 970fx draws 24.5 Watts at 2 GHz.

      The Opteron "HE" is classed at 55 watts (I suppose that is at 2.0 GHz or so)

      The P4 extremely expensive edition dissipates 103 watts at 3.4 GHz.

      So in comparison to other desktop processors it does fairly well. There are also efficient G4-class processors coming from Motorola; the MPC7447 is said to dissipate 10 watts at 1 GHz.

      I am not comparing any of these processors GHz for GHz, because we all know that is not an accurate method of comparison. But I think it is wrong to classify the MPC7447 as a desktop processor, or even as a processor for a desktop-replacement type of laptop. But then again maybe it's because after using OS X on a G5 I'll never take Motorola seriously (for the desktop) again. That's not saying Motorola is a bad company or their chips are bad! I develop almost exclusively on them at work, but then again I am an embedded developer.

      • by renoX ( 11677 )
        I think that SpecInt would be more interesting as a basis for comparison than GHz.

        - 3.2 GHz Pentium 4 Extreme Edition: 1570 Base SpecInt (couldn't find the 3.4 GHz version, sorry).

        - AMD Opteron (TM) Model 146 HE: 1289 Base SpecInt.

        I couldn't find SpecInt figures for the PPC970.

        Anyway, the P4 is more than 20% more powerful than the Opteron (as 20% is the figure for the 3.2 GHz P4EE) and consumes 87% more power.
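
        To sanity-check those percentages (a quick sketch using only the figures quoted above; note the SpecInt score is for the 3.2 GHz P4EE while the 103 W figure is for the 3.4 GHz part, so it's approximate):

        # Rough performance-per-watt comparison from the numbers in this thread.
        p4ee = {"specint": 1570, "watts": 103}    # SpecInt figure is the 3.2 GHz part
        opteron = {"specint": 1289, "watts": 55}  # Opteron 146 HE

        print(p4ee["specint"] / opteron["specint"])  # ~1.22 -> about 22% faster
        print(p4ee["watts"] / opteron["watts"])      # ~1.87 -> about 87% more power

        for name, chip in (("P4EE", p4ee), ("Opteron 146 HE", opteron)):
            print(name, round(chip["specint"] / chip["watts"], 1), "SpecInt per watt")
        # P4EE ~15.2 SpecInt/W, Opteron 146 HE ~23.4 SpecInt/W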

        But there is a problem with these figures: the performance/power ratio is not constant: it is easy t
        • by bhima ( 46039 )
          So some sort of (SpecInt/watts)/dollars metric, then?

          Why do I feel the need to throw the natural log of available compilers into this equation? ;)

      • by Alex ( 342 )
        But then again maybe it's because after using OS X on a G5 I'll never take Motorola seriously (for the desktop) again. That's not saying Motorola is a bad company or their chips are bad! I develop almost exclusively on them at work, but then again I am an embedded developer.

        The G5 is from IBM

        Alex
        • by bhima ( 46039 )
          Yes, I think we all know this...

          What's your point?

          • Well I'm still unclear on what you are trying to say about the Motorola chips and why.

            Are they really that terrible compared to the G5? Any details?
    • Re:G5? (Score:3, Interesting)

      by MarcQuadra ( 129430 ) *
      The G5 is VERY efficient, using about half the juice of a similarly powerful P4. The problem is one of perception: it's a lot hotter than any PREVIOUS Apple CPU offering, and Apple case design tends to aim for more heatsink and bigger fans rather than small, loud HSF combos. This leads to the idea that the G5 is a monster power draw when it is actually quite benign.

      It's just like when Mac users complained about the 'hot' G4 PowerBook; it wasn't much different from high-end P4 laptops of its day, but Mac users expected cooler ma
  • by photon317 ( 208409 ) on Tuesday May 04, 2004 @01:07AM (#9048578)

    You've compared high-end 3D desktop gamer cards, which are excessive on heat and power, to CPUs that are designed for low-power, low-heat situations. The difference isn't nearly as pronounced with a more valid comparison on the CPU side, say a high-end P4EE or Athlon 64/Opteron. Also, as you've stated, GPUs are almost entirely dedicated to high-power processing, whereas CPUs spend a lot of their silicon on other things. A high-end GPU is generally superior to a high-end CPU in terms of raw computing power; therefore it needs more power and makes more heat. If you forced Intel or AMD to build a CPU for you right now that had the raw compute potential of the latest high-end cards, they'd have a hard time doing so without it being just as hot and power hungry. All these things scale over time, but the demands of the user and his software scale up to meet them as well.
    • To further the parent's point, high-end CPUs like the Athlon 64 and the P4EE are extremely under-used in most scenarios. There is enough processing power for today's OSes and apps. This is why Intel and AMD can look at power-saving CPUs: there's enough headroom to play around, and nobody will really notice if a chip is a bit slower for the sake of making it quiet and conservative.

      But games can't get enough GPU yet, so ATI and nVidia have to keep pushing, and the demand is there in the market to keep maki
  • There are a number of reasons for this.

    The relatively frequent introduction of new chips doesn't leave much time to push for optimisations at the hardware level to make more efficient chips.

    Game makers push the bleeding edge without performing the optimisations they were forced to make in the past. Gamers want cards that play the games full out.

    "More power" and "bigger is better" are phrases that need to be thrown out of the public's mindset.

  • by wonkavader ( 605434 ) on Tuesday May 04, 2004 @01:17AM (#9048615)
    As gamer laptops get more popular, shouldn't we see new lower power GPUs with comparable muscle to the previous rev?

    What's the power consumption like on a GPU that isn't doing much? Do they sleep like some CPUs can, or are they always going at full bore?

  • 3D for laptops... (Score:4, Informative)

    by Kris_J ( 10111 ) * on Tuesday May 04, 2004 @01:25AM (#9048634) Homepage Journal
    A few seconds on Google and I found nVidia's mobile offering. A few more seconds and I found this [nvidia.com]. Undoubtedly ATI has something similar.
  • Modern games usually don't need super-fast CPUs; even 2 GHz is just fine for most everything. Past that point, bus speed and GPU processing rate are your bottlenecks. So there's less demand for super-fast, super-hot main processors among the gaming crowd, who'd rather spend more on keeping their video card current.
  • Apples to Oranges?? (Score:2, Informative)

    by dFaust ( 546790 )
    OK, a lot of people seem to be commenting on the comparison used here, saying the poster is comparing high-end GPUs to low-end CPUs. For one, the poster doesn't specifically target high-end GPUs... though we'll make the assumption that's what he's talking about. People have said that the CPUs are all low-end, and/or that the G5 doesn't have low power consumption...

    I introduce this document as reference. [ibm.com]

    According to this, a PowerPC 970FX (the G5s being used in Apple's Xserves and the chip that will be

    • I think you are near the real reason behind this trend. IBM is building a processor that will work in a slimline rack-mount server or a blade server, which are currently the preferred server form factors. Imagine two full racks like this one, one with the PPC970FX and one with the P4EE. I would suspect that the Intel version would be challenging to cool adequately, while it wouldn't be that difficult with the PPC version.

      Moving on to video cards, both ATI's and NVIDIA's top-line products target consumers that are wi

      • Or, in other words, CPUs are a critical component of computer systems in a variety of markets, from embedded solutions to servers and desktop PCs.

        GPUs on the other hand are only a critical component in a small number of situations, like a gamer's desktop PC or a CAD workstation. Nobody cares about the GPU in a server or embedded system, which is why you'll find ancient S3 GPUs and similar in these types of systems.

        So in a sense there are already power-efficient GPUs for systems that require them: the G
  • "Or are the graphics chip makers merely refusing to innovate and take routes that would reign in out of control energy consumption because of the race for more polygons"

    They're refusing to take routes that would rein in energy consumption, specifically lowering clock speeds. GPU design has moved in leaps and bounds. They've been making architectural changes to increase performance without sucking down more energy, like more pipelines, wider memory buses, and so on...
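
    The usual rule of thumb for CMOS is that dynamic power scales roughly with capacitance x voltage^2 x frequency, which is why clock and voltage are such effective (and unpopular) levers. A quick sketch with assumed scaling factors, just for illustration:

    # Rough CMOS dynamic-power rule of thumb: P ~ C * V^2 * f.
    # The 20% clock drop and 10% voltage drop below are assumptions for illustration.
    def relative_power(freq_scale, voltage_scale):
        return voltage_scale ** 2 * freq_scale

    print(relative_power(0.8, 0.9))  # ~0.65 -> roughly 35% less dynamic power

    The catch, of course, is that nobody selling a flagship card wants to give up that 20% of clock.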

    The biggest reason heat ou

  • Display Quality Too (Score:4, Interesting)

    by turgid ( 580780 ) on Tuesday May 04, 2004 @05:09AM (#9049349) Journal
    I'm still using the TNT2 Ultra that I bought in the Christmas sales in 1999 in my Athlon machine. The display quality is superb, but it's really long in the tooth. I bought it for playing Quake 3 on Linux (I don't have Windows); nowadays I don't do much gaming, but I'd like to upgrade and migrate this excellent card into one of my old machines.

    I tried a no-name GeForce 4 MX440 a couple of years back, but the display quality was awful. It was so poor I had to downgrade to 1280x1024 on my 19" Trinitron screen. After a few months the card broke and I went back to my TNT2 Ultra (Creative Labs) and back up to 1600x1200.

    I was thinking about getting something fanless from nVidia, since their (binary-only) drivers are superb on Linux (I don't do the ideological zealotry as much nowadays).

    High-performance 3D is nice when you need it, but nowadays hardware under $100 is so powerful that there's not much point in buying something really expensive. Some of us just want a crisp, high-resolution, flicker-free (70 Hz+) display without a great big noisy fan.

    • by Naffer ( 720686 )
      My Nvidia 5900 Ultra bit the dust early last year and I was forced to pull out my old Creative Labs PCI TNT2. I run a 21" CRT at 1600x1200 at 85 Hz, and the display was SO bad with the TNT2 that I had to drop it to 1024x768 at 85 Hz before the screen was really usable. Older cards often just don't have the RAMDACs to keep up with nice monitors.
      By the way, if you want crisp, flicker-free 2D, why not just get a Matrox card? Nvidia and ATI are catering to a specific market, so a 2D Matrox card might just suit your
    • You realize your problem was the "no-name" part and not that it was a GF4mx, right?
      • You realize your problem was the "no-name" part and not that it was a GF4mx, right?

        Yes, I am not as green as I'm cabbage-looking.

        However, this alerted me to the wide difference in quality between graphics cards. Nowadays reviews often say little about the crispness of the display and a lot about the Doom 3/UT etc. framerate.

        Unless I just stick to Creative Labs (assuming that every single one of their products is as good as the TNT2 Ultra I bought over four years ago: a false premise), what should I buy?

        Ther

        • If all you're looking for is 2D, go with Matrox.

          For 3D, a good brand of card is more important than the GPU: it's the analog output filter design and component quality that matter.

          Note that it is possible to remove or reduce the output filter, giving sharper images. It just takes some surface-mount soldering work. I did this with my card, and it made a huge difference.
        • I bought a Creative GF2 MX card a while back. The thing didn't come with a fan or heatsink, just a bare chip, so it'd lock up when running a graphics-intensive game. I had to buy a GPU fan to stick on it to make it run stably. I remember having other problems with it as well, though I can't remember what they were, but I do specifically remember vowing never to buy a Creative graphics card again. Part of the problem might have been that it WAS just an MX card, but it soured me enough to no longer like Creative.
    • I just 'upgraded' from an original-series RADEON to a RADEON 7500 from Crucial. There's virtually no difference performance-wise, but the newer GPU doesn't need a fan, and the chip count on the 7500 board is much lower.

      I'm a VERY happy camper, as ATI seems to always produce excellent-quality picture on my big Hitachi CRT.

      I'm very sensitive to bad DACs on video cards; I had to toss an EPIA board and an Nvidia card because I could see the 'blurriness' on my monitor, but the ATI cards have always done me ri
    • So... you're comparing a no-name GeForce4 MX440 (budget GPU and brand) to a generally good-quality, name-brand TNT2 Ultra (Creative).

      The output filtering and general board components can have a huge effect on output quality, as well as on board life, obviously. Maybe if you got a name-brand GeForce4 or Radeon it would be a fair comparison. Cheap no-name hardware is generally bad whether it's got a newer GPU or not.
      • I kind of assumed that nowadays all the boards were the same other than the brand name on the box. I thought they were all made by either nVidia or ATI and rebadged, then shipped with different free games, cables and mousemats. Obviously I was wrong, which I freely admit, since I am not perfect, unlike the whinging prima donnas and know-it-alls around here.

        My question still stands: which "brands" are good and which are "bad"? Or are you too chicken to say in case you get sued?

  • Getting a bit old at this point, but it runs fanless. PowerColor (not the best brand name in video) also sells a low-profile version for those bastardized Dell desktop cases.

  • The trend in graphics hardware these days seems to be to draw more power and create more heat in order to get faster processors and push more polygons. Yet in the CPU arena, chips like the Via C3 and Epia, Transmeta Crusoe and Astro, Intel Pentium M, and IBM/Motorola PowerPC (G3-G5) seem to favor more performance per watt and cooler running without significant performance loss.

    Have you ever used a PC based on a Transmeta Crusoe? They have significantly poorer performance than the CPUs they are designed to replace.
  • Low Power GPU (Score:2, Informative)

    by DrYak ( 748999 )
    Three words: Tile-Based Rendering.

    PowerVR [powervr.com] used to make a GPU with the transistor count of a Voodoo 2, but with the power of the GeForce 2 of its day.

    These days, they are using this technology to build very low-power GPUs for embedded systems.

    But they have announced that they will soon start building GPUs for SEGA's arcade systems again.
    Let's just hope they'll soon build a PC derivative of that arcade GPU.
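
    For anyone who hasn't run into the idea, here is a very simplified sketch of the binning pass that makes tile-based rendering frugal (all names and numbers below are made up for illustration): geometry is sorted into small screen tiles first, and each tile is then shaded entirely in a tiny on-chip buffer, so the power-hungry traffic to external framebuffer memory drops dramatically.

    # Minimal sketch of a tile-based renderer's binning pass (illustration only).
    TILE = 32  # tile size in pixels; real tile-based GPUs use similarly small tiles

    def bin_triangles(triangles):
        """Map each screen-space triangle to the tiles its bounding box overlaps."""
        bins = {}  # (tile_x, tile_y) -> list of triangle indices
        for i, tri in enumerate(triangles):  # tri is three (x, y) vertices
            xs = [v[0] for v in tri]
            ys = [v[1] for v in tri]
            for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
                for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                    bins.setdefault((tx, ty), []).append(i)
        return bins

    # Each tile is then rasterised independently against only its own triangle list,
    # with colour and depth kept on-chip until the finished tile is written out once.
    tris = [((5, 5), (60, 10), (20, 70)), ((100, 100), (120, 110), (105, 130))]
    print(bin_triangles(tris))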
