Energy Efficient Graphics Processors?
An anonymous reader asks: "The trend in graphics hardware these days seems to be to draw more power and create more heat to get faster processors and push more polygons. Yet in the CPU arena, chips like the Via C3 and Epia, Transmeta Crusoe and Astro, Intel Pentium M, and IBM/Motorola PowerPC (G3-5) seem to favor more performance per watt and cooler running without significant performance loss. Is this just because of the nature of the CPU versus the GPU? I understand a GPU die is almost entirely reserved for calculation, while only 20% or so of a CPU is. Or are the graphics chip makers merely refusing to innovate and take routes that would rein in out-of-control energy consumption because of the race for more polygons? What kind of architectural changes could be implemented to alleviate graphics card power gluttony?"
Unfair comparison (Score:4, Interesting)
Unfair unfair comparison (Score:1)
Re:Unfair unfair comparison (Score:3, Insightful)
Re:Unfair unfair comparison (Score:1, Insightful)
Re: (Score:1)
Heatsinks for GPUs (Score:2, Interesting)
Re:Heatsinks for GPUs (Score:5, Insightful)
Theoretically, the more physical cooling you can give a chip, the more energy it can suck up (i.e. the more heat it can dissipate). If anything, having less room for cooling should force energy efficiency (so that they don't have to dissipate as much heat).
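A back-of-the-envelope sketch of that relationship (Python; the thermal-resistance figures are illustrative, not measured from any real cooler):

    def max_power(t_junction_max, t_ambient, theta_ja):
        """Max sustained watts for a given junction-to-ambient resistance (C/W)."""
        return (t_junction_max - t_ambient) / theta_ja

    big_heatsink   = max_power(100, 35, 0.5)   # roomy tower cooler, ~0.5 C/W -> 130 W
    cramped_cooler = max_power(100, 35, 1.6)   # small card-mounted cooler, ~1.6 C/W -> ~41 W
    print(f"big heatsink: {big_heatsink:.0f} W, cramped cooler: {cramped_cooler:.0f} W")

Less room means a higher thermal resistance, which means less power the chip can be allowed to burn.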
that's the theory. (Score:2)
Sadly, it is easier and cheaper to throw more raw power at the problem than to design something more efficient.
Re:that's the theory. (Score:2)
Re:that's the theory. (Score:2)
In this case you have two alternatives: use a faster fan or use a bigger fan. Because there isn't much room for a bigger fan, one has to use a faster fan. Faster fan = more noise.
Re:that's the theory. (Score:2)
Re:that's the theory. (Score:2)
What I want to say is that the space (or lack of it) won't force energy efficiency (and it doesn't); it just forces more brutal cooling methods (i.e. faster fans).
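The standard fan affinity laws put rough numbers on that trade-off (Python sketch; the 50*log10 noise rule is a common approximation, and the RPM/diameter figures are made up):

    import math

    def rpm_for_same_airflow(rpm, d_old, d_new):
        # Affinity law: airflow ~ RPM * D^3, so a smaller fan must spin much faster.
        return rpm * (d_old / d_new) ** 3

    def noise_delta_db(rpm_old, rpm_new):
        # Common approximation: noise rises ~50*log10(speed ratio) dB.
        return 50 * math.log10(rpm_new / rpm_old)

    needed = rpm_for_same_airflow(1200, 80, 40)   # 80 mm -> 40 mm fan: 9600 RPM
    print(f"40 mm fan needs {needed:.0f} RPM "
          f"(+{noise_delta_db(1200, needed):.0f} dB vs the same fan at 1200 RPM)")

Halving the fan diameter costs roughly 8x the RPM and tens of decibels, which is exactly the screaming little GPU fan everyone knows.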
Re:Heatsinks for GPUs (Score:5, Interesting)
Why don't they simply mount the GPU to the other side of the board to allow a much larger heatsink? I think this is either a design tradition or a limitation of the pick-and-place assembly machines, because there's no technical reason not to. I suppose if taken to an extreme, it could lead to physical fit problems in certain cases, but let's not go that far.
Better for watercooling too (Score:2)
This will also help with the watercooling hose layout. It's pretty hard to hook up a waterblock between two cards and keep the hoses from collapsing.
Re:Heatsinks for GPUs (Score:2, Interesting)
Filter capacitors. I've often found them mounted just behind the video card. A combination of a heatsink on the wrong side, along with these, would be a big problem, even if they didn't touch (most capacitors don't appreciate heat).
Re:Heatsinks for GPUs (Score:2, Insightful)
Re:Heatsinks for GPUs (Score:2)
Yes, portable and compact cases present an additional challenge. M
GFFX 5200 (Score:4, Interesting)
Re:GFFX 5200 (Score:1)
G5? (Score:3, Insightful)
Re:G5? (Score:5, Informative)
The Opteron "HE" is classed at 55 watts (I suppose that is at 2.0 GHz or so)
The P4 extremely expensive edition dissipates 103 watts at 3.4 GHz.
So in comparison to other desktop processors it does fairly well. Now there are efficient G4-class processors coming from Motorola; the MPC7447 is said to dissipate 10 watts at 1 GHz.
I am not comparing any of these processors GHz-to-GHz, because we all know that is not an accurate method of comparison. But I think it's wrong to classify the MPC7447 as a desktop processor, or even as a processor for a desktop-replacement laptop. But then again, maybe that's because after using OS X on a G5 I'll never take Motorola seriously (for the desktop) again. That's not to say Motorola is a bad company or that their chips are bad! I develop almost exclusively on them at work, but then again I am an embedded developer.
Re:G5? (Score:2)
- 3.2 GHz Pentium 4 Extreme Edition: 1570 Base SpecInt (couldn't find the 3.4 GHz version, sorry).
- AMD Opteron (TM) Model 146 HE: 1289 Base SpecInt.
I couldn't find SpecInt figures for the PPC970.
Anyway, the P4 is more than 20% more powerful than the Opteron (20% being the figure for the 3.2 GHz P4EE) and consumes 87% more power.
But there is a problem with these figures: the performance/power ratio is not constant: it is easy t
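Working the poster's own numbers into a performance-per-watt ratio (Python; note the wattage quoted above was for the 3.4 GHz P4EE while the score is for the 3.2 GHz part, so treat these as rough):

    # SpecInt base score and quoted wattage per chip, from the parent posts
    chips = {
        "P4 EE 3.2 GHz":  (1570, 103),   # 103 W was quoted for the 3.4 GHz part
        "Opteron 146 HE": (1289, 55),
    }
    for name, (score, watts) in chips.items():
        print(f"{name}: {score / watts:.1f} SpecInt per watt")
    # -> ~15.2 for the P4 EE vs ~23.4 for the Opteron HE: the "slower"
    #    chip wins comfortably once power enters the ratio.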
Re:G5? (Score:2)
Why do I feel the need to throw the natural log of available compilers into this equation? ;)
Re:G5? (Score:2)
The G5 is from IBM
Alex
Re:G5? (Score:2)
What's your point?
Re:G5? (Score:2)
Are they really that terrible compared to the G5? Any details?
Re:G5? (Score:3, Interesting)
It's just like when Mac users complained about the 'hot' G4 PowerBook; it wasn't much different than high-end P4 laptops of its day, but Mac users expected cooler ma
You've already answered your own question (Score:4, Interesting)
You've compared high-end 3D desktop gamer cards, which are excessive on heat and power, to CPU chips designed for low-power, low-heat situations. The difference isn't nearly as pronounced with a more valid comparison on the CPU side, say a high-end P4EE or Athlon64/Opteron. Also, as you've stated, the GPUs are almost entirely dedicated to high-power processing, whereas the CPUs spend a lot of their silicon on other things. A high-end GPU is generally superior to a high-end CPU in terms of raw computing power. Therefore, it needs more power and makes more heat. If you forced Intel or AMD to build a CPU for you right now that had the raw compute potential of the latest high-end cards, they'd have a hard time doing so without it being just as hot and power hungry. All these things scale over time, but the demands of the user and his software scale up to meet them as well.
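A rough illustration of that raw-compute gap (Python; the pipeline counts and clocks are illustrative, in the ballpark of 2004-era high-end parts rather than official specs):

    # Pixel-op throughput: many parallel pipelines beat raw clock speed.
    gpu_pipelines, gpu_clock = 16, 400e6   # e.g. a 16-pipe GPU at 400 MHz
    cpu_issue,     cpu_clock = 1,  3.4e9   # ~1 pixel-ish result/cycle at 3.4 GHz

    gpu_ops = gpu_pipelines * gpu_clock    # 6.4 billion pixel ops/s
    cpu_ops = cpu_issue * cpu_clock        # 3.4 billion ops/s, and that's generous
    print(f"GPU: {gpu_ops/1e9:.1f} Gops/s vs CPU: {cpu_ops/1e9:.1f} Gops/s")

The GPU gets its lead from parallel units, and every one of those units burns power.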
Re:You've already answered your own question (Score:2)
But games can't get enough GPU yet, so ATI and nVidia have to keep pushing, and the demand is there in the market to keep maki
Combined sources. (Score:1)
The relatively frequent introduction of new chips doesn't leave much time to push for optimisations at the hardware level to make more efficient chips.
Game makers push the bleeding edge without performing the optimisations they were forced to make in the past. Gamers want cards that play the games full out.
"More Power" and "Bigger is better" are phrases that need to be thrown out of the public's mindset.
Questions: Gamer Laptops/Lower Power (Score:3, Insightful)
What's the power consumption like on a GPU that isn't doing much? Do they sleep like some CPUs can, or are they always going at full bore?
3D for laptops... (Score:4, Informative)
That's what games want (Score:2)
Apples to Oranges?? (Score:2, Informative)
I offer this document as a reference. [ibm.com]
According to this, a PowerPC 970FX (the G5 being used in Apple's Xserves and the chip that will be
Re:Apples to Oranges?? (Score:2)
Moving on to video cards, both ATI's and NVIDIA's top-line products target consumers who are wi
Re:Apples to Oranges?? (Score:2)
GPUs on the other hand are only a critical component in a small number of situations, like a gamer's desktop PC or a CAD workstation. Nobody cares about the GPU in a server or embedded system, which is why you'll find ancient S3 GPUs and similar in these types of systems.
So in a sense there are already power-efficient GPUs for systems that require them: the G
same reason P4s put out so much heat (Score:2, Informative)
They're refusing to take routes that would rein in energy consumption, specifically lowering clock speeds. GPU design has moved in leaps & bounds. They've been making architectural changes in order to increase performance w/out sucking down more energy, like more pipelines, wider memory bandwidth, and so on...
The biggest reason heat ou
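The "more pipelines instead of more megahertz" route the parent describes works because CMOS dynamic power scales roughly as P = a * C * V^2 * f, and a lower clock tolerates a lower supply voltage. A toy sketch (Python; the voltage and frequency figures are illustrative):

    def dyn_power(units, cap, volts, freq):
        # a*C*V^2*f per unit, summed over parallel units
        return units * cap * volts**2 * freq

    narrow_fast = dyn_power(units=1, cap=1.0, volts=1.5, freq=500e6)
    wide_slow   = dyn_power(units=2, cap=1.0, volts=1.2, freq=250e6)  # same throughput

    print(f"power ratio (wide/narrow): {wide_slow / narrow_fast:.2f}")
    # -> 0.64: twice the pipelines at half the clock do the same work
    #    for roughly two thirds of the power.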
Display Quality Too (Score:4, Interesting)
I tried a no-name GeForce 4 MX440 a couple of years back, but the display quality was awful. It was so poor I had to downgrade to 1280x1024 on my 19" Trinitron screen. After a few months the card broke and I went back to my TNT2 Ultra (Creative Labs) and back up to 1600x1200.
I was thinking about getting something fanless and by nVidia, since their (binary-only) drivers are superb on Linux (I don't do the ideological zealotry as much nowadays).
High-performance 3D is nice when you need it, but nowadays stuff is so powerful for under $100 that there's not much point to buying something really expensive. Some of us want a crisp, high-resolution display, flicker-free (70Hz+) without a great big noisy fan.
Re:Display Quality Too (Score:3, Interesting)
By the way, if you want crisp, flicker-free 2D, then why not just get a Matrox card? Nvidia and ATI are catering to a specific market, so a 2D Matrox card might just suit your
Re:Display Quality Too (Score:2)
Re:Display Quality Too (Score:2)
Yes, I am not as green as I'm cabbage-looking.
However, this alerted me to the wide difference in quality between graphics cards. Nowadays reviews often say little about the crispness of the display and much about the Doom 3/UT etc. framerate.
Unless I just stick to Creative Labs (assuming that every single one of their products is as good as the TNT2 Ultra I bought over 4 years ago: a false premise), what should I buy?
Ther
Re:Display Quality Too (Score:2)
For 3D, a good brand of card is more important than the GPU--it's the analog output filter design and component quality that matter.
Note that it is possible to remove or reduce the output filter, giving sharper images. It just takes some surface-mount soldering work. I did this with my card, and it made a huge difference.
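For a sense of why the filter matters more as resolution climbs, here's a rough pixel-clock estimate (Python; the blanking-overhead factors are approximations in the spirit of VESA timings):

    # Pixel clock ~= active pixels * blanking overhead * refresh rate.
    def pixel_clock_mhz(h, v, refresh, h_blank=1.30, v_blank=1.05):
        return h * h_blank * v * v_blank * refresh / 1e6

    print(f"1280x1024@75 Hz: {pixel_clock_mhz(1280, 1024, 75):.0f} MHz")
    print(f"1600x1200@75 Hz: {pixel_clock_mhz(1600, 1200, 75):.0f} MHz")
    # -> ~134 MHz vs ~197 MHz: a cheap output filter that rolls off
    #    around the first figure looks fine at 1280x1024 but smears
    #    fine detail at 1600x1200.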
Re:Display Quality Too (Score:1)
Re:Display Quality Too (Score:1)
Re:Display Quality Too (Score:2)
I'm a VERY happy camper, as ATI always seems to produce an excellent-quality picture on my big Hitachi CRT.
I'm very sensitive to bad DACs on video cards; I had to toss an EPIA board and an Nvidia because I could see the 'blurriness' on my monitor, but the ATI cards have always done me ri
Re:Display Quality Too (Score:1)
The output filtering and general board components can have a huge effect on output quality, as well as board life, obviously. Maybe if you got a name-brand GeForce 4 or Radeon it would be a decent comparison. Cheap no-name hardware is generally bad whether it has a newer GPU or not.
Re:Display Quality Too (Score:2)
My question still stands: which "brands" are good and which are "bad"? Or are you too chicken to say in case you get sued?
ATI Radeon 9200 (Score:1)
No performance loss (Score:2)
Have you ever used a PC based on a Transmeta Crusoe? They have significantly poorer performance than the CPUs they are designed to replace.
Low Power GPU (Score:2, Informative)
PowerVR [powervr.com] used to make a GPU with the transistor count of a Voodoo 2 card but the power of the GeForce 2s of its time.
These days, they are using this technology to build very low-power GPUs for embedded systems.
But they announced that they will soon start building GPUs for SEGA's arcade systems again.
Let's just hope they'll soon build a PC derivative of this arcade GPU.
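PowerVR's efficiency comes from tile-based deferred rendering: resolve visibility in small on-chip tiles first, then shade each pixel once. A toy sketch of the shading work saved (Python; the overdraw figure is illustrative):

    # Shading work: immediate-mode vs tile-based deferred rendering.
    PIXELS = 640 * 480
    OVERDRAW = 4                           # average triangles covering each pixel

    immediate_shades = PIXELS * OVERDRAW   # shade every covering fragment
    deferred_shades  = PIXELS              # shade only the visible fragment

    print(f"fragments shaded: {immediate_shades} vs {deferred_shades} "
          f"({OVERDRAW}x less work), with framebuffer traffic kept in "
          f"on-chip tile memory instead of external DRAM")

Less shading work and less external memory traffic is exactly the kind of architectural change the original question asked about.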