Hardware

Why Do Graphics Cards Cost So Much? 85

Tamor writes "As an avid PC games player I'm locked into the perpetual hardware upgrade cycle like everyone else, but one thing really irks me. While other hardware has come down in price, graphics card pricing has spiralled beyond belief. Not only are graphics cards usually the single most expensive item in a gaming PC, they don't seem to be subject to the usual market forces. Instead of new generation cards forcing down the price of old cards, the old cards are simply phased out, and the likes of nVidia have a range wide enough to keep the high-end cards at the same prices for the foreseeable future.

Why is this? Why does a top of the range graphics card cost so much more than an entire PS2 or X-Box system? Is it the lack of competition in the market following the demise of 3DFX, or are there other forces at work? What do slashdotters thing about this pricing?"
This discussion has been archived. No new comments can be posted.

  • What do slashdotters thing about this pricing?

    /me attempts to find a verb. fails.
  • by Jerf ( 17166 ) on Wednesday October 30, 2002 @08:28PM (#4569718) Journal
    Maybe you need to look outside of your local chain megastore.

    I don't want to link to their website because there's no reason for them to sustain the bandwidth hit, but my local little chain store has a TNT2 32MB for $40, and that's still a lot of graphics card if you're not a FPS player. Heck, my little TNT2 8MB I got at that price a year ago is still respectable for most uses.

    They have a pretty smooth progression from that up to top-of-the-line cards, such as a GeForce2 MX200 32MB for $60, a GeForce4 MX 64MB DDR for $120, and so on up to a $350 Ti4600 128MB. In all, there are 8 nVidia-based choices and 10 ATI choices ranging from $60-$400.

    I don't think the problem you complain about exists for real.
    • GeForce2 MX200 32MB for $60? Please tell me you're joking. I can get a GeForce2 Ti 64MB for ten bucks less than that right now [pricewatch.com]. Your "local little chain store" is eating your wallet for a lunch.
      • Local chain store eating your wallet? I don't think so. I actually sat and figured out the cost of buying a part from a place on Pricewatch and from CompUSA... guess what? They were about the same by the time you add shipping.

        I always buy the boxed product, mostly because sometimes the boxed product has some perks with it. Sure, it may be an old game or something, but it's still a perk. Plus I have never felt that comfortable buying a board in a bag from Joe Blow's computer store in a town I have never heard of. Also, if you never spend a dime at CompUSA or any other local computer store, they will eventually pack it in if enough people do it. Then when you're in dire need and can't wait for a product to ship, they are not there. I almost never buy any computer part mail order. I like sinking my dollars as much as I can into places that are here; it's kind of my insurance that I have a nice store locally that I can go to any time, instead of having to wait for a part or item to get to me. I also HATE having to wait when I know I'm getting a new toy. I have waited long enough in some cases, why should I wait longer? :)
      • For that extra money, he is probably getting extra service and security.

        I've been an avid Pricewatcher, but I've started eating significant price differences by going to Best Buy instead because it seems as if the quality of Pricewatch vendors has been dropping like a rock - Return policies have become much worse, and the number of items I've received DOA from Pricewatch vendors has skyrocketed in the past few months.

        Yes, there are better reputable vendors on Pricewatch (Like NewEgg), but often they don't have nearly the price advantage that others have, and once they do, I'm willing to spend the $10-20 extra for an item that I can return for an exchange in 30 minutes rather than 2 weeks. (Thank God I bought my LCD monitor at BB - It had a defect, I brought it back the next morning and had a perfect replacement in 30 minutes.)
  • It's simple really (Score:3, Insightful)

    by arcadum ( 528303 ) on Wednesday October 30, 2002 @08:29PM (#4569722)
    The prices are set according to what the market will bear.

    If a critical mass of perpetual upgraders suddenly became content and the manufacturers were left with only new PC sales for income, we would see a price war like the one in the CPU market.
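A toy sketch of "what the market will bear" (all demand figures invented for illustration): a vendor simply picks whichever price maximizes revenue, and as long as enough upgraders keep paying, that price stays high.

```python
# Revenue-maximizing price under hypothetical demand (figures invented).
demand = {  # price in dollars -> estimated units sold
    100: 50_000,
    200: 40_000,
    300: 32_000,
    400: 28_000,
}

# Pick the price with the highest total revenue, not the lowest viable one.
best_price = max(demand, key=lambda p: p * demand[p])
print(best_price)  # 400
```

With this made-up demand curve, dropping the price never pays; only a collapse in high-end demand would trigger a CPU-style price war.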

  • by drivers ( 45076 ) on Wednesday October 30, 2002 @08:30PM (#4569726)
    I think it's really a matter of realizing you don't have to have the best video card or all the hottest new games. "You are not your AGP graphics card."

    If nVidia isn't giving you what you want, buy a different brand. That is part of the problem... so many people are willing to pay top dollar for the best available card that all trailing companies go out of business or just can't compete.

    Actually, I don't see the problem here even with nVidia... I get by fine on cheap video cards from them; nVidia typically offers a top of the line card and a budget line... I buy the budget line... in fact I'm using an nForce chipset built into the motherboard.

    Another thing is that you don't have to buy a new card that often. The longer you wait and make do with what you have the more satisfying the upgrade is anyway.
  • ...because people will pay that much for them
  • not true... (Score:4, Informative)

    by dotgod ( 567913 ) on Wednesday October 30, 2002 @08:35PM (#4569754)
    Instead of new generation cards forcing down the price of old cards, the old cards are simply phased out

    Wrong... check out PriceWatch [pricewatch.com]. You can still get old stuff from some suppliers for a decent price. For example, there's a GF3 Ti200 for 70 bucks and a GF2 Ti for 50 bucks. Even the prices on some of the newer GF4's are reasonable.

  • Top of the Range (Score:4, Insightful)

    by greenhide ( 597777 ) <`moc.ylkeewellivc' `ta' `todhsalsnadroj'> on Wednesday October 30, 2002 @08:37PM (#4569759)
    I think it has to do with the perceived "users" of these cards.

    Like the poster above pointed out, there are perfectly acceptable graphics cards out there for very reasonable prices.

    However, when you want the "top of the line" card, you're making a different kind of statement. It's similar to those who purchase top of the line stereo equipment. I have a cheap bookshelf system I bought at K-Mart for around $150. It's a perfectly fine stereo system, and I listen to it all the time. However, if I wanted a top of the line stereo system, I would have to pay at least five times as much, if not more. The price discrepancy is based on quality and workmanship, but also... on status. Having a really souped up stereo system is a statement. Part of your purchase price goes into that statement.

    The same thing goes for graphics cards. Once you get beyond "normal" use and start wanting to have "the best of the best", expect to pay more, not just for the cost of the item itself, but for the additional "status" benefits that it allows you.

    This status thing applies to every aspect of commercial life. Think t-shirts. Just how much better is a Versace t-shirt than the kind you can get at a chain department store?

    For all I know, I may be totally wrong. Maybe the price is more because the components or manufacturing process is more expensive. But I think that if the cards were lower priced, some people wouldn't believe that they *were* top of the line.
    • I think this is true. While basic economic theory says that demand increases as price decreases, there are many counterexamples. For instance, people will buy more steak if the price is higher, up to a point. People deem some products less valuable if they are cheaper... The moral of the story: an easy way to get rich is to exploit these people.
  • While other areas may be able to justify the need for faster processors, more RAM, bigger hard drives and the like, 90% of the people buying this high end equipment can make do with a reasonably priced video card or with the one built into the motherboard. Gamers are well known to want to push the envelope, and while the rest of their components might somewhat resemble a business system, their video card will not. So, NVidia will likely not sell as many GeForce8MXRX32 cards with 1 gig of RAM, but they still need to make up the R&D money, and let's face it -- you just said yourself that you pay for it.
  • by nelsonal ( 549144 ) on Wednesday October 30, 2002 @08:42PM (#4569778) Journal
    They cost significantly more to manufacture than the graphics subsystems in a console, for several reasons. First, lower sales figures. I have seen some market share data suggesting that high end cards (priced above 300 USD) have less than 1% of the market, but I don't know if that is revenue or units; if it's units, that is only about 1 million chips/year, and even fewer if it's dollars. One design will go into many consoles (Sony recently announced that sales of PS2s reached 40 million) over a very long period of time, so you can spread your fixed costs over quite a few more chips. Also, consoles are used at very low resolutions, so they are not as technically rigorous as PC cards -- even a three year old card will play new games pretty well at 640x480. The consoles do have pretty rigorous AA, though, especially the X-Box; I think the X-Box video system is similar in fill rates to a GeForce 4 MX, but has better AA.
    Finally, Sony and Microsoft are bigger buyers than you and I are, and they get lower prices for buying in volume.
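The volume argument is easy to make concrete. With rough, invented numbers -- the same fixed R&D budget spread over roughly 1 million high-end chips versus a PS2-scale run of 40 million consoles -- the per-unit burden differs by a factor of 40:

```python
# Fixed-cost amortization at different volumes (all figures hypothetical).
def rd_per_unit(rd_cost_dollars: float, units: int) -> float:
    """Share of the fixed R&D budget each chip must carry."""
    return rd_cost_dollars / units

high_end_card = rd_per_unit(100_000_000, 1_000_000)   # ~1M chips/year
console_gpu = rd_per_unit(100_000_000, 40_000_000)    # PS2-scale volume

print(high_end_card)  # 100.0 dollars of R&D per chip
print(console_gpu)    # 2.5 dollars of R&D per chip
```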
    • See, the reason the Xbox is so awesome at fill rates and AA is that it's running at way below standard useable resolutions. I'm talking 480x320, and that's not exactly a lot of pixels. Hell, you can get by on a cheap card at friggin low resolution, but you lose a lot. I find myself shooting at elevators and such in Unreal Tournament on my four-year-old iMac because I can't tell the difference between them and people. You must realize that there is a resolution cap on the XBox, and the video card in it is acceptable at that resolution. Those of us running UT 2003 at 1600x1200 with 4x AA and high detail, on the other hand, aren't using the XBox.
      • D'oh. /me slaps himself around with a large trout
      • That's not true; every Xbox game outputs at NO LESS THAN 640x480. While many TVs may not be able to show all those lines of resolution, that is the output from the box. Some games even use higher resolutions like 800x600 and 1024x768, and many use AA to make up for the somewhat low resolutions. Plus, did you forget that the Xbox is GeForce 3 based? How is THAT a cheap card? That's why Microsoft was losing over $100 per console BEFORE they lowered their price to $199 (from $299). Production prices are going down for them, so I doubt they are losing over $200 per console now, but they are definitely still losing money. Plus, the Xbox is a dedicated machine, unlike your computer, and it's much more efficient (clock for clock, pixel for pixel) than your PC. Just get your facts straight.
    • I think the XBox video system is similar in fill rates to a Gforce 4 MX, but has better AA.

      The X-Box doesn't have better anti-aliasing. You probably need to hit the TV on the side or something. Also, check your cables and make sure they are pushed in all the way.

      Now, a bottle of Sterling Tanqueray will give you really good anti-aliasing, it just might not be limited to just games on your TV...

      • Personally for my own AA needs I prefer a nice stout or white ale, but there certainly isn't anything wrong with Tanqueray. In my reference to the PC chips, I was actually remembering one of several articles that tried to compare the X-box to the GeForce line that were so popular around the launch. I don't actually own one yet, I am planning to pick up one of the bundles with the two games and small controller after I have all the new car expenses under control.
    • Geez... why is this only score:3?

      Basic economics, here. Supply and Demand.

      You're ticked that it costs so much, but you own one, right?

      There's your answer.
  • First of all, PS2 and XBOX systems have custom chips designed to be cheap to produce and are married to logic boards that eke every last bit of performance out of them. Couple that with long production runs and you get a cheap per unit cost. PS2's and XBOX's, unlike PC's, are not locked into the hardware upgrade cycle. Instead, they have a product lifespan of just a few years.

    That being said, I guess your question is really: why pay 400 bucks for a graphics card? I don't know. The last card I bought was a Voodoo Banshee. I guess you can buy a real good card (64MB and DirectX 8.1) for less than a hundred bucks, and that probably gets you PS2 or XBOX quality graphics, more or less.

    The real high priced items are for the cutting edge, which people are always willing to pay for. Look at TVs, stereo components, whatever. People always pay a premium to be an early adopter, or to be in the top 5 or 10 percent, or just to brag to their friends that they have that nifty new Invadeon 5 million card that can play Morrowind at more than 10 FPS. Whoop-tee-do!

    My advice. Buy cheap card. Play old games.
    • First of all, PS2 and XBOX systems have custom chips designed to be cheap to produce and are married to logic boards that eke every last bit of performance out of them. Couple that with long production runs and you get a cheap per unit cost. PS2's and XBOX's, unlike PC's, are not locked into the hardware upgrade cycle. Instead, they have a product lifespan of just a few years.

      The mass production angle is an important one, but not the only one. PS2 is (or at least was) sold at a break-even point. XBox was sold at a loss. Neither have enough GPM (gross profit margin) to recoup their development expenses. Thus, it's important to remember that PS2 and XBox are subsidized by software sales, since their respective manufacturers collect licensing fees on games sold for those platforms. In contrast, NVidia et al. do not charge heavy licensing fees to game developers to get them to use their cards. Heck, it's more like the card developers go out and evangelize their specific technology tweaks to try to get buy-in -- they practically beg for people to develop to their cards, as opposed to slapping them with onerous license fees.

      --Joe
      • PS2 did NOT sell at a break even point when it was released, it was sold at a loss. After cost savings from a long production run, they eventually reached a break even point (not sure if they are still there after the price drops). Every console that I know of since the N64 has been sold at a loss upon introduction.
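The subsidy model described above can be sketched with rough, invented figures: a console sold at a loss only breaks even after enough licensed games ship -- revenue a card maker like nVidia never sees.

```python
# Break-even game sales for a subsidized console (all figures invented).
import math

loss_per_console = 100.0  # dollars lost on each hardware sale
royalty_per_game = 7.0    # license fee collected per game sold

games_to_break_even = math.ceil(loss_per_console / royalty_per_game)
print(games_to_break_even)  # 15
```

A graphics card, with no downstream royalty stream, has to earn its entire margin at the register, which is one reason the sticker prices aren't comparable.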
  • by bellings ( 137948 ) on Wednesday October 30, 2002 @08:46PM (#4569804)
    What do slashdotters thing about this pricing?

    Frankly, I think that you are smoking waaaaay too much crack.

    The price on the high end of the consumer market has slowly crept up in the last five years, from about $200 for the top-of-the-line 3dfx Voodoo when it came out, to about $300 for the top-of-the-line nVidia GeForce4 today.

    But on the low end, the prices are as cheap as ever, while the performance on the low end is simply incredible. A GeForce4MX for $75 today is going to be faster than the best $250 card you could buy two years ago.

    There are two reasons why you can't walk into BestBuy and get an old TNT2 Ultra for $35. First, because just handling quality control and returns makes it not worth their time to sell you a card that cheap. Second, because despite the fact that the TNT2 was fair to decent two years ago, it is just butt-slow by comparison today. The only people buying boxed 3D cards are gamers, and they're just too smart to do something that stupid.

    If you want to see how performance has improved in the last few years, check out this Tom's Hardware guide to VGA cards [tomshardware.com]. And you're asking why someone wouldn't sell you one of the cards near the bottom of the chart? The question you should be asking is what kind of moron would be stupid enough to buy one of them?
    • I'm stupid enough. If your only metric for choosing a card is the current state of the art, then yeah maybe some of those late model cards look kind of shabby now, especially to gamers. But not me. They weren't crap then, and if they meet needs now they still aren't crap. Personally, I'm not a gamer, but I'd like to find a nice minimalist card that can get my Mac G4 to run Quartz Extreme graphics. I'm interested, but not to the point that I want to spend $200 or more on it, which as near as I can tell is what the minimum hardware costs at this point. If it fell to maybe a quarter of that price I'd be willing to go for it, but for the work I do -- where almost all of it is in a unix shell anyway -- it doesn't make sense to spend that kind of money on video hardware, even if it would make things prettier. But for $200, that's a big enough fraction of the way to a whole new computer that I personally just don't see the point.

      So please, just because someone has different priorities than you do doesn't mean that they're stupid.

      • Here y'go, my friend.

        As I've said before in this thread, the card companies have got their pricing a little beyond the range of mere mortals, whether they purchase Macs or not...

        Here's a link to show you how to flash a Radeon 8500 ($229 retail in the Mac market) [eon.com.au], and here's a link to show you where to get it for $89 (US) [ncix.com]. It's the exact same card. Just slightly over $140 difference in the retail price once you factor in shipping.

        I made a point, a lot further down in the thread, and I'm restating it here. The video card/peripheral vendors price their items just below their (*perceived*) point of pain for the markets they're targeting.

        Gamers (particularly Morrowind players -- as I've found the past week trying to play a damned game I *bought*...) seem to be the mid-high demographic, while Maya/Pro3D/Mac users seem to be the high end of their profile.

        For the record, I wish they'd cut that shit out -- so the people who didn't know any better [slashdot.org] could buy the same gear at the same price point [slashdot.org] without having to do the whole run-around.

        Yes, Virginia, you're being *SEVERELY* overcharged for your new über video card. The only thing to do about it is not buy it until the manufacturers' marketers realize that you've figured out their game -- or buy the Wintel versions (which are *still* ridiculously overpriced) and flash the ROMs.

        (In retrospect...)

        Wow, how self-referential was that? You'd think I've been out drinking on Halloween [portalofevil.com] or something... at least smoking some crack [macosxhints.com].

        --dr00gy
        • The Mac version is not overpriced. It costs more because they have to recoup the cost of driver development for the Macintosh. It's a problem of volume. Their costs for developing the BIOS (OpenFirmware FCODE, most likely) for a Mac are probably similar to the cost of the PC BIOS, possibly more, but there are fewer people buying it.

          The end result of your solution (just buy the PC card and flash it with the OF BIOS) is that there will not be an OF BIOS available at all, since it will no longer be profitable for ATI to develop it. Think about the consequences of your actions.

        • It's true; you'd think people would have found PriceWatch by now. I've been using it for years, I have yet to run into a single problem, and I've saved thousands of dollars, particularly on video cards and CPUs. On top of that, I don't have to deal with clueless salespeople, weird store layouts, or the hassle of having stuff out of stock, etc. I really feel sorry for those people getting charged twice as much for stuff at CompUSA.
    • The question you should be asking is what kind of moron would be stupid enough to buy one of them?

      A moron like me? I bought a 64MB nVidia MX 400 about 6 months ago, cheap. It runs all my games really fast, the drivers are excellent, and I am completely satisfied with its performance.

      Seems like the morons are the suckers who spend too much on expensive video cards that they don't need. Who needs 300 frames a second anyway?
      • I think gamers would want that many frames/sec (if they had monitors running at 300 Hz) when they go online to play against other humans. It's like when they play a game single player off the net, they'll push their video card to the limit so it looks purdy. But put them online with other people to play against, and the settings get scaled way back so they can get that extra edge from the smoothness and fluidity of the game.

        I suppose I should include myself in this.
  • In the long run, video cards will never out-price the most expensive PC peripheral ever to exist.

    The internet.
    • Surely monitors are the most expensive part of a PC, not the graphics card... I paid £350 for my monitor; even the most current graphics cards cannot be bought for that price.
      • Sure they can, a Radeon 9700 retails at $400. Besides, you can get a cheap 17" CRT for $100 these days. It all depends on what you want to spend your money on.
        • Re:hmmm (Score:1, Informative)

          by Anonymous Coward
          You're mixing £ and $, dude.

          Do you want to live in a society where you have to measure your money in pounds?
  • Well, one possibility is that new graphics chips come out every few months, whereas other components tend to change relatively little over longer periods of time.

    Consider the Athlon and the GeForce. The Athlon was initially released in the middle of '99. Up through the Palomino (over two years later), very little was changed about the design, with the exception of moving the L2 cache onto the die. I don't know if Thoroughbreds are different from the rest of the Athlons. Most Athlons today are Palominos (this is 3 years after the "classic" Athlon launch).

    The original GeForce (SDR) came out some time in 1999 or 2000, followed soon by the GeForce DDR and GeForce2. We are now up to the GeForce4. Some people might argue that the GF2 wasn't very different from the GF DDR, but even so, that is at least THREE times as many designs. This is not even considering the various "MX" lines, and the GF4 Ti4200 vs. the 4400 and 4600. (Admittedly, the Duron lines for AMD went through one revision and are a different design than the Athlon, unlike Celerons vs. P3's and P4's.)

    So.... nVidia is pumping out new designs (read: more R&D costs) more frequently, AND has a smaller market. Not everyone needs a fast video card (most people don't even know it matters) but EVERYONE buys into Intel MHz-marketing (you need a P4 for the internet, etc.)
  • Supply and demand.
  • Bandwidth (Score:2, Insightful)

    by bootprom ( 585277 )
    Bandwidth to graphics cards was a giant bottleneck up until a few years ago. Back in the ISA days it didn't matter how many pixels your GPU could crunch -- the problem was getting the info to it. Either the bus was too slow or the CPU couldn't generate it fast enough. Now that so much functionality has been offloaded to graphics processors and they can be fed information fast enough, it makes sense to have processing done there. A newfangled GPU has about as many transistors as a newfangled CPU -- why shouldn't they be in the same ballpark when it comes to price?
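Back-of-the-envelope numbers (approximate peak rates, not measured figures) show how hard that bottleneck was: raw 1024x768 32-bit frames at 60 fps need far more bandwidth than ISA could ever supply, while AGP 4x has headroom to spare.

```python
# Can the bus deliver raw frames fast enough? (rough peak figures)
def frame_bandwidth_mb_s(width: int, height: int, bytes_per_pixel: int, fps: int) -> float:
    """Bandwidth needed to push uncompressed frames, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

needed = frame_bandwidth_mb_s(1024, 768, 4, 60)  # ~189 MB/s
isa_peak = 8.0        # MB/s, roughly, for 16-bit ISA
agp_4x_peak = 1066.0  # MB/s, AGP 4x theoretical peak

print(needed > isa_peak)     # True: ISA is hopeless
print(needed < agp_4x_peak)  # True: AGP 4x has headroom
```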
    • I wonder how long it will take them to go to a socketed design for the GPU and memory - so you can do the ole' upgrade dance with that (unless the rest of the card is so cheap already, which the case may be)?

      I wish they would just go to a backplane design for PC hardware and get it over with - one card for CPU, another card for memory, another for sound - each card with upgradable memory and processing units, so that you could build a dual or quad machine easily, with as many graphics/sound/memory boards as needed. Make the bus AGP or something faster.

      This isn't a new idea; mainframes and minicomputers have used this design since, like, forever -- so why hasn't the PC done this? (Actually, the first hobbyist computers were like this as well -- Altair, Sol -- with the S100 bus. Then the Apple came along and changed all of that...)

      • Look for any info you can find on the GeForce 2/4 Go - I believe these have the GPU and memory and not much else on a chip-like substrate, everything else is on the system board and doesn't change with video card upgrades.

        GF4 Gos are $150 - That's DIRT CHEAP considering the performance that card delivers and the fact that it's a low-volume laptop part. Probably because 50% or more of the video card is on the motherboard.
  • The answer is that old graphics cards don't have much utility (at least to gamers), because old games don't have much utility. Actually, prices of old graphics cards DO go down... but since old graphics cards are not competition for new graphics cards (two old graphics cards will not let you play Quake3, whereas two old hard disks are more or less equivalent to a new one), I don't see the incentive for prices going down.

    Anyway, that's my wild guess. IANAMicroeconomist
  • by 0x0d0a ( 568518 ) on Wednesday October 30, 2002 @09:11PM (#4569945) Journal
    Okay, you're complaining about $400 graphics cards? What's the big deal? There are also $80 graphics cards out there. Buy those.

    "No, I need the best," you think? The same goes for most other markets that have a broad range of quality/price options. You can buy a high-end Porsche, but you're going to pay for it.

    You can get an $80 graphics card. The cost for that may well be running games a bit later than the $400 people do, or running with less resolution, or whatever. But that $80 graphics card destroys a few-year-old $300 graphics card, so it really isn't that big of a deal. You're getting a better deal these days -- you just can't buy top of the line. No biggie.
  • Graphics cards and chipsets are expensive. There's just no way around that. Current graphics chip die sizes make current CPU die sizes seem pathetically small. They have very wide, rather complex memory subsystems: most graphics cards now have a 256-bit bus to memory, four times what's common on the desktop. That means the PCBs themselves are quite expensive. The pace of the market is staggeringly fast: a new architecture every year, a process shrink every 6 months. Compare that to AMD having a new architecture every 4 years, and Intel every 6 or so. I believe nVidia has 3 parallel design teams in order to keep up the pace of releases. The market simply doesn't forgive anyone who doesn't hit these performance points. Take a look at how little the Matrox card is selling.

    Some companies are trying to move down. Trident's latest offering is actually quite clever at getting DX9 capability at a very low price.

    But the fact is, you've got a low volume market with expensive chips, boards, memory and design, where buyers punish mediocre offerings. That's just the way it is. If it were so easy to make a GF4 class card at a sub-$100 price point, don't you think someone would have done it by now to rake in the $$?
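The 256-bit-bus point translates directly into bandwidth, which is where much of the board cost goes. A rough comparison, using approximate Radeon-9700-class and PC2100-desktop clocks:

```python
# Peak memory bandwidth of a wide graphics bus vs. a desktop bus (approximate).
def bandwidth_gb_s(bus_bits: int, clock_mhz: float, ddr: bool = True) -> float:
    """Bytes per transfer times transfers per second, in GB/s."""
    transfers_per_s = clock_mhz * 1e6 * (2 if ddr else 1)
    return (bus_bits / 8) * transfers_per_s / 1e9

graphics_bus = bandwidth_gb_s(256, 310)  # 9700-class: ~19.8 GB/s
desktop_bus = bandwidth_gb_s(64, 133)    # PC2100 DDR: ~2.1 GB/s

print(round(graphics_bus, 1))  # 19.8
print(round(desktop_bus, 1))   # 2.1
```

Feeding a bus four times as wide at roughly double the clock takes more PCB layers and pricier memory chips, which is exactly the cost structure described above.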
  • nForce (Score:4, Informative)

    by Apreche ( 239272 ) on Wednesday October 30, 2002 @09:30PM (#4570048) Homepage Journal
    Graphics cards are expensive because they don't sell in large quantities. The supply of high-end video cards is low, so the price is high. The supply is low because demand is low. Very few people other than a few gamers per town have a GeForce 4. Most people with Dells and Gateways have whatever old card came in them, and a whole crapload of pre-built machines come with on-board video. For the needs of the vast majority of people, a TNT2 is more than they will ever need.

    My current PC is a Pentium III 450 MHz with a TNT2 32MB video card. I bought this machine when the TNT2 first came out. There have been 4 GeForce cards since then, and the only games that don't run on my computer are the new UT and America's Army. Every other 3D game runs just fine on my machine.
    I plan to buy a new PC soon, so I can encode movies faster and play Doom3. But I'm probably going to buy a motherboard that has the nForce2 chipset. Sure, it's a crummy built-in video card -- but it's a GeForce4 built in. Even though the board probably won't be as fast as a KT400 with DDR400 and a video card in the AGP slot, it's a deal you can't beat: motherboard, sound card, ethernet card, and video card for the price of just a motherboard. I probably won't use the built-in sound card often, but all the operating systems I use fully support having 2 sound cards and using them simultaneously, so I don't see where I can go wrong.
    To read more about the nForce2 chipset, check out NVIDIA [nvidia.com] or AnandTech [anandtech.com].

    It won't make the fastest gaming machine, but it will still make a good enough one, for a low low price.
    • > Graphics cards are expensive because they don't sell in large quantities.

      Yeah, one graphics card per computer never did seem like a large enough quantity to me, either.

      </joke>
    • I'm probably going to buy a motherboard that has the nforce2 chipset...it's a geForce 4 built in.

      Don't get too excited, there. Only the nForce2 IGP has onboard video, and it's a GF4 MX, which is actually a souped up GF2--i.e. it's a DirectX 7 core, not the DX8 core from the GF4 Ti, or even the GF3. Even the NV18 (GF5 MX?) probably isn't going to support DX8.

      That said, I'm planning to use an nForce2 board in my next computer, but I can't decide whether to get an nForce2 IGP based one, or get one with the SPP version and drop some other card in the AGP slot. I'm leaning more towards the IGP, since I've also got an Xbox, but I'll have to see what the prices are like after Christmas.
  • Because people (gamers) are willing to pay that much.
  • One other reason the price is so high is that all the gamers keep buying. If they can sell it at a high price, then they will. The same principle works for the McDonald's on the sides of highways in the middle of nowhere... they can sell their value meals starting at $8 because people will pay that much. Simple supply and demand.
  • Check out http://osdn.pricegrabber.com/ [pricegrabber.com] -- you can usually find amazing deals on there, and since it's part of OSDN I'm sure the slashdot crowd would approve.

  • I was always under the impression that video cards were cheap -- depressingly so if you're a hardware maker. Considering the quick rate of improvement, the complexity of graphics technology, and the fabs required to make the chipsets, they seem pretty cheap compared to, say, CPUs.

    Sure, the top of the line stuff has been $200 for a long time (now $300 or more). But within a 1/2 year, your top card goes down to $100 or less.

    That doesn't seem much for a cutting edge part of your PC.
  • by Raetsel ( 34442 ) on Wednesday October 30, 2002 @10:49PM (#4570486)

    First, the (GeForce 4 Ti | Radeon 9700 | Matrox Parhelia) chip is very complex. The Pentium 4 has about 55 million transistors on it. Compare that with (approximate numbers, of course):
    • 63 Million -- (GeForce 4 Ti)
    • 80 Million -- (Parhelia)
    • 110 Million -- (Radeon 9700)
    Damn hard chips to make, even if they're not running at GHz speeds.

    Now, about that memory... It's at least DDR in most cases (like my GeForce 4 Ti 4200), and runs at much higher speeds than motherboard RAM. 300 MHz (actual!), or "600 MHz DDR" in some cases. That's special stuff -- and expensive.

    You're putting 128 MB of that on an add-in card, as much memory for video as I had in my entire computer last year! (Damn...)

    Now, about those prices. A mid-range P-4 (2.4 GHz, 133 MHz QDR FSB) runs about $190. Top-of-the-line DDR memory isn't that bad, figure $75 for that part.

    190 + 75 = $ 265

    No, I don't think modern video card prices are out of line. As (enthusiasts | gamers) we're on the cutting edge, and it costs to be there.

    The scary part is that I'm very seriously considering an All-in-Wonder Radeon 9700 for the new computer I'm building my wife. I keep waiting to see what becomes of nVidia's NV30... but if I don't see anything by early December, I'm going with ATI.

    God, I'm a nut. Oh well, it drives the economy.

    • You won't be seeing the NV30 until next spring. Last I heard there were (NVIDIA acknowledged) problems with moving to the smaller manufacturing process on the new chip and they are/were getting pretty low yields. ATI made a smart decision (or perhaps just got lucky with timing on their release) by putting this off until the 9700 was out. Now they've got the top dog and plenty of time to milk it while at the same time preparing to make a similar process change on their next part.

      Most are predicting that even when it arrives the NV30 won't be a compelling upgrade over a Radeon 9700. Add in the fact that by the time it comes out ATI should have no problem bringing out a refresh of the 9700 that would at least keep it dead even with NV30 and ATI is sitting pretty.

      This isn't anything too bad or unexpected, though. The same thing happened to Intel for a while. They ran out of steam on the P3 and had to take a back seat performance-wise to the Athlon while they worked on getting their next gen part out. Now they've got the P4 that seems like it can scale to the moon and the current Athlon (XP|MP) series seems to be almost tapped out.

      You should just go ahead and get the Radeon if you're going to be upgrading or building a new system anytime soon. Besides, using your wallet to establish a true competitor to NVIDIA in the high end gaming arena (something they haven't had since well before the end of 3dfx) could only be a good thing down the road. We'll see more and faster innovation (NVIDIA's slip from its legendary six-month product cycle may be partly due to cockiness born of its dominance) and perhaps even lower prices.
  • There are expensive high-end graphics cards because people want them.

    The more money people pour into the market, the faster it will advance, and the better the cards you'll see.

    Now, here's the best part:
    You can buy an amazing value-line or 6-month-old card for a trifle, thanks partly to the drooling game fanatics who absolutely need those extra few frames per second and will pay whatever it takes.

    Look at other markets; they're really not much different.
    Pentium 4 2.8 is about $500.
    Celeron 2.0 is about $110.

    How different are they really?

    Have you ever paid $2000+ for a CD player? Many hi-fi people do.
  • Try being a Mac user (Score:4, Interesting)

    by dr00g911 ( 531736 ) on Thursday October 31, 2002 @02:17AM (#4570978)
    $299 up until 3 weeks ago. It's $229 now that the 9000 is out.

    An ATI Radeon 8500 OEM card for wintel cost $99. Non-OEMs cost about $129.

    The difference?

    Zilch. Zero. Fuck all.

    A sticker, a box, and just a few k on the flash rom. [eon.com.au]

    Not to plead the case of the poor-trod-upon-mac-nazi, but...

    nVidia Geforce 4 Ti dual-head for PC: $199

    nVidia Geforce 4 Ti dual-head for the Mac: $399 (as of today)

    I guess my point is that some of us have it worse than you might imagine.
    • Try being a Mac user

      Well, that is one of the prices that you would expect to pay using a hardware platform that is not as widely in use. If the situation were reversed, and Macs were the dominant computing platform, then the price situation would most likely be reversed as well.

      Not saying you should switch, or being critical, not at all. Just pointing out the factual reality of the platform decisions we make.

      • That wasn't exactly the point I was making.

        I realize that it takes additional money for Mac-side driver/firmware/software development.

        The point is that on all sides vendors are fixing the prices of their products -- (against the idea of 1.5% markup) -- in a lot of cases on the Mac side at 200% the price of the *exact*same*widget* on the wintel side. Far more markup than additional driver development should allow for.

        Hell, CompUSA is just as guilty of this as video card vendors. I can remember clearly when Warcraft II was released, and the Myst trilogy was rereleased. Hybrid Mac & PC in the same box, and the same exact packaging. $15 price difference depending on which "shelf" you bought it off of in the store. They may have had an after-the-fact "Made for Mac" sticker on the box. I don't remember.

        The next time you walk in your neighborhood superstore, do some comparison shopping between the Mac & PC sections for hardware bits (keyboards, mice, hard drives). In a lot of cases, you'll see the original SKU covered by a sticker with the "Mac Pricing" sku/barcode for widgets that are available in two different parts of the store.

        Back to my point -- since the widespread advent of the 'Net, a lot of Mac users ignore the "Mac Sections" of those stores for everything but (2-year-old and "premium" priced) games and either pick up the comparable widget from across the store, or order the stuff online. Because they know better.

        I'd imagine that it's skewing the sales figures pretty heavily and isn't showing a true picture of the platform distribution -- which in turn drives prices up again and gets products discontinued.

        Then (even further afield) you've got products like the Creative Mac Blaster -- which never worked and never will work, but we still paid a premium for.

        Make no mistake: value-add hardware is priced just under the average "threshold of pain" for the target market -- and there's a feeling in the industry that Mac users have even more cash to throw around than die-hard gamers.
        • It's an issue of logistics.

          The hardware is pretty much a constant cost between both platforms.

          Say it takes 2 really good wizards and a full team 3 months to implement the firmware for x86 (VESA mode, basically) and a similar group of other developers to do the Open Firmware code. QA for the two will probably be similar as well.

          So, you've got the same number of dollars going into the code on both sides. But one side is going to be shipped to something on the order of 10x more users. So they can afford to charge a tenth for it and not lose money on the development.

          They can either charge more for the mac version, or they can spread the cost out, which is rather crazy from a business standpoint -- mac is, was, and always will be, a niche market (sad but true).

          Anyway, they overdo it a bit, but you really should expect to pay more for the mac versions of just about anything, because they have to recoup the cost of porting/QAing/etc on the platform.

          (All this is, of course, moot for anything standards-based, like USB devices. the price difference for standards-compliant USB devices makes me laugh =)
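The amortization argument above can be sketched in a few lines of Python. All figures here are hypothetical, chosen only to illustrate the roughly 10x volume gap the comment assumes:

```python
def per_unit_dev_cost(dev_cost: float, units_shipped: int) -> float:
    """Spread a fixed development/QA cost across every unit sold."""
    return dev_cost / units_shipped

# Hypothetical figures: identical firmware/driver effort on both
# platforms, but the PC build ships to ~10x as many users.
DEV_COST = 1_000_000.0  # combined engineering + QA, same for both builds

pc_share = per_unit_dev_cost(DEV_COST, 500_000)  # assumed PC volume
mac_share = per_unit_dev_cost(DEV_COST, 50_000)  # assumed Mac volume

print(f"PC dev cost per card:  ${pc_share:.2f}")   # $2.00
print(f"Mac dev cost per card: ${mac_share:.2f}")  # $20.00
```

The tenfold per-unit difference is exactly the gap the comment describes; whether real Mac markups exceed that recouped cost is the point under dispute above.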
    • The difference?

      Zilch. Zero. Fuck all.


      So stop complaining and buy the PC version, flash the rom, and use it in your mac. I've got a flashed Radeon 8500 and a flashed adaptec SCSI card in my mac. The SCSI card was the real no-brainer. $100 instead of $369, and flashing it for use in a mac is a supported operation. All the tools are available on the Adaptec site and run in MacOS X.

      I think the high price on "Mac" stuff that's clearly identical to PC stuff (Drives, USB stuff, Memory, Many PCI cards) is just a "stupid" tax.
  • You know these cards cost big bucks to develop and tool for. That is what you're paying for when the card is new. When the company begins to pay off its investment, it can lower the price and restart the cycle with another part.

    supply and demand also plays a role in it.

    This is why I'm just now considering buying my GF4 Ti 4200; the price has finally become reasonable, ~$220ish. Then in a few years I'll get my GF6 when that comes down too. The best way to save money is to live off the bleeding edge.

  • by cr0sh ( 43134 ) on Thursday October 31, 2002 @04:57PM (#4573951) Homepage
    ...and bear it.

    I have wondered the same thing myself, on just about every component of a computer system. I have wondered the same thing about laptops - why can't I buy a laptop with a cheap monochrome screen, or with a low-spec processor - and save a ton of money (ie, a laptop for $500). I can't, and there isn't much I can do about it because I am only one person - the market isn't there for such devices.

    What I can do, though, is buy used - instead of trying to stay on the bleeding edge, hang off the trailing edge, and know that it will all trickle down eventually.

    I can still buy just about any 486 or low-end Pentium laptop for pennies on the dollar of what it originally cost. Same with graphics cards, and other parts. Even whole systems are very cheap. Things that I thought I would never be able to afford can now be found for a fraction of what they sold a short while ago.

    For instance, the Spacetec Spaceball - a 3D input controller. Back in the mid-90's, you would have had to pay around $2000.00 for this device. Today, off of Ebay, one can be had for $20-25.00! Even the best model (made by HP, I think) only goes for about $200! Last year I purchased a professional level VR HMD for $250.00 - it used to retail for around $3000.00! I recently purchased a 28-bay CDNet cd server with 14 SCSI drives for $200.00 - not too many years ago these were selling for around $10,000! Finally, we have all seen the "free Cray to whoever will haul it away"-type deals on Ebay and elsewhere - these are super-computers we are talking about, things that ordinary people at one time couldn't even DREAM about owning, but I would bet there are a few people using them now in their basement (while the rest are "making do" with Beowulf style clusters). My work recently gave me a PII-300 and motherboard - not too long ago, do you have any idea what that would have cost me? Here it is being GIVEN to me, otherwise it was TRASH!

    When people throw out or darn near "give" away hardware, why bother staying on the "bleeding edge"? VERY FEW applications even require todays "mid-range" hardware - most people can get by with older equipment no-problem. I suspect that in a few years, unless something "great" comes along that we can't live without, we will see a massive decline in sales of hardware - because most people won't need it or want it. The other thing that makes the older hardware a great thing is that if things keep going like they are, what with DRM, etc - all of that old hardware will be worth $$$$ on the grey/black market - considering that the junk is being given or thrown out now, should such a situation exist with DRM, people might be throwing away a good investment, in a manner of speaking.

    Finally, I leave you with this - think about that $400 video card, think about what it is capable of. Then think about what that power would have cost you back in the mid-1990's. If you don't have a clue, look into the REAL top-of-the-line offerings by companies like Evans and Sutherland (ie, simulator graphics engines) - prepare to break out a cool $10,000 to $20,000 - for the LOW END. Then realize that this power will be available to the consumer for gaming and other tasks a mere 5-10 years from now...

  • I'm a workstation 3D user, and to say that there hasn't been a downward trend in the cost of 3D accelerators is in my mind a mistake. Three, four years ago, you would have been hard pressed to get a workstation-class video card for less than a couple thousand dollars.

    Then - 3D gaming really took off, and the game market began to drive down the cost of 3D acceleration. As a result, I'm now able to run a video card (nVidia Quadro4 750 XGL) that cost $600, which, without a market for faster and faster gaming cards, might still be a $2500 card. Granted - there are still multiple thousand dollar workstation cards out there.

    But if you really look at the performance of what you can get in a video card for less than $100, it's remarkable. The rise of the GPU has totally reshaped the performance of video cards. Sure, you still have to fork out a couple hundred bucks for the latest and greatest. But holy crap. Look at what you get these days with the latest and greatest.

    I guess I just have little sympathy for people complaining about having to pay $150-$200 for some card when I am delighted to be able to get my high end card for $600. Stuff costs.

    back to my little virtual world . . . .
  • Does anybody know about how much laptop graphics cards cost? (like the GeForce2 Go, etc.) I guess that they should be more expensive than desktop PC cards, but how much more?
  • by WIAKywbfatw ( 307557 ) on Thursday October 31, 2002 @08:23PM (#4575336) Journal
    There is more silicon on an average graphics card nowadays (and has been for the last five years) than there is on the average CPU - comparing the price of the two, it's clear that if anything, graphics cards are comparatively underpriced, not overpriced.

    The original poster seems to have made several assumptions and ignored a few basic truths.

    First of all, consoles are of fixed specification - today's PS2s are no more or no less powerful than the first units that shipped over two years ago. This lack of development gives the console makers three advantages:

    1. No further research and development costs.

    Develop a single product one year, reap the benefits for the next five. Try doing that with a graphics chipset. Five years ago, the 3dfx Voodoo chipset was the hottest thing out there; seen any system builders using that graphics engine recently?

    Even the R&D costs for the graphics engine aren't something that the console makers truly have to worry about, as those are borne by the chipset manufacturers - who then fall over each other trying to secure supply agreements with Sony, etc.

    2. Better fabrication and higher yields lead to lower costs.

    Over time, graphics chips (and all other silicon) become cheaper to make, thanks to better fab plants and higher yields. A factory that initially churns out x chips a month might well be churning out 10x chips a month a year down the line, and at little extra cost. A newer factory will improve on that too.

    3. The console makers don't have to worry about compatibility issues.

    Every product that's aimed at the PC market space has to be thoroughly tested with a wide range of hardware and software to make sure that it works error-free. They will even need third party approval before they can ship (Microsoft Windows approved certification, FCC, CE testing, etc). Graphics cards are no exception to this rule.

    Additionally, consoles are a lot easier to support.

    If FIFA 2003 doesn't work in your PS2, then either your CD is faulty or your machine is, and it's not going to take more than five minutes to work out which is the problem. But if FIFA 2003 doesn't work out of the box on your PC then you've got a whole lot of work ahead of you before you can safely say what's at fault. Graphics card (and other hardware and software) manufacturers have to support users in this situation - Sony, etc. do not. Support costs money.

    Also, early adoption has its price. Buy a PS2 as soon as it's launched and you know that it will cost you more than it would a year or even six months later. The same is true of state of the art graphics cards. PS2s cost less than half of what they did originally. But so do the graphics cards that shipped at the same time as the PS2 was launched.

    Lastly, graphics card manufacturers have to turn a profit on every card they make. Console manufacturers don't have that issue - they'll happily lose money on the hardware and make it back on the software you'll be buying.

    If you buy FIFA 2003 on the PC then EA makes money but nVidia, ATi, etc. do not. If you buy FIFA 2003 on the PS2 then Sony, the people who make the console but don't make the game, do make a profit. It's an entirely different business model - something that you've failed to appreciate.
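The subsidy model this comment describes is easy to sketch; the hardware loss and royalty figures below are purely hypothetical:

```python
import math

def breakeven_games(hw_loss_per_console: float, royalty_per_game: float) -> int:
    """Games the platform holder must sell, per console, to recoup
    the loss it takes on the hardware."""
    return math.ceil(hw_loss_per_console / royalty_per_game)

# Hypothetical: each console sold at a $50 loss, offset by a $7
# licensing royalty on every game sold for the platform.
print(breakeven_games(50.0, 7.0))  # 8 games per console
```

A graphics card vendor has no such downstream royalty stream, so the card's sticker price has to carry the whole margin by itself.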

  • As anybody with a good eye for graphics can tell you: there is a difference in quality between one graphics card and another, even if they have the same graphics processor. This is due to differences in other - often analog - parts of the card.

    As we all see colors a bit differently (not just the color blind among us), it is impossible to say absolutely that one card is better than another.

    This creates the climate where branding is important. Sure, the expensive brands tend to use better parts and have slightly better quality on average. But you also pay for the cost of advertising and being first to market.
  • John Carmack
