2D vs 3D Performance in Today's Video Cards?

CliffH asks: "Has anyone else noticed a serious decline in 2D quality versus 3D quality in video cards? I routinely work on older systems right beside newer systems on the same monitor (Dell P1110) and it becomes glaringly obvious to me that 2D quality is starting to take a backseat to 3D quality. For example, my main system is a dual-boot Shuttle XPC SS51G with a GeForce2 MX 400 card added in for the times I do want to play some games. A little, nasty, ready-to-be-thrown-away system I have on my bench at the moment is a K6-2 500 with my favorite card of all time, a Matrox Millennium II 4MB job, in it. The 2D quality between the two is just shocking. Where the Matrox is nice, crisp, extremely easy to read at 1280x1024, the GeForce2 is kind of blurry, not as well defined, and the colors aren't as vibrant. I would be skeptical if this were the only newer card I had seen these results with, but I have seen it across the GeForce line (the last one I tested was an MSI-branded 5900 Ultra) and a small handful of ATI Radeons, with similar results. So, the question stands. Am I going nuts or has there been a definate tradeoff between 2D and 3D quality in recent years?"
  • The money is in improving 3D quality. 2D isn't important to the average end-user any more.
    • OP: Your answer. (Score:5, Informative)

      by Glonoinha ( 587375 ) on Thursday December 11, 2003 @02:23PM (#7691611) Journal
      Actually you are running into a (not quite so) well-known issue with the GeForce cards that has been addressed here:

      How to fix a fuzzy GeForce card [a7vtroubleshooting.com]

      Get out your soldering iron and you can get a crystal clear display on your GF2 while voiding the warranty and pissing off the FCC at the same time.
      • Damn, the link off that site to the real info is dead. Anyone know where else to find it?

      • Unfortunately, the link from the page you list (the one with the actual how-to info) is 404.
        Any idea of a mirror of this, most valuable, information?

        I have a GeForce2 MX that I hadn't considered to be the problem... I thought it was my monitor. I've messed with the contrast and brightness an insane amount in an attempt to get it to look decent, with no luck.

        Thanks!
        • No clue, I just saw it one day and bookmarked it.

          Check Google for geforce sharp fix remove and a few other choice words - I'm sure the info still exists.

          • Tried already, closest I came was this page [maxuk.net] which, unfortunately, lacked the necessary images for my card (GTS, not MX as I mis-stated above). Also tried emailing porotuner@yahoo.com, but it bounced.

            Maybe someone else out there has a copy? (please post 'em!)

            Maybe it's time to switch to a Radeon anyways... the severe instability of the NVidia drivers has left me using the open-source drivers (with no 3D/GL).
    • The average end-user is very concerned with 2D quality. Reading email, browsing the web, creating presentations, writing documentation, etc, etc. All are 2D applications.

      The only people who care about 3D are gamers. They're a minority. A small minority. Much more significant are the non-gamer home users and businesses.
    • Across the board, 2D quality has on average gone up over the last two years. Yes, compare a cheaply manufactured nVIDIA card (like an older GF2MX) to a really high-end older card (like a Matrox) and you'll see a huge difference. But compare a more recent ATI-branded Radeon 9600 or a GeForce FX 5X00 from a good company (Gainward, Leadtek, PNY, etc.) to a Matrox Parhelia, and there's not a whole hell of a lot of difference. Both nVIDIA and ATI have upped the minimum standards the card manufacturers can
      • -Besides, 2D output quality is going to be largely unimportant when we all finally switch to DVI.

        Genuine question about DVI : I have a flat panel display with DVI in and VGA in, two machines with DVI and VGA out, and my current setup is a Linksys two port KVM switch that allows both machines to sit behind a single set of controls (keyboard, mouse, and display via the VGA connection.) My current rig works nicely but the thought of DVI razor sharp images on this display (Dell 18" LCD) is tangibly interestin
        • Are you aware of a splitter (KVM switch, two port, preferably cheap) that uses DVI instead of VGA?

          Do they exist? Yes. Can you get one cheap? Hell no!

          Belkin makes a range of such devices [belkin.com], with USB inputs for keyboard/mouse. The 2-port model MSRPs for $245, and each of the cables (you'll need to buy two) is $80. So that's a whopping $405 + S&H for this solution. You can probably shave $100 off the top by buying from a 3rd-party retailer, but that's still a lot of cash.

          Please note, I have zero experien
  • Monitor (Score:1, Insightful)

    by nempo ( 325296 )
    Are you using the same monitor for both systems? I mean, not just the same model but physically the same monitor?
    • Re:Monitor (Score:2, Informative)

      by mc_barron ( 546164 )
      RTFP:
      "I routinely work on older systems right beside newer systems on the same monitor (Dell P1110) and it becomes blaringly obvious"
      (emphasis added)
    • Physically the same monitor. It has dual inputs (one of the reasons I bought it) so that I can run two systems on the same monitor. Takes up a bit of room on the desk but it does eliminate the need for two monitors.
      • Ah. So, are the inputs both VGA or DVI, or is one VGA and the other DVI? If it's 1 VGA/1 DVI, and your Matrox is a DVI card (I don't know if DVI was available then), then that explains it.
      • Re:Monitor (Score:3, Informative)

        by drsmithy ( 35869 )
        Physically the same monitor. It has dual inputs (one of the reasons I bought it) so that I can run two systems on the same monitor. Takes up a bit of room on the desk but it does eliminate the need for two monitors.

        Try switching the cables around. I had the same problem and assumed it was just the crappy card in the other PC, but it turned out to be the cable.

        Even after swapping, the machine with the Matrox card in it still gave a better picture, but the difference was nowhere near as distinct.

  • "Pretty" Sells (Score:2, Insightful)

    by JonoPlop ( 626887 )
    Unfortunately, 3D is now the main consideration. People benchmark by 3D scores. Graphics card boxes show pictures of 3D games, and the company demos 3D applications of the card. This is simply because they look pretty; imagine showing a picture of a normal desktop on a 3D card, or just showing normal desktop 2D usage at a trade show demo - it wouldn't draw any attention.
    • Re:"Pretty" Sells (Score:3, Insightful)

      by ctr2sprt ( 574731 )
      Er, no, it's not "simply because [3D] look[s] pretty." It's because 3D performance is what people want. I'm sure NVIDIA et al. could make cards that look better than that old Matrox, but I'm just as sure it would add $50 or so to the card's price. Most people, myself included, would just not be willing to pay for it. One solution would be to keep the card's cost constant by reducing 3D performance, shipping it with less RAM, etc., but I hope you can see why that wouldn't work.

      This is not a case of gra

      • Er, no, it's not "simply because [3D] look[s] pretty." It's because 3D performance is what people want.

        Err, no, they don't - at least not the vast majority of people. Most people couldn't even tell if the 3D part of their graphics card didn't work. Well, maybe if they use some fancy screen saver.

        • The "vast majority" you're talking about don't buy video cards, they buy computers with built-in video cards. They are also not the driving force behind graphics technology innovation and improvement. Video card manufacturers make no money off them. They are an utterly uninteresting group when discussing aftermarket or high-performance video cards.
          • Right, they buy computers that come with graphics cards that have 3D acceleration "because most people want it" - nope, they don't care; they usually would rather have decent image quality. This discussion isn't about aftermarket graphics cards, it's about graphics cards, period.
  • Matrox is an outlier (Score:5, Interesting)

    by the quick brown fox ( 681969 ) on Thursday December 11, 2003 @01:39PM (#7691141)
    No, Matrox was just always way out in front in terms of 2D image quality. That was their chief advantage for a while, after everyone else surpassed them in performance.

    I have the impression that within the last 12 or 18 months, 2D image quality has become a priority (maybe a prerequisite) among enthusiasts again. In any case, I recently got an nVidia FX 5200 card (I think the vendor is eVGA) and the quality is superb on my 19" Sony Trinitron--better than the Matrox G400 I used to use.


    • My nVidia MX440 cards are as good as the Matrox G400 cards. I had problems with the Matrox cards dying. Two are sitting on my desk now.

      This looks like the best 2D video card deal now: eVGA.com MX440 [google.com]. I've used the same chipset from other nVidia manufacturers, but I'm getting ready to order the eVGA.

      The key issue is RAMDAC speed, which is 350 MHz for the MX440. That's high enough for crisp 1600 x 1200.
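      As a rough back-of-the-envelope check (a sketch only; the helper below and its ~1.4 blanking factor are illustrative approximations, not exact VESA timings), 350 MHz does leave plenty of headroom at 1600x1200:

        # Approximate analog pixel clock: active pixels per second times a
        # blanking-overhead factor (~1.4 is typical for GTF/VESA CRT timings).
        def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.4):
            return width * height * refresh_hz * blanking / 1e6

        print(approx_pixel_clock_mhz(1600, 1200, 85))   # ~228 MHz, well under 350 MHz
        print(approx_pixel_clock_mhz(1600, 1200, 100))  # ~269 MHz, still within 350 MHz

      Whether the picture is actually crisp then comes down to the analog filters and cabling, as others in the thread point out.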
  • DVI (Score:5, Insightful)

    by zsazsa ( 141679 ) on Thursday December 11, 2003 @01:40PM (#7691155) Homepage
    You're not crazy. If you buy a Matrox Parhelia, it'll look a lot better on a CRT than a GeForce. GeForce boards' analog sections are made to lower quality specifications than Matroxes, hence the cheaper price. If you want crisp 2D on a CRT, you're going to have to pay, just like how you paid for your old Matrox -- I'm sure it wasn't cheap when it was new.

    If you want crispness with GeForces (or Radeons), go DVI with an LCD monitor. Since it's all digital, there'll be no degradation.
    • That is going to be my next step. As soon as I can sell off this 21" I'm going to move to a 17" LCD (again, with dual inputs so I only need one display for my workbench). I'll use the DVI input for the main system and analog for the customer systems. Should work out ok. :)
    • My XFX Geforce 5900 has pretty lousy 2D on the HD15 port but the DVI port using the analog converter supplied with the card looks great, very sharp and crisp.
      • I see the reverse issue with my Radeon 9200 - brilliant display on the VGA connector, shocking on the DVI+adapter secondary output. I've resorted to not using that output at all, and sticking a PCI TNT2 back in to drive my second monitor. The TNT2 looks noticeably worse at times than the Radeon's primary output, but it's a lot better than the secondary output. I may end up paying through the nose for a Matrox card next time I upgrade my graphics card - which probably won't be for a few years - because I have
    • If you want crisp 2D on a CRT, you're going to have to pay, just like how you paid for your old Matrox -- I'm sure it wasn't cheap when it was new.

      The Matrox products are horrendously expensive. The lowest-price 128MB Parhelia I found was CAN$465 (OEM) versus CAN$97 for the lowest-price 128MB GeForce FX. Sure, the Matrox can do triple display, has specialized support/plugins [matrox.com] for various software and undoubtedly better 2D performance but is that worth a few extra hundred bucks? For some, I guess it is.
  • by Fluffy the Cat ( 29157 ) on Thursday December 11, 2003 @01:44PM (#7691194) Homepage
    To a large extent, 2D quality will depend on the quality of the digital-to-analog conversion on the graphics card. Traditionally, Matroxes were very good in this respect - I've no idea if this is still true, but given their target market I'd guess so. Nowadays, with large numbers of different companies making basically identical cards, one avenue for cutting costs (and so looking more appealing than your competitors) is to use cheaper parts in this area, with a corresponding reduction in quality. As a result, it's quite possible that one Radeon may have dreadful 2D quality while an almost identical-looking card using the same chipset from another manufacturer has decent 2D quality.

    Of course, this all becomes much less of an issue once you start using DVI or some other purely digital connection (in laptops, the limiting factor in 2D quality is generally the screen).
  • by jamessan ( 244547 ) <vega...james@@@gmail...com> on Thursday December 11, 2003 @01:45PM (#7691199)
    The newer cards that you mentioned are mainly geared towards 3D performance because their biggest market (the gamers) wants the utmost 3D performance it can get. If you want good 2D performance, you'd probably be better off buying cards geared towards the workstation market (like nVidia's Quadro NVS [nvidia.com] line) instead of the gaming market. Also, as has been mentioned, DVI-capable cards and an LCD will give an improvement.
    • <meToo>I have to agree with the parent. My primary work machine has a Dual display Quadro and has great crispness and good general video performance. It helps when you're staring at .Net / Java code for 10 hrs a day. </meToo>
  • by pagercam2 ( 533686 ) on Thursday December 11, 2003 @01:52PM (#7691275)
    It sounds more like the problem is not 2D but basic signal quality. 2D refers to drawing boxes on screen without the coordinate transforms and shading involved in 3D. Clarity of text is mostly an issue of signal quality between the video card and the display, with the possible exception of some anti-aliasing that helps smooth 3D graphics but blurs text. What matters is the quality of the analog drivers, using the correct signal levels, shielded cables (so that parts of the green signal don't become part of the red, or whichever combination), and proper termination on the signal; without these, severe ghosting can often be seen, with copies of images or text a few inches to the right of the original. There is no reason, other than anti-alias filtering, that text should look any more blurry from a 3D card than from a 2D card. As clock rates get faster, signal quality becomes a bigger issue, and the standard connectors and cables should be improved but have basically remained the same since the '80s. Is your monitor really capable of the refresh rates? Newer cards have improved update rates, but those higher speeds are near the limits of older monitors.
  • by Tom7 ( 102298 ) on Thursday December 11, 2003 @01:53PM (#7691280) Homepage Journal
    Switch to LCD and DVI. The monitor cable you use to connect to your card, if analog, can also make quite a big difference.

    Also, "definite" is not spelled with an 'a.' Think, "finite."
    • That's a good start, but it's not enough.

      I have an 18" 1280x1024 LCD with both analog VGA and DVI inputs. It looked awful with DVI and a GeForce mumble (can't remember the exact model number). It looks really good hooked up via the VGA cable to an old G400.

      • Really? I don't understand how this is possible. With DVI, the graphics card should not affect the picture quality.

        Also, I can't imagine anyone preferring analog video to digital, since you start with a digital signal (the pixels on the screen) and the conversion to analog and then back to digital is always lossy.
        • by PapaZit ( 33585 ) on Thursday December 11, 2003 @04:01PM (#7692630)
          You're right: it's weird, but the difference that I saw was not subtle. It was "unwatchable" vs. "good". Some things that I can think of:

          -Crappy DVI cable

          -Analog VGA is boring, well-known, and well-tested. It's easy to make hardware that cleans up a VGA signal. Perhaps the monitor cleans up the VGA signal, but passes through the DVI signal unadulterated.

          -VGA (like most analog tech) degrades more gracefully. A few missed or altered bits in the DVI signal cause more obvious artifacts.
          • ...but passes through the DVI signal unadulterated.

            Well, that's kind of the point, right? ;)

            Anyway, correctly working DVI-D should always look better than analog. It might be broken, yeah. It may be as another poster suggested: DVI cables can also carry analog signals, and it's possible that your video card fell back to analog because it couldn't support digital DVI at that resolution.
        • by 4of12 ( 97621 ) on Thursday December 11, 2003 @04:27PM (#7692901) Homepage Journal

          I don't understand how this is possible. With DVI, the graphics card should not affect the picture quality.

          Don't forget [directron.com] that a DVI-I connector can piggyback a DVI-A signal - you know, the analog signal that used to be VGA.

          The DVI-D digital part is what you want coming from your video card and being interpreted by your monitor.

          That was one chunklet of information that I needed to learn in my migration to DVI.

          The other important chunk of information was that The One Cool Number was no longer RAMDAC frequency. (I used to run a Viewsonic P815.)

          Now you want a video card capable of high-frequency TMDS to be able to drive high-resolution digital monitors. [xlr8yourmac.com]

          Perhaps these days more video cards can support high resolutions easily, but a couple of years ago I had to carefully look at the video cards to see if they could drive my Samsung 240T (1920x1200).
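          For the curious, a rough sketch of that check (165 MHz is the standard single-link TMDS pixel-clock limit; the total-pixel counts below are approximate CVT figures, used only for illustration):

            # Single-link DVI tops out at a 165 MHz pixel clock. Whether a mode
            # fits depends on the total (active + blanking) pixel rate.
            SINGLE_LINK_LIMIT_MHZ = 165

            def mode_clock_mhz(total_h, total_v, refresh_hz):
                return total_h * total_v * refresh_hz / 1e6

            # 1920x1200@60 with conventional blanking (~2592x1245 total): ~194 MHz, too much.
            print(mode_clock_mhz(2592, 1245, 60) <= SINGLE_LINK_LIMIT_MHZ)  # False
            # Same mode with reduced blanking (~2080x1235 total): ~154 MHz, fits.
            print(mode_clock_mhz(2080, 1235, 60) <= SINGLE_LINK_LIMIT_MHZ)  # True

          That's roughly why high-resolution panels like a 1920x1200 took some care to drive over DVI a couple of years ago.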

    • by Anonymous Coward
      Not if you are a Dell laptop customer.

      They only offer DVI out on their WXGA/WUXGA laptops aimed at the Dmedia and gaming crowd. On the machines that use industry-standard 4:3 panels (like the 1600x1200 I've used for 4+ years), some of which have higher specs -- faster bus, more memory, threaded P4 -- you are hung out to dry: they botched it and provide no DVI output, VGA only. If you want DVI out, you have to find a new LCD that supports the ~1600x1024 res (no, you can't program the video chip to
  • If DVI LCDs are the way to go, then Apple had it right even when they were making CRTs...
    Seriously, go find someone who still has an older G4 tower and a real Apple CRT with the Apple digital connection thingy...

    The image quality on these things is outstanding, mainly because it's just digital, digital, digital right up until the image hits your eyes...

    mike
  • Does anyone even review 2D quality any more? Personally, 2D matters to me a lot more than 3D, but I can't find anywhere that reviews 2D quality. The only places that seriously review cards are gaming sites. They're mostly concerned about how many frames-per-second you get in the first-person-shooter du jour.

    I've taken to visiting people and looking at their video quality. It works okay, but there are a lot of holes in my information. I don't know what a Parhelia looks like because I don't know anyone
    • There is a review of "professional" graphic cards in the latest issue of the German magazine "c't", including detailed data on image quality. I don't know about the computer magazine market in other countries, but I think c't is the only "readable" mag here in Germany (nothing but badly disguised advertising in other mags).
    • Actually, this subject was covered at BYTE.com a year or so ago. They suggest that the ATI boards are better at text than the nVidia ones.
  • Matrox has great 2D (Score:1, Interesting)

    by Anonymous Coward
    When I bought a vid card for my last system, I wasn't really interested in playing games. Most of my stuff just relies on 2D. So I dropped $125 on a Matrox G550. That card had decent 3D and unbelievable 2D. I have since sold that system and the recipient has the same results. Matrox makes fantastic 2D... and I hope they continue. Perhaps they should even enter the mobile market and make some great-looking laptops.
  • Wrong conclusion (Score:5, Informative)

    by NanoGator ( 522640 ) on Thursday December 11, 2003 @02:39PM (#7691792) Homepage Journal
    "The 2D quality between the two is just shocking. Where the Matrox is nice, crisp, extremely easy to read at 1280x1024, the GeForce2 is kind of blurry, not as well defined, and the colors aren't as vibrant."

    Geforce 2 cards were known for having cheap-o filters that weakened the analog signal to the monitor. It has nothing to do with 2D vs 3D quality; you bought a card with cheap parts in it. (I did too, that's why I know this.)

    Recent Geforce Cards are a lot better. I have dual monitors running at 1600 by 1200, quite clear and readable.
  • by Gadzinka ( 256729 ) <rrw@hell.pl> on Thursday December 11, 2003 @02:44PM (#7691840) Journal
    Your observations are quite correct. Today's video cards are total crap when it comes to picture quality.

    There is only one manufacturer with very good picture quality and a bearable price -- Matrox. But their cards are either very slow when it comes to 3D graphics (G400/450/550) or quite expensive (Parhelia), and the Parhelia isn't too fast either.

    ATI (the ATI brand, not the OEM products like powercolor etc) is a little bit worse, but still bearable.

    All NVidia cards are total crap, no matter whether you choose one that's several years old or the top of the line for $400.

    I think the consumers are guilty, because they buy more FPS ignoring actual picture quality. Vendors just give people what they want.

    Robert

    PS. I still use a Matrox G400DH because I spend >12h/d in front of the monitor. I swapped a Matrox G450 for it because the G400 has better-supported TV-out under Linux, so my workstation doubles as a multimedia center connected to my TV set.
    • FYI, ATI branded cards are made by Sapphire Technology, and are pretty much identical to the Sapphire branded cards. The ATI ones just get a different (inferior) cooler, and get put in a different box with a different software bundle.
  • by JonBob ( 556956 ) on Thursday December 11, 2003 @03:09PM (#7692083)
    ...unless you get a dedicated card for the task, like the ATI Radeon 9500 ASC [bbspot.com].
  • When I was last shopping for video cards, I decided to go for an nVidia GF4 Ti4400. I managed to find a review of several different manufacturers' versions (PNY, ASUS, Gainward, etc.) that mostly compared fan noise and VGA signal quality (the 3D performance was all within about 0.5% of each other, since they're all basically copies of the reference design). Evidently the RAMDAC (the part that generates the analog signal) can vary pretty significantly from vendor to vendor. Something to look out for.
  • Refresh Rate? (Score:3, Interesting)

    by djohnsto ( 133220 ) <dan.e.johnston@g ... inus threevowels> on Thursday December 11, 2003 @03:11PM (#7692111) Homepage
    Newer video cards have upped the RAMDAC speed to around 350MHz. That Matrox card probably only supports 200MHz or so. So the question is: what was the refresh rate at 1280x1024? Decreasing the refresh rate often increases the sharpness. Also, as noted elsewhere, a lot of the GeForce line is pretty bad for 2D quality.
    • Re:Refresh Rate? (Score:4, Informative)

      by styrotech ( 136124 ) on Thursday December 11, 2003 @04:16PM (#7692763)
      Newer video cards have upped the RAMDAC speed to around 350MHz. That Matrox card probably only supports 200MHz or so. So the question is: what was the refresh rate at 1280x1024? Decreasing the refresh rate often increases the sharpness. Also, as noted elsewhere, a lot of the GeForce line is pretty bad for 2D quality.

      Haven't you got that backwards? All else being equal, a faster RAMDAC will mean a sharper picture. The higher the frequencies, the closer you can simulate sharp corners. The same way (but going the other way) that sampling at low frequencies gives a poor waveform.

      But you still need quality components and engineering to avoid squandering that advantage. And a lot of video cards do a worse job with a 350MHz RAMDAC than old Matrox cards could do with a 220MHz (from memory) one.

      While faster is better, a faster RAMDAC usually can't make up for crappier components elsewhere.
      • Haven't you got that backwards? All else being equal, a faster RAMDAC will mean a sharper picture. The higher the frequencies, the closer you can simulate sharp corners. The same way (but going the other way) that sampling at low frequencies gives a poor waveform.

        No, I don't. As you increase the refresh rate, more data needs to be sent over the cable, increasing the frequency of the signals sent over the cable. As the frequency increases, the signal is more susceptible to noise, decreasing the fidelity

      • Re:Refresh Rate? (Score:4, Informative)

        by djohnsto ( 133220 ) <dan.e.johnston@g ... inus threevowels> on Thursday December 11, 2003 @05:24PM (#7693698) Homepage
        Haven't you got that backwards? All else being equal, a faster RAMDAC will mean a sharper picture. The higher the frequencies, the closer you can simulate sharp corners. The same way (but going the other way) that sampling at low frequencies gives a poor waveform.

        Ahh, I finally realized how you're thinking ... unfortunately, it's wrong. ;) A RAMDAC is a digital-to-analog converter, therefore it does NOT sample the way you're thinking of it. It will take each pixel, convert the RGB digital values to an analog voltage, and drive the signals out the VGA cable. As the refresh rate increases, these voltages representing individual pixels have to change quicker, increasing the frequency. A faster RAMDAC can therefore drive a higher-frequency signal over the cable. However, the frequency of the signal to the monitor depends only on resolution and refresh rate, not on the maximum speed of the RAMDAC. A higher-speed RAMDAC will let you increase the resolution and refresh rate, not (necessarily) increase the quality of lower-bandwidth modes.
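        To put rough numbers on that (a sketch only; the ~1.35 blanking factor approximates typical CRT timings rather than exact VESA modes), the cable frequency at 1280x1024 is set by the refresh rate in use, not by the RAMDAC's ceiling:

          # Approximate pixel clock for 1280x1024 at several refresh rates. The
          # RAMDAC's maximum (say 350 MHz) only limits which modes are available;
          # it doesn't change the frequency of the mode actually in use.
          BLANKING = 1.35  # rough sync/porch overhead for CRT timings

          for refresh in (60, 75, 85, 100):
              mhz = 1280 * 1024 * refresh * BLANKING / 1e6
              print(f"1280x1024@{refresh} Hz -> ~{mhz:.0f} MHz pixel clock")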

  • I have a Belkin 2-port KVM switch and I noticed the degradation with my old Matrox G400. Same for my Leadtek GeForce4 Ti4200 and ATI Radeon 9800 AIW cards.

    Are there any higher-end KVM switches that are low in price but maintain the clarity?
  • Matrox vs Nvidia is pretty obvious - I've heard Nvidia scrimps on everything but 3D - but how does 3DFX compare on image quality?

    I'm still working on a 3DFX Banshee here (Voodoo 2.5). My only other cards are Voodoo3s, and a card so old that it's unsupported in Linux (but not Win98 (!?)).

    • "I'm still working on a 3DFX Banshee here (voodoo 2.5) My only other cards are voodoo3's, and a card so old that its unsupported in linux(but not win98(!?))."

      Not sure where you get your info, but Linux doesn't really support anything other than the basics (eg: VGA).
      I think you mean XFree86... and yes, the Voodoo3 is well supported [xfree86.org] (it uses the exact same driver as the Banshee, which I also own) and has been for a very long time (note the reference to 3.3.6).
      • Slip of the tongue, I meant X. Also, I said "and a card so old" meaning a card other than the 3DFXs - a Trident or something. It's so old the BIOS screen takes a few seconds to draw :)

        The point of the post was the 2D quality of 3DFX versus Nvidia, btw.
