2D vs 3D Performance in Today's Video Cards?
CliffH asks: "Has anyone else noticed a serious decline in 2D quality versus 3D quality in video cards? I routinely work on older systems right beside newer systems on the same monitor (Dell P1110) and it becomes glaringly obvious to me that 2D quality is starting to take a backseat to 3D quality. For example, my main system is a dual-boot Shuttle XPC SS51G with an added GeForce2MX 400 card for the times I do want to play some games. A nasty little ready-to-be-thrown-away system I have on my bench at the moment is a K6-2 500 with my favorite card of all time, a Matrox Millennium II 4MB job, in it. The 2D quality difference between the two is just shocking. Where the Matrox is nice, crisp, and extremely easy to read at 1280x1024, the GeForce2 is kind of blurry, not as well defined, and the colors aren't as vibrant. I would be skeptical if this were the only newer card I had seen with these results, but it has held across the GeForce line (the last one I tested was an MSI-branded 5900 Ultra) and a small handful of ATI Radeons. So, the question stands. Am I going nuts or has there been a definate tradeoff between 2D and 3D quality in recent years?"
Well-documented over the years (Score:2, Insightful)
OP: Your answer. (Score:5, Informative)
How to fix a fuzzy GeForce card [a7vtroubleshooting.com]
Get out your soldering iron and you can get a crystal-clear display on your GF2 while voiding the warranty and pissing off the FCC at the same time.
Re:OP: Your answer. (Score:1)
Re:OP: Your answer. (Score:2)
Unfortunately, the link from the page you list (the one with the actual how-to info) is 404.
Any idea where to find a mirror of this most valuable information?
I have a GeForce2MX that I'd never considered to be the problem... I thought it was my monitor. I've messed with the contrast and brightness an insane amount in an attempt to get it to look decent, with no luck.
Thanks!
Re:OP: Your answer. (Score:2)
Check Google for "geforce sharp fix remove" and a few other choice words - I'm sure the info still exists.
Re:OP: Your answer. (Score:2)
Already tried; the closest I came was this page [maxuk.net], which unfortunately lacked the necessary images for my card (a GTS, not an MX as I mis-stated above). Also tried emailing porotuner@yahoo.com, but it bounced.
Maybe someone else out there has a copy? (please post 'em!)
Maybe it's time to switch to a Radeon anyway... the severe instability of the NVidia drivers has left me using the open-source drivers (with no 3D/GL).
Re:Well-documented over the years (Score:3, Insightful)
The only people who care about 3D are gamers. They're a minority. A small minority. Much more significant are the non-gamer home users and businesses.
Re:Well-documented over the years (Score:3, Informative)
Re:Well-documented over the years (Score:2)
Genuine question about DVI: I have a flat panel display with DVI in and VGA in, two machines with DVI and VGA out, and my current setup is a Linksys two-port KVM switch that allows both machines to sit behind a single set of controls (keyboard, mouse, and display via the VGA connection). My current rig works nicely, but the thought of razor-sharp DVI images on this display (Dell 18" LCD) is tangibly interesting. Do DVI KVM switches exist?
Re:Well-documented over the years (Score:1)
Do they exist? Yes. Can you get one cheap? Hell no!
Belkin makes a range of such devices [belkin.com], with USB inputs for keyboard/mouse. The 2-port model MSRPs for $245, and the cables (you'll need to buy two) are $80 each. So that's a whopping $405 + S&H for this solution. You can probably shave $100 off the top by buying from a 3rd-party retailer, but that's still a lot of cash.
Please note, I have zero experience with these myself.
Monitor (Score:1, Insightful)
Re:Monitor (Score:2, Informative)
"I routinely work on older systems right beside newer systems on the same monitor (Dell P1110) and it becomes blaringly obvious"
(emphasis added)
Re:Monitor (Score:1)
Re:Monitor (Score:2)
Re:Monitor (Score:3, Informative)
Try switching the cables around. I had the same problem and assumed it was just the crappy card in the other PC, but it turned out to be the cable.
Although even after swapping, the machine with the Matrox card in it still had the better picture; the difference just wasn't anywhere near as distinct.
Re:nuts (Score:1)
"Pretty" Sells (Score:2, Insightful)
Re:"Pretty" Sells (Score:3, Insightful)
This is not a case of gra
Re:"Pretty" Sells (Score:2)
Err, no, they don't - at least not the vast majority of people. Most people couldn't even tell if the 3D part of their graphics card didn't work. Well, maybe if they use some fancy screen saver.
Re:"Pretty" Sells (Score:2)
Re:"Pretty" Sells (Score:2)
Matrox is an outlier (Score:5, Interesting)
I have the impression that within the last 12 or 18 months, 2D image quality has become a priority (maybe a prerequisite) among enthusiasts again. In any case, I recently got an nVidia FX 5200 card (I think the vendor is eVGA) and the quality is superb on my 19" Sony Trinitron--better than the Matrox G400 I used to use.
nVidia MX440 cards are as good as the Matrox G400 (Score:3, Informative)
My nVidia MX440 cards are as good as the Matrox G400 cards. I had problems with the Matrox cards dying, though. Two dead ones are sitting on my desk right now.
This looks like the best 2D video card deal right now: eVGA.com MX440 [google.com]. I use the same chipset from other nVidia board makers, but I'm getting ready to order from eVGA.
The key issue is RAMDAC speed, which is 350 MHz for the MX440. That's high enough for crisp 1600 x 1200.
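If you want to sanity-check that claim, the arithmetic is easy (a rough sketch -- the 1.32x blanking-overhead factor is a rule-of-thumb assumption, not an exact VESA timing):

    # Rough pixel-clock estimate for an analog VGA mode.
    # The 1.32 factor approximates horizontal/vertical blanking
    # overhead; exact numbers come from VESA GTF/CVT timings.
    BLANKING_OVERHEAD = 1.32

    def required_ramdac_mhz(width, height, refresh_hz):
        """Approximate pixel clock (MHz) the RAMDAC must sustain."""
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    # 1600x1200 at 85 Hz works out to roughly 215 MHz -- comfortable
    # headroom under the MX440's 350 MHz RAMDAC.
    print(round(required_ramdac_mhz(1600, 1200, 85)))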
DVI (Score:5, Insightful)
If you want crispness with GeForces (or Radeons), go DVI with an LCD monitor. Since it's all digital, there'll be no degradation.
Re:DVI (Score:1)
Re:DVI (Score:2)
Re:DVI (Score:1)
Parhelia vs. GeForce (Score:2)
The Matrox products are horrendously expensive. The lowest-priced 128MB Parhelia I found was CAN$465 (OEM) versus CAN$97 for the lowest-priced 128MB GeForce FX. Sure, the Matrox can do triple display, has specialized support/plugins [matrox.com] for various software, and undoubtedly has better 2D performance, but is that worth a few hundred extra bucks? For some, I guess it is.
Different manufacturers (Score:5, Informative)
Of course, this all becomes much less of an issue once you start using DVI or some other purely digital connection (in laptops, the limiting factor in 2D quality is generally the screen itself).
Change in Market Focus (Score:4, Informative)
Re:Change in Market Focus (Score:2)
Analog signal quality (Score:5, Informative)
Use an LCD for crisp 2D. (Score:5, Interesting)
Also, "definite" is not spelled with an 'a.' Think, "finite."
Re:Use an LCD for crisp 2D. (Score:2)
I have an 18" 1280x1024 LCD with both analog VGA and DVI inputs. It looked awful with DVI and a GeForce mumble (can't remember the exact model number). It looks really good hooked up via the VGA cable to an old G400.
Re:Use an LCD for crisp 2D. (Score:2)
Really? I don't understand how this is possible. With DVI, the graphics card should not affect the picture quality.
Also, I can't imagine anyone preferring analog video to digital, since you start with a digital signal (the pixel values in the framebuffer), and the conversion to analog and then back to digital is always lossy.
Re:Use an LCD for crisp 2D. (Score:4, Interesting)
-Crappy DVI cable
-Analog VGA is boring, well-known, and well-tested. It's easy to make hardware that cleans up a VGA signal. Perhaps the monitor cleans up the VGA signal but passes through the DVI signal unadulterated.
-VGA (like most analog tech) degrades more gracefully. A few missed or altered bits in the DVI signal cause more obvious artifacts.
Re:Use an LCD for crisp 2D. (Score:1)
Well, that's kind of the point, right?
Anyway, correctly working DVI-D should always look better than analog. It might be broken, yeah. It may be as another poster suggested: DVI cables can also carry analog signals, and it's possible that your video card fell back to analog because it couldn't support digital DVI at that resolution.
Re:Use an LCD for crisp 2D. (Score:4, Interesting)
I don't understand how this is possible. With DVI, the graphics card should not affect the picture quality.
Don't forget [directron.com] that a DVI-I connector can piggyback DVI-A -- you know, the analog signal that used to be VGA.
The DVI-D digital part is what you want coming from your video card and being interpreted by your monitor.
That was one chunklet of information that I needed to learn in my migration to DVI.
The other important chunk of information was that The One Cool Number was no longer RAMDAC frequency. (I used to run a Viewsonic P815.)
Now you want a video card capable of high frequency TMDS to be able to drive high resolution digital monitors. [xlr8yourmac.com]
Perhaps these days more video cards can support high resolutions easily, but a couple of years ago I had to carefully look at the video cards to see if they could drive my Samsung 240T (1920x1200).
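For the curious, the check is simple to reproduce (a sketch, assuming the standard 165 MHz single-link DVI TMDS limit; the ~1.15x reduced-blanking overhead is an approximation):

    # Single-link DVI tops out at a 165 MHz TMDS pixel clock;
    # beyond that you need a dual-link card and cable.
    SINGLE_LINK_DVI_MHZ = 165

    def tmds_clock_mhz(width, height, refresh_hz, blanking=1.15):
        """Approximate TMDS pixel clock with reduced-blanking overhead."""
        return width * height * refresh_hz * blanking / 1e6

    clock = tmds_clock_mhz(1920, 1200, 60)
    verdict = "fits single-link" if clock <= SINGLE_LINK_DVI_MHZ else "needs dual-link"
    print(f"1920x1200@60: ~{clock:.0f} MHz, {verdict}")  # ~159 MHz, fits single-link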
Re:Use an LCD for crisp 2D. (Score:1)
Okay, I meant DVI-D. Using DVI for analog is pretty sad.
Re:Use an LCD for crisp 2D. (laptop trends) (Score:1, Insightful)
They only offer DVI out on their WXGA/WUXGA laptops aimed at the digital-media and gaming crowd. On their machines that use industry-standard 4:3 panels (like the 1600x1200 I've used for 4+ years), you are hung out to dry -- on those laptops (some of which have higher specs -- faster bus, more memory, Hyper-Threaded P4), they botched it and provide no DVI output -- VGA only. If you want DVI out, you have to find a new LCD that supports the ~1600x1024 res (no, you can't program the video chip to drive anything else over it).
Apple CRTs, for comparison (Score:1)
seriously, go find someone who still has an older G4 tower and a real Apple CRT with the Apple digital connection thingy...
the image quality on these things is outstanding, mainly because it's just digital, digital, digital right up until the images hit your eyes...
mike
Re:Apple CRTs, for comparison (Score:3, Insightful)
Uh. How is a cathode ray being steered by a magnetic field digital?
Re:Apple CRTs, for comparison (Score:1)
the magnificent magnetic field steers the pesky photons by flicking its middle digit at them.
How to look at 2D quality? (Score:2)
I've taken to visiting people and looking at their video quality. It works okay, but there are a lot of holes in my information. I don't know what a Parhelia looks like because I don't know anyone who owns one.
Re:How to look at 2D quality? (Score:1)
Re:How to look at 2D quality? (Score:2)
Matrox has great 2D (Score:1, Interesting)
Wrong conclusion (Score:5, Informative)
GeForce2 cards were known for having cheap-o filters that weakened the analog signal to the monitor. It has nothing to do with 2D vs. 3D quality; you bought a card with cheap parts in it. (I did too; that's how I know this.)
Recent GeForce cards are a lot better. I have dual monitors running at 1600x1200, quite clear and readable.
That's quite the norm. (Score:3, Interesting)
There is only one manufacturer with very good picture quality and a bearable price -- Matrox. But their cards are either very slow when it comes to 3D graphics (G400/450/550) or quite expensive (Parhelia) -- and the Parhelia isn't too fast either.
ATI (the ATI brand, not OEM products like PowerColor etc.) is a little bit worse, but still bearable.
All NVidia cards are total crap, no matter whether you choose one several years old or the $400 top of the line.
I think the consumers are to blame, because they buy more FPS while ignoring actual picture quality. Vendors just give people what they want.
Robert
PS. I still use a Matrox G400DH because I spend >12h/day in front of the monitor. I swapped a G450 for it because the G400's TV-out is better supported under Linux, so my workstation doubles as a multimedia center connected to my TV set.
Built-By-ATI == Sapphire (Score:3, Informative)
ASCII performance is even worse... (Score:3, Funny)
it depends (Score:1)
Refresh Rate? (Score:3, Interesting)
Re:Refresh Rate? (Score:4, Informative)
Haven't you got that backwards? All else being equal, a faster RAMDAC will mean a sharper picture. The higher the frequencies, the more closely you can approximate sharp corners. The same way (but going the other way) that sampling at low frequencies gives a poor waveform.
But you still need quality components and engineering to avoid squandering that advantage. A lot of video cards do a worse job with a 350MHz RAMDAC than old Matrox cards could do with a 220MHz (from memory) one.
While faster is better, RAMDAC speed can't substitute for quality parts around it.
Re:Refresh Rate? (Score:2)
No, I don't. As you increase the refresh rate, more data needs to be sent over the cable, increasing the frequency of the signals sent over it. As the frequency increases, the signal is more susceptible to noise, decreasing the fidelity.
Re:Refresh Rate? (Score:4, Informative)
Ahh, I finally realized how you're thinking... unfortunately, it's wrong. ;) A RAMDAC is a digital-to-analog converter, therefore it does NOT sample the way you're thinking of it. It takes each pixel, converts the RGB digital values to an analog voltage, and drives the signals out the VGA cable. As the refresh rate increases, these voltages representing individual pixels have to change more quickly, increasing the frequency. A faster RAMDAC can therefore drive a higher-frequency signal over the cable. However, the frequency of the signal to the monitor depends only on resolution and refresh rate, not on the maximum speed of the RAMDAC. A higher-speed RAMDAC will let you increase the resolution and refresh rate, not (necessarily) increase the quality of lower-bandwidth modes.
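To make the dependence concrete (a sketch; the 1.32x blanking factor is an assumed rule of thumb, not an exact VESA timing):

    # The signal frequency on the cable depends only on resolution
    # and refresh rate; the RAMDAC's rating just sets the ceiling.
    def pixel_clock_mhz(width, height, refresh_hz, blanking=1.32):
        return width * height * refresh_hz * blanking / 1e6

    for hz in (60, 75, 85, 100):
        print(f"1280x1024@{hz} Hz: ~{pixel_clock_mhz(1280, 1024, hz):.0f} MHz")
    # Climbs from ~104 MHz at 60 Hz to ~173 MHz at 100 Hz -- the mode,
    # not the RAMDAC's 350 MHz rating, sets the frequency on the wire.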
Re:The solution (Score:1)
Also, KVM switches... (Score:2)
Are there any reasonably priced higher-end KVM switches that maintain the clarity?
3DFX (Score:1)
I'm still working on a 3DFX Banshee here (Voodoo 2.5). My only other cards are Voodoo3s, and a card so old that it's unsupported in Linux (but not Win98(!?)).
Re:3DFX (Score:2)
"I'm still working on a 3DFX Banshee here (voodoo 2.5) My only other cards are voodoo3's, and a card so old that its unsupported in linux(but not win98(!?))."
Not sure where you get your info, but Linux itself doesn't really support anything other than the basics (e.g., VGA).
I think you mean XFree86... and yes, the Voodoo3 is well supported [xfree86.org] (it uses the exact same driver as the Banshee, which I also own) and has been for a very long time (note the reference to 3.3.6).
Re:3DFX (Score:1)
The point of the post was the 2D quality of 3DFX versus Nvidia, btw.