Technology

High Resolution DVI Support for Plasma Displays?

spongman asks: "I'm trying to find the best way to connect a computer to a 50" (or larger) plasma display. The display I'm currently looking at is the NEC 50MP2 because its native resolution (1366x768) is high enough to meet my needs and it can display a 16:9 image with square pixels without scaling, but I'm open to suggestions for similarly-capable displays. I also want to use a DVI connection between the computer and the display to reduce interference and noise. The problem I'm having is that I can't work out which video cards support this resolution (or something near it) over a DVI connection. The only card I've found that seems to support this is the PixelPerfect from Imagine Graphics in the UK, but it's based on somewhat old technology (Kyro2) and I'd like a few more choices if possible. Does anyone have experience getting their video card connected to a plasma display over DVI at native resolution?"
  • by Anonymous Coward
    That is the only combination I know that will use all of an HDTV's resolution. And it has to be an 8500; the lower-end versions won't cut it.
    • AUTHOR addendum (Score:4, Informative)

      by spongman ( 182339 ) on Saturday August 10, 2002 @06:17PM (#4047772)
      First, thanks for all your replies; I've gotten some good information.

      I should have been a bit more specific with my question.

      Firstly, I want to use the DVI input on the display because I want to get the highest fidelity possible. I also want to be able to use the native resolution of the screen because I don't want the screen to have to scale the video image. The problem with using the DVI interface is that the DVI spec allows the screen to tell the video card which DVI resolutions it supports; even though the screen may have a native resolution of 1366x768 (16:9), it may not advertise that mode to the card, and even if it does, the driver or the card may not support that mode.

      I know that most DVI-capable video cards will support video modes larger than that of the screen (1600x1200, for example), but as far as I know the selection is limited when the DVI interface is being used.

      I know of the PowerStrip utility and its ability to create custom resolutions, but that doesn't necessarily mean the display is capable of being driven at those resolutions.
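
      For what it's worth, here is a rough sketch (added for illustration, not part of the original question) of the mechanism involved: the display reports the modes it supports in a 128-byte EDID block over the DVI connector's DDC lines, and the detailed timing descriptors in that block are where a native mode like 1366x768 would (or would not) be advertised. Assuming you can dump the EDID to a file with whatever tool your OS or driver provides, a small parser along these lines will list the advertised detailed modes:

        import struct, sys

        def detailed_timings(edid):
            """Yield (h_active, v_active, refresh_hz) for each detailed timing descriptor."""
            for offset in (54, 72, 90, 108):      # four 18-byte descriptors in the base EDID block
                d = edid[offset:offset + 18]
                clock_10khz = struct.unpack("<H", d[0:2])[0]
                if clock_10khz == 0:              # a pixel clock of 0 means this isn't a timing descriptor
                    continue
                h_active = d[2] | ((d[4] & 0xF0) << 4)
                h_blank  = d[3] | ((d[4] & 0x0F) << 8)
                v_active = d[5] | ((d[7] & 0xF0) << 4)
                v_blank  = d[6] | ((d[7] & 0x0F) << 8)
                pixel_clock = clock_10khz * 10000
                refresh = float(pixel_clock) / ((h_active + h_blank) * (v_active + v_blank))
                yield h_active, v_active, refresh

        if __name__ == "__main__":
            edid = open(sys.argv[1], "rb").read()   # path to a raw EDID dump (however you obtained it)
            for h, v, hz in detailed_timings(edid):
                print("%dx%d @ %.1f Hz" % (h, v, hz))

      If 1366x768 doesn't show up in that list, the card and driver have nothing to negotiate with, which is exactly the failure mode described above.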

  • Purpose? (Score:3, Insightful)

    by saintlupus ( 227599 ) on Friday August 09, 2002 @11:52PM (#4043996)
    I suppose which card would be good depends on what the purpose of this is -- specifically, are you going to need killer 3D performance out of this display, or is it going to be a home theatre PC sort of setup?

    You might want to take a look at the Matrox cards if you don't need stunning 3d - my G450 supports a wide range of high resolutions, and it is available with DVI-out.

    --saint
    • Re:Purpose? (Score:3, Informative)

      If you use the Matrox Technical Support Tweak Utility [matrox.com] you can choose the horizontal resolution in steps of 8 pixels and the vertical in steps of 2 pixels. Unfortunately 1366 isn't on the list; the nearest choice is 1368 (though PowerStrip does no better: it has the exact same choice of resolutions with the G450. I wonder if the list changes for different cards, though).

      (Not quite what's asked for here, but worth a mention anyway: the dual-head 'DVDMax' output in the Matrox Windows drivers, which displays a video overlay full-screen on a second monitor, is absolutely excellent, and works with the video window in the background: quite useful for displaying video on a larger monitor while using a smaller one to operate the computer, which is ideal for certain residents of Betelgeuse 5).

      • Unfortunately 1366 isn't on the list, the nearest choice is 1368

        That's a hardware issue. This same hardware issue most likely applies to the screen though. Thus the screen he's looking at is most likely 1368 or 1384 pixels wide...

        Oh, and it is no problem if you tell the computer to display two pixels more than fit on the screen. It's not like you'll suddenly get horribly bad quality or something...

        Roger.
  • Mine has a DVI connector on it. Don't know about your specific resolution, but mine supports resolutions up to 2048 x something.
    • Virtually every videocard sold today has a DVI connector (usually via a dongle), so this seems to be a rather odd Ask Slashdot. Of course the resolution being requested (~1366x768) is well within the range of any semi-modern video card, again making one wonder why this was accepted as an Ask Slashdot. I suspect someone wants to gloat about their 50" plasma screen. :-)
      • Well ... perhaps we need a 'Slashdot Show and Tell' category. In fact, I think that it would probably be more interesting than DMCA-story-of-the-day.
        • Ulch - that meat was tainted! You feel deathly sick.

          That really rings a bell...Zork?
          • Aha... was just thinking about it and it's Nethack, right? I remember when a local BBS got a gopher net connection, and I gophered to a site that would email you files: I got Nethack as quite a few emails that I had to manually cat together and uudecode. Ah, the fun old times.

            Though, I could never understand why it was called "Net"hack when it seemed to be entirely single player.
            • You're right; it's Nethack. I saw that message more times than I can count when I was first starting out. (I can proudly say, though, that I haven't died from food poisoning in quite some time.)


              Oh, and the original game was 'Hack'. The name was changed to 'Nethack' because (IIRC) it was developed over the internet. (And that was quite unusual for the time.) Of course, there are several public Nethack servers around, where you can play and find the ghosts of dead explorers, so it's kinda 'net playable' in that manner.

      • Re:Geforce 4 Ti 4600 (Score:3, Informative)

        by florin ( 2243 )
        Actually, some semi-modern video cards like the Matrox G550 and older versions have a limitation of 1280x1024 for their DVI output. There's a story one step up that talks about a tweak utility which sounds like it might circumvent this, but with the normal driver settings it is not possible to go higher. The GeForce 4 and possibly the ATI 8500 do not have that problem.
  • Huh? (Score:3, Funny)

    by cascino ( 454769 ) on Friday August 09, 2002 @11:53PM (#4044000) Homepage
    from the displays-of-insane-resolution dept.
    ...because its native resolution (1366x768) is high enough to meet my needs...
    Huh? Insane?
    • I think here, "insane" is to be interpreted as "not used by ANYONE".
      • I think a lot of widescreen video projectors and plasma displays use resolutions in that general area.

        It isn't "standard" in the sense that you'd normally see it as an offered option on most computers; as mentioned elsewhere, you can get the video software called PowerStrip to customize your resolution.
    • I think the insane part comes in paying that much for what amounts to a 50-inch computer display -- not just due to the cost, of course, but that in combination with a maximum resolution being THAT LOW.

      I'm pretty sure 1366x768 on a 50" monitor would look very close to 640x480 on a 21" monitor. Granted, you can fit more on a 1366x768 display, but, for chrissake, I run at 1280x1024 and STILL get cramped for room when trying to get something meaningful done.

      But, by the same token, I know some vision-challenged individuals (in dire need of new glasses, I must add) who routinely run 640x480 on 21" monitors. This is the same group of people who put forth a nonstop litany of complaints about UI being too big and fonts being too small (!!) based on whatever it is they're doing (or trying to do, at least).
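
      As a rough check of that comparison (arithmetic added for illustration, not the parent poster's), the pixel densities work out like this:

        from math import hypot

        def ppi(h_pixels, v_pixels, diagonal_inches):
            """Pixels per inch, from the pixel grid and the diagonal size."""
            return hypot(h_pixels, v_pixels) / diagonal_inches

        print('50" 1366x768 plasma: ~%.0f ppi' % ppi(1366, 768, 50))
        print('21" 640x480 CRT:     ~%.0f ppi' % ppi(640, 480, 21))
        print('21" 1280x1024 CRT:   ~%.0f ppi' % ppi(1280, 1024, 21))

      That gives roughly 31, 38 and 78 pixels per inch respectively, so the 50" panel is, if anything, slightly coarser than 640x480 on a 21" tube.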
  • I'm interested in how this will turn out. My question is: why plasma as opposed to a huge CRT? Is it the flat screen? I thought CRT was better quality anyway.
    • Re:hmm.. (Score:2, Informative)

      by ergo98 ( 9391 )
      A 50" CRT? I don't believe such a thing is even made, and if it were the weight would be monstrous, not to mention that it'd be huge not only diagonally, but depthwise, consuming a huge portion of the room.

      Plasma screens, on the other hand, can be made 3.8" deep [marketware-tech.com], and the power consumption (and hence heat dissipation) of plasma (& LCD) screens is dramatically lower. Most LCDs and plasmas offer much better contrast than CRTs, and the only real critique of them is ghosting in some lower-cost models, but that's mostly a complaint of yesteryear.
      • And if I'm not mistaken less / no radiation from the screen :)
        • Yup, very true. Indeed, perhaps I'm misreading, but the screen that I linked to seems to only consume about 8W...I find it hard to believe it could be that efficient.

          Electric Current Consumption 0.8A (DC9.3V)
      • Not to mention that even the best CRT TVs _CANNOT_ display a horizontal resolution beyond 800-900 pixels. Plasma screens have much better resolution than CRT TVs.

        Actually, there is a CRT TV made by Sony, that does go beyond 900 pixels. It's worth nearly $40k USD. Definitely not a consumer product.

        http://bssc.sel.sony.com/Professional/webapp/ModelInfo?p=8&sp=20073&id=57474

      • I saw a 50-inch CRT at an old museum (they were showing a movie on it or something). The thing was massive; it had to be at least 5-6 feet long (from electron gun to front of screen).
    • CRT is better quality than LCD, for sure...
      But plasma?

      I've been seeing a plasma screen at LG every day for the good part of a year.

      It's the most stunning video quality I've EVER seen, anywhere. Crisp, accurate, high-res, wicked contrast, you can see it from a mile away in the mall, insane viewing angle.

      It costs about $30,000 though.

      If I had to guess, I'd say that a plasma screen can probably be made to be superior to a CRT.
      Not cheaply.
  • I'm trying to find the best way to connect a computer to a 50" (or larger) plasma display.


    That's easy ... plasma conduit.
  • Similar Dilemma (Score:3, Informative)

    by raiyu ( 573147 ) <raiyu.raiyu@com> on Saturday August 10, 2002 @12:00AM (#4044015) Homepage
    I had a similar problem when I originally got my SGI 1600sw flat panel LCD [sgi.com] a few years back. It ran at 1600x1024 native, and to accommodate this nonstandard (at the time) resolution you were forced into using one of their "pre-approved" graphics cards. At the time they were fairly decent, but eventually they got old and there was no way to upgrade: since the default connection was OpenLDI digital (not DVI), you had to buy an adapter (usually out of stock and $600 retail) or get a new monitor.

    If there are enough monitors made at 16:9 instead of 16:10, in time 16:9 resolutions will be standardized. So if you plan to use this monitor for a few years, I think eventually you won't have any graphics card problems. But until then, I would say the GeForce cards are very accommodating of non-standard resolutions. I can't say much for 16:9, but I know that all VisionTek GeForce2+ cards, and certainly 4+ cards, support 16:10 resolutions. The GeForce2 was one of the few cards that supported 16:10 back when there were only a handful of monitors running at that resolution.

    If you can't find anything that runs 16:9 and that monitor can rescale to 16:10, then I think a decent GeForce4 won't be much of a compromise. Plus, you should be able to hook up your DVD player directly to it and still have it hooked up to your PC, so you can still watch your movies at 16:9.

    My next monitor is the SGI F220 [sgi.com], which I'm ordering next week, and lucky me, I can get a GeForce4 to render; now I just have to find out if it's compatible with non-SGI systems. ^_^
    • Hmmm... I've seen bigger and better quality LCD panels from Samsung, and probably cheaper too. Check out the Samsung SM181T, SM191T and SM210T. These screens have some great features such as 250cd/m2 brightness, 500:1 contrast, 170/170 (H/V) viewing angle, 25ms response time, analog & DVI-D digital connectors, etc... Make sure your new LCD screen has a good contrast rating and a good response time (if you intend to play games on it).
    • I don't know what you'll be paying for the SGI display, but since it hasn't exactly been the budget brand earlier I think the similar-spec Apple displays might be a cheaper alternative.

      In short, Apple has a 1600x1024@22" display for $2499 and a 1920x1200@23" display for $3499.

      Apple has a custom display connector for which you'll need a $149 adapter, but the proprietary format is quite useful: a single cable carries power, DVI and USB to the display. This means that you can hide a noisy six-fan Athlon beast in your closet (up to 15ft away with an extension) and have only one cable coming to the desk, with your keyboard, mouse and speakers attached to the display. Neat!
      • The SGI F220 is the next generation of the 22" Apple Cinema Display. I'll leave the details as an exercise for the reader.

        One of the cool things about the F220 is that it has VGA, DVI, S-Video, and composite video inputs. You can switch between them from the front panel, or with the handy remote control. It even has picture-in-picture.

        So it's not the cheapest, but it does have some extra features some users would be interested in.
  • Do a search for "powerstrip" on download.com. For certain, you can use an ATI 8500 or any GeForce 4.

    (Lots of other cards will work, but these are the only ones I can personally attest to.)

  • ATI sells a DVI -> component video adapter for their cards (it may work with other brands too, but I have no idea) that claims to support 480i/p, 720p, and 1080i. I doubt that most cards could output an interlaced signal natively for any resolution, but it would make sense that the ATI cards could output the 720p signal directly through the DVI connector without the adapter in place.
  • Try Matrox (Score:3, Informative)

    by -Surak- ( 31268 ) on Saturday August 10, 2002 @12:04AM (#4044027)
    I'm not sure about your specific application, but Matrox [matrox.com] has always been pretty progressive with DVI support. I think they have at least one model of the G400 series with two DVI ports, and I think the new Parhelia card has three.
    • Nope - the Parhelia only has two video outputs: one DVI and one analog/VGA, with a splitter cable for the VGA port. That 'triple head' is only available in Windows (atm), because it's done in software.
  • If I remember correctly, some ATI cards support DVI up to either 1280x768 or 1280x1024, can't remember which though; unfortunately, I'm still stuck in the analog world :(

    Reece,
    • Wait, I checked, it can go higher; I think that's just the limit of the monitor that was attached to it. Sorry, Reece,
  • save yourself a couple grand and get a wall projector
  • My ATI All-In-Wonder Rage 128 Pro supports the 16:9 aspect ratio; however, it doesn't have DVI. You could use it plus a DVIator from Dr. Bott's ...
  • Powerstrip (Score:5, Informative)

    by Devil's BSD ( 562630 ) on Saturday August 10, 2002 @12:11AM (#4044051) Homepage
    I use PowerStrip [entechtaiwan.com] to control my video card. If you get a card with DVI out, this program should support it. It supports just about any card under any OS, too. In the Display Configuration, you should be able to configure custom resolutions. One of the presets is already 1360x768; a few more clicks should get you to 1366x768.
    • 1360x768 is otherwise known as "Wide XGA", and it would be the computer "native" resolution for that screen.
  • Almost anything new (and decent) these days will support up to 1600x1200 over the DVI interface (e.g. the GeForce4). Watch out though: the card's maximum resolution is often not supported over DVI.

    Using Powerstrip (etc) as others have mentioned should get you the native screen resolution.

    Lucky guy... The NEC can do P-I-P, so you can, e.g., watch TV and the computer at the same time. I've dreamed of attaching two of the Panasonic 52" panels, but I cannot justify spending $25,000 on monitors.
  • when the website says:

    PC Signal Compatibility:
    VGA 640x480 @ 60, 72, 75, 85, 100, 120 Hz
    SVGA 800x600 @ 56, 60, 72, 75, 85, 100, 120 Hz
    XGA 1024x768 @ 60, 70, 75, 85, 100 Hz
    SXGA 1280x1024 @ 60, 75, 85 Hz
    UXGA 1600x1200 @ 60, 65, 70, 75 Hz
    WideVGA 848x480, 852x480 @ 60 Hz
    WideXGA 1360x768 @ 60 Hz

    Macintosh Compatibility:
    640x480, 832x624, 1024x768, 1152x870

    can't you just settle for a slightly different res?
  • Don't forget about digital projectors. I was at a home movie night not too long ago where someone brought one from work. What a neat toy! We had a screen set up a one end of the living room with an absolutely huge, crisp picture. I was really impressed.

    Then when it was over, we folded up the screen and put the projector in its cabinet, and all that space was reclaimed. If you had a permanent projector mounted strategically, and one of those automatic screens that roll up into the ceiling, it would be pretty sweet.

    Just throwing random ideas at ya...

  • Some pointers (Score:5, Informative)

    by x mani x ( 21412 ) <<ac.lligcm.sc> <ta> <esahgm>> on Saturday August 10, 2002 @12:45AM (#4044192) Homepage
    First, to answer your question:

    Any modern ATI or Nvidia card should work just fine. Plasma displays are very picky about being fed their exact resolution, so use a program like PowerStrip to make sure Windows starts up with the exact resolution and refresh rate your plasma monitor requires.

    If you haven't bought a plasma display yet, then I recommend you think twice about getting it. There are some really low-cost monitors out there that can interface pretty well with a PC. Take, for example, the JVC AV-48WP30: at around $1,700 you can have a 48" HDTV that supports DVI(*). People are using this TV with their PCs at 1280x720 or 1920x540. There are also new 42" (HLM427W, I believe) and 50" (HLM507W) Samsung HDTVs that support DVI and are based on badass DLP technology (I heard this set is particularly sharp when connected to a PC). These Samsung DLPs are MUCH cheaper than other comparable sets, something like $3,000-$4,000.

    Note, however, that while the theoretical HDTV resolution is 1920x1080i, very, very few HDTVs can display a discernible pixel grid at this resolution. Still, the price difference between a modern rear-projection HDTV and a plasma monitor is significant (you can buy a decent used car with the money you save).

    Here are some very helpful links, I used them extensively when I was shopping around for a new set:

    AV Science Forum [avsforum.com]: great forum with lots of very knowledgeable people. Many of them are into using displays like plasmas/HDTV's with their PC's.

    Home Theater Spot [hometheaterspot.com]: similar to the above, different layout. Another great, helpful site.

    (*) Regarding these DVI connectors - yes, these are the new DVI connections used to transfer encrypted data to prevent people from copying future HD broadcasts. It is often documented that you can't use this DVI interface with your computer's DVI out, but more often than not this is not true and it will work just fine. However, ask around on the above sites about your particular DVI TV before buying an expensive DVI cable. :)
    • Beware the GeForce2 (and probably GeForce4MX) and most GeForce cards when it comes to high resolution DVI. Most GeForce3 cards seem to do a fine job up to at least 1600x1024. GeForce2's and 1's need to have an external Silicon Image chip in order to support high resolution (over 1024x768) DVI. Just check the board for a chip with a Silicon Image logo on it, and it should be able to handle higher DVI resolutions. YMMV.
    • Another recommendation for http://www.avsforum.com/. Especially check out the HTPC (Home Theater PC) group and the Plasma group.

      Also, I don't think anyone has mentioned anything about burn-in. With plasma, you have to take precautions to prevent this, especially if you are planning to use it with lots of static images. I have a 42" Panasonic plasma display (I love it for movies, btw). All I've had to do is make some adjustments to the brightness and picture settings, along with some modification of my viewing behavior, and I've had no trouble with this at all. Do a search on the plasma forum and you'll get good advice on this subject.

      Cheers, and good luck. --Karl

  • First, you need to check whether the manufacturer requires a single or dual TMDS input. A single DVI TMDS channel maxes out at 165 MHz; 1600x1200 @ 60 Hz with a 5% blanking overhead is about the limit of a single-channel DVI connection. If your device requires more than 5% of the total bandwidth for blanking (and many do), then both the available bandwidth and the achievable resolution drop. Depending on the required refresh rate and the blanking bandwidth (time) required by the device, you may be exceeding what a single TMDS output can do.

    Second, you need a card with a TMDS transmitter that will output the full 165 MHz of bandwidth. Some low-end devices will not do this. For instance, the integrated GeForce2 TMDS had problems (maxed out at something like 800x600), requiring an aftermarket TMDS transmitter to be integrated on the card. This is why some large-format LCD displays support 1280x1024 over DVI and 13xx-ish by something-or-other in analog mode (a Samsung, as I recall).

    For more information see http://www.ddwg.org

    Paul Driver
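
    To put rough numbers on that single-link limit (a back-of-the-envelope sketch added for illustration, not from the poster above; the blanking overheads are assumptions):

      SINGLE_LINK_MAX_MHZ = 165.0

      def required_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead):
          """Pixel clock in MHz: total pixels per frame (active + blanking) times refresh rate."""
          return h_active * v_active * (1.0 + blanking_overhead) * refresh_hz / 1e6

      modes = [
          (1366,  768, 60, 0.05, "reduced blanking"),    # roughly the panel in question
          (1600, 1200, 60, 0.05, "reduced blanking"),
          (1600, 1200, 60, 0.40, "CRT-style blanking"),  # typical VESA-style overhead
      ]
      for h, v, hz, blank, label in modes:
          clk = required_clock_mhz(h, v, hz, blank)
          verdict = "fits" if clk <= SINGLE_LINK_MAX_MHZ else "exceeds"
          print("%dx%d@%dHz (%s): ~%.0f MHz -> %s single-link DVI" % (h, v, hz, label, clk, verdict))

    With CRT-style blanking, 1600x1200 @ 60 Hz comes out right around 161 MHz, which is why it is usually quoted as the practical single-link ceiling; a 1366x768 @ 60 Hz mode needs well under half of that.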
  • by FreeUser ( 11483 ) on Saturday August 10, 2002 @12:59AM (#4044255)
    Although my ATI 8500 should be able to drive my HDTV-ready monitor at 1920x1200 resolution, I've yet to be able to coax the X driver into delivering that resolution through the DVI interface.

    However, using the Nvidia binary-only X 4.2 drivers I have no trouble driving the monitor at 1920x1200 in 24-bit color with a GeForce4 Ti4600.

    Such a setup should work fine for a relatively low-resolution plasma like the one you are considering, at 1366x768... indeed, I suspect you could easily coax that out of an ATI 8500 or ATI 7500 under XFree 4.2, and almost certainly if you use ATI's drivers.

    If you're going to spend that kind of money on a plasma, though, I'd wait a couple of years, until they support true 1080i at least. 768 lines of resolution is analogous to 1024x768 resolution on a computer (yes, I know you get more horizontal pixels in a 16:9 format, my monitor is 16:10 so I'm intimately familiar with that), so keep in mind that you are buying an expensive product whose resolution will likely be disappointing to you in a couple or three years.
  • The quality of plasma displays is low enough that you almost certainly won't be able to tell the difference. Just get one with a standard VGA input. If you need to run the VGA cable a long distance, Blackbox.com offers boxes that let you use standard VGA with long cables. If you really want DVI, get a DVI-to-VGA converter and put it next to the screen. Keep in mind that the signal eventually will be analog anyway.
    • The plasmas I've seen all have 15-pin VGA connections, for certain. And if you really want a 50" screen, then why not go for Viewsonic's model? Last I read it was able to do 1600x1200, which I would imagine on a screen that size would be as hideous as running 640x480 on my 21" tube. If you're going big and flat, then go for the most pixels, man. And if you're on a budget, Sony's 42" does somewhere in the range of 1024x768.
    • by Anonymous Coward
      It's not merely an issue of resolution. Walk into a Fry's or CompUSA. Look very closely at the monitors and at the edges of high contrast objects onscreen. There are all sorts of defects like ghosting, etc.
      • At Fry's or CompUSA, those are many monitors connected with very long cables to splitters and a single signal source. Of course, you get ghosting and image defects. The plasma monitors are probably connected to composite video. But if you plug one monitor into one VGA card with a good cable of reasonable length, you won't be able to tell the difference.
      • It's not merely an issue of resolution.

        By the way, I didn't say that it was an "issue of resolution". I didn't even mention the word "resolution".

  • This really scared me at first look. DVI stood for something named DeVice Independent in the good old days, when everyone still used TeX to format her thesis. Oh, just how sweet were the days, when you knew every nice girl would someday need you, the local TeXpert.

    Now we're back to troff again...
  • there are a lot of cards out there that will do 1600x1200 or better via dvi.

    i know for a fact there are geforce2mx cards that can push out 1920x1200. search around for samsung 240t users and you'll find plenty of hi-res (1920x1200) dvi cards.

    however, unless you plan on keeping your computer pretty close to your display, i think your bigger problem will be the length limitations of dvi cabling: 3-5 meters.
  • Beware of HDTV fakes (Score:1, Informative)

    by Anonymous Coward
    HDTV resolution is 1920 x 1080 ..

    There are MANY MANY manufacturers selling TVs with SVGA or XGA resolutions, trying to pass them off as HDTV.

    Be extremely careful of this... whenever you're buying a so-called HDTV, ask to find out the actual resolution and make sure it's 1920 x 1080. Here's a tip: if you're not paying thousands of dollars ($7k and up), you're probably not getting HDTV.

    What happens? People buy it thinking that's true HDTV and conclude HDTV is no big deal and a waste of money. And in turn the concept of HDTV gets screwed.

    Honest business practices .. will it ever be a reality?
  • The answer (Score:2, Informative)

    by Stonent1 ( 594886 )


    With power strip you can dial up any resolution that you want and create a profile for it. Take a look at the nice PNG screen shot and see for yourself.
  • I have an idea. POWERSTRIP!

    I know for a fact that PowerStrip will allow you to set the video card to custom, nonstandard resolutions. Try that and a standard Radeon 8500 (you can get them at Pricewatch for $109, a stock 8500 at 275/275). Excellent 3D performance and tons of features.

    However, I'm not entirely certain that the DVI output is compatible. In all likelihood it is.
  • If you can afford an $8000 plasma screen, you can afford to blow a few hundred dollars on graphics cards and buy one of each :)
  • Post your question in the Plasma forum on AVSForum.com
  • Now come on. What the fu^H^H heck do you expect from us? The nice thing about high-end equipment of any sort is that you usually buy it through a very experienced salesman. Even if you can find a better deal somewhere else, just get the rundown from a good salesman. You might have to try a few, but it's not bad because you can easily tell if someone knows what they're talking about. The right guy (or gal if you're offended by that) is bound to know more than most of us.
  • Check out http://www.avsforum.com
    I just bought myself a NEC 42MP3 plasma; it's awesome. The guys at the AVS forum gave me lots of tips etc. on setting it up...

    Dan
  • by SectoidRandom ( 87023 ) on Saturday August 10, 2002 @03:47AM (#4044720) Homepage
    A few months back I was setting up a plasma screen for a client, connected to just a GeForce 4. The big problem I found immediately was the connector! DVI is not just one plug!

    It turned out the plasma screen (Pioneer, I think) has a DVI-D connector and the GeForce (and every other video card I checked) has a DVI-I connector, and they are not compatible! :( What I had to do after talking to both an A/V supplier and Pioneer was use a DVI-I to VGA cable, as no DVI-I -> DVI-D cable exists! Apparently the quality loss is insignificant, but it just didn't feel right. :( Still, it did look brilliant at 1024x resolution!

    What I was told by various sources is that DVI-D is purely digital, whereas DVI-I is part analogue. I never had time to find out more, but perhaps someone can clarify this and explain why it would be something to take into consideration.
    • DVI Info (Score:5, Informative)

      by SectoidRandom ( 87023 ) on Saturday August 10, 2002 @03:50AM (#4044730) Homepage
      Just found this info on DVI connectors: All About DVI [datapro.net]

      Hope this helps..
    • That's just wrong. All you needed was a DVI-D to DVI-D cable. The DVI-I connector can accept either type because it has all the pins for both DVI-D and DVI-A. Most projectors and LCD monitors that have DVI-I connectors will come with a DVI-I to VGA cable and a DVI-D to DVI-D cable.
    • Well, you got scammed - you didn't need to go back to VGA. If you've got a DVI-I plug and a DVI-D plug, you can use a DVI-I or a DVI-D cable to connect the two digitally.

      Check out this passage from a DVI faq:
      "
      If you have plugs that are DVI-D, they will accept a DVI-D or DVI-I cable. If you have plugs that are DVI-A, they will accept a DVI-A or DVI-I cable. If you have plugs that are DVI-I, they will accept any type of DVI cable.

      If you have mismatched plugs, such as DVI-D and DVI-I or DVI-A and DVI-I, you may use either a DVI-I cable or the cable that matches the other plug. For example, you may use a DVI-D cable on a DVI-I to DVI-D connection, but not a DVI-A cable.

      Note: You may not mismatch a DVI-D and a DVI-A connection.
      "

      There.

    • It turned out the Plasma screen (Pioneer I think) has a DVI-D connector and the GeForce (and every other video card I checked) has a DVI-I connector, they are not compatible!

      Yes, they are.

      Your GeForce has a DVI-I output connector on it, which outputs both digital and analog versions of the SAME signal. You can screw a small adapter onto that DVI-I connector which converts the analog signal, ground, and DDC pins to a normal VGA HD15 connector to use a normal CRT monitor with.

      However, if you plug a digital flat-panel display with a DVI-D input into that same connector on your GeForce, it will use the digital part of the output and will look better than if you used the analog part (assuming your monitor has both DVI-D and HD15 connectors like mine [alexburke.ca]).

      So, yes, it will work. Plug it in, and if it's the only thing connected to the card, the card should detect that and use the digital output automatically without you having to connect an analog monitor and switch it over to digital output in the Display Control Panel.

      You're welcome.
    • With these two all you should have needed was a simple DVI-D to DVI-D cable. The DVI-D cable will plug into the DVI-I connector on the card since the connector is female and the cable is male, so the extra four pins and key used by DVI-I to carry analog signals and prevent you from plugging DVI-I cables into DVI-D connectors just don't matter.

      All you'll lose is anything displayed before the video card drivers "kick in" to digital mode. On a typical Windoze box this means you don't see anything until the desktop, not sure how things work elsewhere.

      Example - my PC had a GeForce 3 Ti 500 with a DVI-I output. My monitor was a Samsung Syncmaster 170T with DVI-I input. Then I threw a Mac into the mix, which had an ADC to DVI-I connector hooked up to it. With me so far?

      Then I wanted to use both computers with the same display using a KVM switch. I eventually found two different KVM switches to switch a) DVI and b) USB. There was the very expensive yellow one that did it all in solid-state electronics and didn't include any cables, and there was the cheap silver one with the big knob on the front that was quite literally a four-position mechanical switch, and included all necessary cables to hook up four machines to a DVI display.

      I bought the cheap one. All of the cables provided with it (and all of the connectors on the switch box itself? - can't be bothered looking right now) were DVI-D. I hooked it all up. It worked. Well, not really: it had pretty severe pixel sparkle in some places, because two 6' DVI cables with a mechanical switch in the middle take you over the regular DVI cable length limit and can introduce noise. But it worked in principle, and using one of the DVI-D cables to connect the DVI-I graphics card directly to the DVI-I monitor worked just fine.

      It should definitely give you a far better picture than using a DVI to VGA converter.

      As for DVI-I being part analogue and DVI-D purely digital: I won't go into all the details (such as the all-analog DVI standard), but basically DVI-I is DVI-D with four extra pins to carry a standard analog VGA signal. They're arranged at one end of the connector around a cross-shaped key. DVI-D connectors don't have a slot for the vertical part of the key, thus preventing you from plugging DVI-I cables into DVI-D sockets, but there's nothing to prevent you from plugging DVI-D cables into DVI-I sockets - you lose the analog part of the signal, but typically the only time you'd see anything coming over that is during stuff like BIOS and Windows startup (on Windows). Macs go straight into digital the minute they start. Linux boxes - I imagine it's all down to the specific display driver.

  • I've been around the video industry since Motorola came out with the "works in drawer TV" in the 60's. As far as I'm concerned you should get as far away from an NEC product as you can. They seem to work under the guidelines of "build it, then dump it." They have designed and produced so many IC's and products, then dumped them before the end of life of the product that they should be prosecuted. JUNK,JUNK,JUNK! They are the epitome of the phrase "Japanese Junk".
  • sacrifices (Score:4, Insightful)

    by DarkHelmet ( 120004 ) <mark AT seventhcycle DOT net> on Saturday August 10, 2002 @04:01AM (#4044762) Homepage
    I'm trying to find the best way to connect a computer to a 50" (or larger) plasma display.

    I'd sacrifice my ex-girlfriend to the monitor god to get one of those plasma displays.

    Two birds with one stone.

  • Plus awesome quality, of course.
  • I'm looking into getting my first LCD display for my PC. One thing I find puzzling: How come every (?) 15" stand alone display out there has only 1024x768 pixels, when notebook makers seem to have no problem with giving you 1600x1200 on the same area? (I'm thinking specifically of Dell's "UXGA TFT" displays). Even budget notebooks from Dell (sub $2000 range) give 1400x1050. I would have thought the technology would be exactly the same, no?
  • Has anyone else noticed that most recent movies are being released with an aspect ratio of 2.35 instead of 1.85? This means all those beautiful 16:9 TVs will have to letterbox the 2.35s if you don't want to stretch the picture out vertically.

    I think that kind of sucks.
    • The 2.35 releases are just giving you the film as it was shot, to keep the composition correct. Unless the TV screen is the same shape as the original film, there's no single way of displaying the image that will please everyone.

      At least if the DVD is released at 2.35, you can have your own equipment do the pan-and-scan if you don't like letterboxing. If it was released in 16:9 with parts of the original picture removed, there would be no way to recover them.

      Of course 1.85 isn't quite 16:9 either..
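
      To put rough numbers on the letterboxing (arithmetic added for illustration, using the 1366x768 panel from the original question):

        def letterbox(panel_w, panel_h, film_aspect):
            """Return (active image lines, black-bar lines top and bottom) for a letterboxed film."""
            image_h = int(round(panel_w / float(film_aspect)))
            bar = (panel_h - image_h) // 2
            return image_h, bar

        for aspect in (1.85, 2.35):
            image_h, bar = letterbox(1366, 768, aspect)
            print("%.2f:1 film on a 1366x768 panel: %d active lines, ~%d-line bars top and bottom"
                  % (aspect, image_h, bar))

      A 1.85:1 film uses about 738 of the 768 lines; a 2.35:1 film drops to about 581, with roughly 93-line black bars top and bottom.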

      • Can you really have your own equipment do pan and scan? How does it know what part of the picture to look at?

        On a related note, I personally think that LCD projectors are going to improve to the point that people will want to go back to that method instead of a projection TV. I know I am almost to that point.
        • I know that a lot of the software players will do pan and scan. I have seen 'aspect ratio conversion' in feature lists for hardware players, though since I've never used a hardware player I wouldn't know for sure.

          Certainly in the UK, broadcast digital television allows a choice of aspect ratio (changed in the setup menus of the set-top box), so it would seem a bit strange for DVD video to be any different. I don't really know how that would display a 2.35 picture though, it might just be for 16:9 to 4:3 conversion.

          I can't find any references about how the player decides which part of the picture to zoom in on, it would be most sensible for the position information to be encoded on the disk, but I don't know if that's actually the case.

  • ATI has been a leader in offering support for many different resolutions. I would highly suggest looking at the Radeon 9700 Pro to see if it will support this resolution. If not currently, it should eventually, as this is only a driver issue; ATI might well add it to their driver so that plasma screens display properly.
  • I have a Dell 2000FP LCD monitor that I'm using with an analog input right now. The image looks fine at 1600x1200 with my old Matrox Millennium II card.

    I'm curious to see if anyone has gotten the 2000FP's DVI input to work with more modern 3D cards at 1600x1200 in Linux. In particular, I'd like to know which cards actually work at this resolution in DVI mode. Thanks!
  • Be careful with slight incompatibilities between DVI specs ...

    As an example, I have 5 pieces of equipment:
    1) Apple Cinema Display 22" (ADC connector)
    2) PowerMac G4/466 (GeForce2 MX video board w/ADC)
    3) PowerBook G4/800 (w/DVI-I output)
    4) Sun Blade 1000 w/XVR-1000 video board (w/DVI-D output)
    5) Sun 24" LCD (1920x1200 native; DVI-D & VGA in)

    Now, ANY of the equipment can drive my Apple Cinema Display fine (I have the Apple DVI-to-ADC converter also to support the PBook & SunBlade).

    The Sun display looks GREAT on the Sun Blade, but it's awful on the PowerBook. The PBook supports its native resolution at 1920x1200, but I get all sorts of interference on the screen. Bits of "static" all over the display. If I use the VGA connector, it looks fine.

    And it's not my display or PBook, because I've swapped both with other people @ my office and we all have this problem.

    GRR... anyone know what the F___ is going on?

    Why would I get that static? Now, an interesting point, the Sun docs for the 24" say that it only supports DVI-D and *NOT* DVI-I. The PBook is DVI-I. I thought that using a DVI-D cable with a machine with DVI-I output would act just like DVI-I (just ignoring the analog pins)?

    The cable I'm using doesn't have the 4 analog pins around the "crosshair" so to speak.

    Anyone have this LCD? Has anyone seen this behavior before?!?!

    --NBVB
