
Why Hasn't the DVI Interface Replaced D-Sub?

nic1m asks: "When DVI connectors started appearing on video cards I thought they were a smart replacement for the old D-Sub analog connector because DVI can support both digital and analog displays. With LCDs rapidly gaining market share I would have expected DVI to replace D-Sub by now. Almost the opposite seems to be happening, however. Many video cards still lack DVI, most LCDs still have only an analog input, and motherboard-based graphics never have DVI. Why has DVI been a relative failure in the market?"
This discussion has been archived. No new comments can be posted.

  • Simple (Score:5, Informative)

    by caperry ( 31048 ) on Thursday January 29, 2004 @12:28PM (#8124330) Homepage
    6-foot cable-length limit at resolutions over 1024x768. Not so much a problem for monitors, but the projector on my ceiling needs a $700 DVI-fiber-DVI cable to run lengths over 6 ft while still remaining in spec.

    Most good flat-panel displays (Hitachi, Sony, etc., 17" and up) do support DVI - but DVI on analog CRTs doesn't make much sense.
    • Re:Simple (Score:4, Interesting)

      by dFaust ( 546790 ) on Thursday January 29, 2004 @12:52PM (#8124601)
      You're right, DVI on CRTs doesn't make much sense... but have you been to CompUSA or Best Buy lately?? Check out their monitor displays: the vast majority of them are LCDs. CRTs are becoming increasingly difficult to find in retail, hence all the more reason you would think DVI connections would be becoming more abundant.

      Part of the problem is that, in fact, many LCDs do NOT come with DVI connections. You say "most good flat panel displays" do, though that's not quite accurate. You mention Sony 17" and up... well, this 17" Sony doesn't use DVI [sonystyle.com], nor does this 19" Sony [sonystyle.com]. Or how about this 24" Samsung [samsung.com], which includes connections for D-Sub, S-Video, RCA, Component (x2), Coax, and Scart (but no DVI) and will set you back $3-4k.

      The fact is, contrary to popular belief, the majority of LCDs still do not come with DVI, whether budget or high-end. I learned this during Christmas when I had to shop for an LCD for my mother. Sadly, oftentimes if you want a DVI connection, you pay MORE than for the identical model that uses a D-Sub connection.

      Which brings us back to the original post... WHY is this?? Doesn't DVI on a video card or LCD mean the hardware doesn't need a DAC, which you would think would cut costs? Not to mention that DVI provides better quality to an LCD than D-Sub does... you would think monitor manufacturers, at least, would appreciate making their hardware perform better while saving money.

      Hopefully someone will have some insightful knowledge to clue us in on this seemingly backwards situation.

      • You also pay more for a gallon of bottled water than you do for gas. How many people complain about that?
        • This would be a good example, except (a) a lot of people complain about the cost of bottled water and (b) gas, around here, is about $1.50... water is about $0.39, bottled. So, yeah, use the same kind of evocative metaphor, with fewer lies, and it'll make a good point.
          • ... Maybe it's just where you live.... here in Toronto, gas is ~.75 per litre and a litre of bottled water will run $1 to $2.
            • It depends more on the water you are buying. If you buy the single serving bottles, sure, you might as well ingest them goatse-style. However, if you buy the big 5 gallon (19 liter) water-cooler bottles, then you will most likely be paying a lot less on a per-liter basis.
            • Sure, I live in Toronto too... and a single bottle of water does run $1-$2, more if you're at a club (I've paid as much as $4.75).
              The bottled water that your parent is talking about, however, is the bottled distilled water you buy at the supermarket.
              I believe it comes in 10 or 20L bottles, and they're cheap... much cheaper than gasoline by volume.
      • Re:Simple (Score:3, Insightful)

        People don't get rid of their monitors as fast as they get rid of other stuff. During a time period when I upgraded my computer almost every year, I stuck with the same CRT monitor I bought in college. Now it's seven years old and my computers last longer, but I still use the monitor. It's a 17" Dell CRT, bought refurbished in 1997 for somewhere between $150-250. Heck, I even switched to the Mac platform in 2001 and I'm still using the monitor.

        It's the monitors that need backward compatibility. Unless I'm
      • The main reason I see is that DVI supports both analog and digital. So you could have an LCD with only a digital input on a DVI connector, but people would expect it to work with their analog connector if they get the adapter. The adapter fits, but it doesn't work.

        To solve that problem, you have to have both a digital and analog thingy in the monitor.
    • Re:Simple (Score:3, Informative)

      by jjshoe ( 410772 )
      We use 16' DVI cables in operating rooms without any issues.
    • Re:Simple (Score:5, Informative)

      by terpia ( 28218 ) on Thursday January 29, 2004 @02:22PM (#8125664) Homepage
      but the projector on my ceiling needs a $700 DVI-fiber-DVI cable to run lengths over 6 ft while still remaining in spec.


      Wrong. You've been misled. DVI at half bandwidth can travel 30 feet on a good cable. (Half bandwidth is currently how computer data and HDTV are transmitted via DVI.) DVI at full bandwidth travels 15 feet. And a limit of 1024x768? Not true at all. I'm running two 30' DVI cables: one to a projector sending 720p (1280x720) and one to an LCD display @ 1280x1024.


      DVIgear.com's salesmen are good aren't they? ;)
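
      For a rough sense of the bandwidth numbers being thrown around here: single-link DVI tops out at a 165 MHz pixel clock, and the clock a mode requires is just total pixels (including blanking) times refresh rate. A minimal sketch in Python, assuming the standard published timings for the two modes the parent mentions (the totals are assumed reference values, not taken from the DVI spec itself):

        # Rough pixel-clock check, using the commonly published timings
        # (horizontal total, vertical total, refresh) for each mode.
        SINGLE_LINK_MAX_MHZ = 165.0  # single-link DVI ceiling

        modes = {
            "1280x720 @ 60 Hz (720p)":  (1650, 750, 60),
            "1280x1024 @ 60 Hz (VESA)": (1688, 1066, 60),
        }

        for name, (htotal, vtotal, hz) in modes.items():
            clock_mhz = htotal * vtotal * hz / 1e6
            share = clock_mhz / SINGLE_LINK_MAX_MHZ
            print(f"{name}: {clock_mhz:.1f} MHz ({share:.0%} of the single-link limit)")

      That works out to roughly 74 MHz for 720p and 108 MHz for 1280x1024 - both well under the 165 MHz ceiling, which fits the parent's point that these "half bandwidth" modes are much easier to push over long cables than a full-bandwidth signal.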

      • Re:Simple (Score:3, Informative)

        by madcow_ucsb ( 222054 )
        Yeah my friend's running 1600x1200 over 18' so he can have his computer in the closet and his 20" LCD mounted on the wall above his desk.

        Picture looks great (and a pretty cool setup too as long as you don't need to change CDs...)
        • Re:Simple (Score:3, Interesting)

          by tigersha ( 151319 )
          I did that and I have my DVD drive in a USB 2.0 case and it works very, very nicely, thank you.
          • Well, that works, although high-speed USB has a max cable length of 5m. Not long enough in this case, and you're definitely taking a chance when using a non-compliant cable or a full-speed cable. HS USB 2.0 is pretty unforgiving when it comes to signal quality issues.
    • Re:Simple (Score:2, Informative)

      by Theosis ( 261389 )
      but DVI on analog CRTs doesn't make much sense.

      I think it does. The DVI spec allows for analog signals. Check the pinouts for the DVI connector. Apple only has DVI/ADC, and they include a VGA adapter for the ADC. I also own a DVI to VGA adapter and it works just fine.

      DVI should replace VGA.

      (ADC is an Apple-only connector that is basically DVI with USB and power pins included.)
      • Correct. My video card has dual DVI out, with two DVI to D-Sub converters, which are just simple wiring between the two connectors, since the DVI outs have both digital and analog present.
    • FWIW, I'm currently running 1920x1200 @ 73Hz on a 6' VGA cable. If there is any image degradation, it is below the threshold where it can be noticed. Really nice image.
  • by fruitbane ( 454488 ) on Thursday January 29, 2004 @12:29PM (#8124342)
    Dell, lately, has been shipping lots of FP monitors and video cards with DVI connectors. The caveat is that Dell has been using lots of weird monitor connectors for which we have to use odd dongles (and boy do we have some odd dongles cluttering up our desk drawers now, thank you Dell).

    So I guess the question should be, why has DVI been so slow to penetrate the low-end/mainstream/low-cost market? I imagine the DVI connector is a more complicated part and more expensive to produce, especially if you're like Dell and using weird connectors that require the extra expense of dongles. Additionally, if all you use is a DVI connector you have to include a VGA adapter, another item that slowly pecks away at the bottom line.

    Let's face it, for most hardware manufacturers what's cheaper than simply using the old tried and true D-sub VGA connector?
    • by Joseph Vigneau ( 514 ) on Thursday January 29, 2004 @12:48PM (#8124557)
      I recently bought a pair of Dell 1800FP panels (one for work, one for home), because I needed the space, and my eyes needed the rest. I chose the 1800FP because it was relatively inexpensive, fairly large, and has a DVI connector. I have a GeForce3 I bought two years ago that has DVI output. My picture is crystal clear; I have no complaints.

      Manufacturers are catering to the lowest common denominator - the "good enough" theory in action. This is also why the market is being flooded with cheap 42" plasma displays that only have 864x480 (i.e., non-HDTV) resolution, often without DVI. Most people just want the sexy thin screen, but don't care or don't realize how low the resolution is, and what they're missing out on.

      DVI isn't necessarily in the domain of high-end, but you have to look a little harder to get it.
      • Most people just want the sexy thin screen, but don't care or don't realize how low the resolution is, and what they're missing out on.

        Most people probably aren't missing out on anything. I recently bought a 42" HD monitor, but I don't have any HD video to watch on it. I mostly bought it because I mainly use my TV for watching movies and I prefer 16:9. I doubt I would notice much difference between 1920x1080 and 864x480 in my typical usage, except perhaps that my SD cable signal might not look quite as cr
    • Dell, lately, has been shipping lots of FP monitors and video cards with DVI connectors.

      Definitely. I have two of their 20" LCD monitors on my desk at work. They're pretty nice displays: bright and crisp. I like the Samsung I have at home a little bit more because it has slightly better contrast, but the Dells are quite nice. The ones we have at work have 4 (!) inputs: DVI, D-Sub, S-video, and composite. It looks great with the DVI input but is a bit fuzzy if you use the D-Sub input, even at the sam

    • Dell's 1800FP is usually one of the cheapest DVI-capable monitors available.

      When I bought it about a year ago, I paid $530ish for it.

      The cheapest non-Dell 17" LCD with DVI I could find was over $600.

      So I paid $70 less, got DVI, and got an extra inch of diagonal...
  • HDMI (Score:2, Informative)

    by Who Man ( 671061 )
    Maybe they're skipping DVI, since it already has a replacement.

    http://www.hdmi.com

    • Re:HDMI (Score:3, Informative)

      by Ianoo ( 711633 )
      From the HDMI FAQ:
      Yes, HDMI is fully backward-compatible with DVI using the CEA-861 profile for DTVs. HDMI DTVs will display video received from existing DVI-equipped products, and DVI-equipped TVs will display video from HDMI sources.
      It seems HDMI is more of a redefinition of the existing DVI standard to support consumer devices like televisions.
    • HDMI is bad news for consumers as it incorporates Digital Restrictions Management (DRM).

      "Windows has detected you want to connect a high resolution display to your computer. Your current Windows license doesn't allow displays over 1024x768. If you wish to upgrade, please insert appendage you wish to pay with..."

      • "Windows has detected you want to connect a high resolution display to your computer. Your current Windows license doesn't allow displays over 1024x768. If you wish to upgrade, please insert appendage you wish to pay with..."

        Sorry, I don't think this is an issue, because it is always possible to prevent non-users of DRM from setting their display resolution above a certain size. I mean, you do have to configure Windows to set the display resolution. If they don't want to support that resolution it doesn't

        • Yah, true, but it's just a joke. A little karma-whoring, I admit.

          But I do see how HDMI could enable unscrupulous scumbag^H^H^H^H^H^H^H companies to sell you a high-res screen, then sell you "licenses" to the different resolutions. Kinda like IBM does with crippled SMP mainframes. That kind of business is dishonest in my book.

          Kinda like buying a V8 car with four plug wires unplugged and a locked hood. The dealer is demanding "licenses" to connect the other four and threatening you with the DMCA if y
      • What, have you never heard of HDCP [digital-cp.com] before? You haven't been shopping for an HDTV lately, I take it. In fact, the new HDTV tuner cards from ATI and others all have HDCP support to prevent you from bypassing the damned digital broadcast flag.
        • The question I have is this: if you use the Linux-based DVR configs, will they support this card without the DRM technology, or is the flag pervasive enough that it cannot be avoided?
  • by Uma Thurman ( 623807 ) on Thursday January 29, 2004 @12:30PM (#8124355) Homepage Journal
    Analog connectors and signals are good enough for now. I run an LCD in 1280x1024 resolution and it's fine, though I had to use a good quality cable. Once resolutions go higher, then the digital signals will become more important. The DVI connectors haven't failed, it's just that they haven't succeeded yet. They will. Right now, the cables cost more money too, and that's always a factor.
  • They probably do not understand the benefits of switching, and DVI is only offered on higher-end monitors. The average buyer differentiates LCD screens largely on size and is not interested in a 15" monitor that costs $50 less than the 17" one right next to it.
    • The question remains why a DVI-only LCD should be/is *more* expensive than one with analog only. I would imagine that DVI actually makes things easier, since a) the video card doesn't need to do digital-to-analog conversion, obsoleting RAMDACs and such, and b) the LCD doesn't have to do analog-to-digital conversion to drive the (I assume) digital LCD panel. I remember reading somewhere that some IBM flat panels actually had to use blowers to cool the ADCs.
      • I think that the expense comes not so much from the connector, but from a better screen. From what I've seen you can't get DVI on the low-end stuff, but it is a common feature on more expensive screens. I would assume that since most screens that feature DVI connections continue to support analog connections, there is some additional cost. I've also never seen a motherboard with onboard video that supported DVI, although those might exist, and outside of the build-it-yourself world most people buying a
    • by hirschma ( 187820 ) on Thursday January 29, 2004 @02:09PM (#8125497)
      As many posts have covered, it costs more for a manufacturer to offer DVI. So as a result, VGA continues to be the default offering, despite the fact that it sucks.

      Even 10 years ago, higher-end monitors and cards came with BNC connectors. Why? Because the VGA connector isn't meant to deal with high-res graphics - you start getting crosstalk between pins, and that shows up as visual artifacts. IBM designed the standard for 640x480x256 colors. It wasn't meant to scale to high-res 32-bit color.

      Consumers, however, won't spend an extra $50-$100 to get the better visuals. I'm all DVI, and the quality difference is substantial. Sharper text, no ghosting, more vivid colors. Generally easier installs, too.

      At any rate, you're dealing with a consumer base that chose VHS, wouldn't spend for SCSI, and won't spring for a Mac. They don't know the difference, and they don't _want_ to know about the difference. And the marketplace has responded.

      Jonathan
      • Please justify that the VGA D-Sub connector sucks. I have a 1600x1200x32-bit display here via D-Sub, and it looks great - nice and bright, clear lines, good contrast, no weird artifacts.
        Don't be such a snob. I'm sure you're someone who can hear the difference between FLAC and WAV.
        PS. Your post carries no extra weight if you 'sign' your name to it.
        • Justify? How about I've done a side-by-side comparison, and the DVI connected model was visibly superior - to me, and others in the household? Same video card, same monitor (I have two), DVI better.

          It isn't a status thing - it is a quality thing. Just because you can't see it doesn't mean that it doesn't exist.

          Jonathan
      • It really depends on the cable; even on a KVM, if you have cheap cables you can have ghosting. A good SVGA cable will run around 25 bux, a cheap VGA cable will run about 7. There's a reason you use quality connections.

        But if you are talking D-Sub vs DVI on an LCD, you could easily tell the difference. But on my nice Sony 22-inch monitor, the picture is far better than on a 19-inch NEC LCD.

        You really need to see an LCD/monitor in action. Plus, higher-end LCDs have 3 connections, 2 DVI and 1 D-Sub, which really rocks if you do
  • when it comes to continuing to use old hardware. I have an old Sony monitor that's just about as good as new. It has a D-Sub connector.

    Why would I want to throw out the monitor or buy an overpriced adaptor just so I could use some new-fangled connector?

    D-Sub will continue until the last D-Sub-using monitor is junked. Because, after all, there is no real reason to junk it; the part's trivial in cost, and people are used to it. Getting rid of it is just one more reason for people to decide not to buy what
    • Back in the 70's I used to want to firebomb 'Stereo Stores.' They were full of slick fascistic 'salesmen' who would sneer when I came in the store needing another goddamn connector off the pegboard display in the back. I knew more about what they were selling, but I wasn't a pimp-slime, so they'd cop an attitude because I wasn't wearing a suit and didn't look like an ignorant beatnik-bearded 'intellectual', which made it obvious I wasn't going to buy their overpriced shit.

      I've always since said 'fuck' to p
      • There was a big honking 'superior' video connector stuck on the back of Power Macs back in the late Nu-Bus era. Glad that one died and went away as well.

        and now we have ADC... DVI+power+USB.... oooohhh!!!!

        happily, an ADC -> DVI adapter is only about $15 (even if DVI -> ADC adapters are hella expensive)
        • Yes, but ADC is what we WANT! It's unique in that it uses the standard DVI spec for analog and digital video + it has powered USB connections... allowing you to control the monitor directly from the computer, or to plug in keyboards and mice, without ANOTHER plug running to your desk....

          It's quick and simple to convert VGA or DVI to ADC... [the other way around is tough, though, due to Apple using USB to control the monitors] There's no reason for everybody not to use the best and sell a $10 retail dongle to "du

          • There's no reason for everybody not to use the best and sell a $10 retail dongle to "dumb" it down

            Other than patents that Apple has every right to refuse to license?

            • You've got a point... although I don't think it's as much that Apple will refuse as that board makers simply won't pay ANYBODY royalties... that was the situation with FireWire... Until Apple stopped collecting royalties (only like $1 per port! not much) everybody refused to put them on machines except Sony... an Apple partner already.

              You may be right that they don't want to give up their "edge" for macs...but I thought that the original DVI implementation spec supported the USB uplink [i.e. Apple's vers

      • The D-15 connector Apple had on the back of the Power Macs (and in fact every non-toaster since the Mac II) wasn't particularly superior. But the VGA form factor wasn't much of a standard when it was used. The 13W3 Sun used was superior, but that's another story.
  • by a.koepke ( 688359 ) on Thursday January 29, 2004 @12:32PM (#8124378)
    I remember when I used to work in a computer store around 1998->1999. We started getting systems with USB header pins on the board and support for it in the BIOS but no connectors and no devices. It is only in the last couple of years that USB has really taken off.

    My video card has a DVI connector and a standard D-Sub on it, but my 19" LG has no DVI connection, so I have yet to use it. Until displays start featuring it, there's not much point having it on the computer. Also, there's not much point having displays that use DVI without many systems supporting it. I would almost say that each one is waiting on the other :)

    Also you need to look at the upgrade cycle. Not everyone is a computer geek; not everyone has the latest graphics cards and computer gear. When new technology gets released it will take a while to penetrate and become commonplace.
    • I agree. It is too early to tell if DVI is gonna make it.

      Plus, the reason for the lack of DVI inputs on most models is simple (and it's quite strange that a Hungarian-born guy has to tell this to the Yankees).

      They sell what you buy!

      If all of us started to buy panels that have DVI connectors, the manufacturers would start churning out panels with DVI...
    • And you know, you can thank Apple for that. The general modus operandi for the x86 industry is:

      1) See what Apple does
      2) Copy it
      3) Profit!

      The examples are endless: WIMP user interface, funky cases, USB for mouse and keyboard, firewire, no floppy drives, etc, etc, etc.

      Thank god for that company. If it wasn't for them, we'd all still be using ISA cards, PS/2 connectors, and D-sub 15's!

      -mike (who's never owned an Apple in his life)
      • Everything but the PC has out-autoconfigured the Macintosh. Amiga comes to mind, but so does Sun (OpenBoot with SBus, or for that matter Sun VME, anyone?). Apple did a good job with ADB, but basically no one bothered to copy it until USB came along, because it was cheap and no one believed they could sell the average PC keyboard for fifty bucks. Apple is doing it now with USB though, and guess what? I'm typing on one right now, on my PC :) I really like it, as much as I like any of these qwerty keyboards desi
  • PC Connector Soup (Score:4, Informative)

    by bellings ( 137948 ) on Thursday January 29, 2004 @12:39PM (#8124451)
    I don't think anyone knows why motherboards come with the connectors they come with.

    Why can't I buy a motherboard without a serial port, a parallel port, two PS/2 ports, and a line-in audio port? Why do motherboards come with built-in video, but not Bluetooth and wireless networking?

    Why isn't there a standard for external power supplies, instead of having a blasting-hot power supply inside the temperature-sensitive case, while a half-dozen wall-warts hang off my power strip driving all my peripherals?

    In short, why are PC compatibles such heaps of shit?
    • In short, why are PC compatibles such heaps of shit?

      It's called 'open standards' and 'not every piece designed as one unified whole by a single vendor.'

      Do you carp and whine that everybody should have one of those shit VCR/TV combos, too, because of the lack of dangling signal cords?
    • by DarkDust ( 239124 ) <marc@darkdust.net> on Thursday January 29, 2004 @12:56PM (#8124638) Homepage

      In short, why are PC compatibles such heaps of shit?

      In short, because they still try to be compatible with a 20-year-old machine that was a quick shot and intended to be replaced by something better... but it wasn't replaced, since that quick shot gained too much market share.

      The funny thing is that not only was the IBM PC itself intended to be just an interim solution, but the processor (8086) was as well! Intel wanted to do something better but felt it had to react to competition, and thus quickly made the 8086 just to have something.

      And then people began to build more and more storeys onto this messy foundation (PCI, AGP, ACPI, APIC, and the most famous: the A20 gate, just to name a few extensions), and now we have an architecture so horrible, complicated and full of unnecessary stuff that it's a real wonder that most PCs run quite well...

      I've been saying this for years: it's time to start from scratch and cut that damn backward compatibility. But Windows only runs on Intel systems; that's a problem worth another discussion. If we'd start from scratch and throw the 20-year-old dirt overboard, not only would computers be faster, they would also be cheaper and more reliable (because implementors wouldn't have to implement all those warts and bugs that some software now depends on).

      • Yeah, then we'll redesign the Traffic system, too. But wait, so many people rely on the Traffic system working the way it does already -- and the number of intersections/roads to change would be immense. I guess it's just not going to happen. Kind of like your suggestion. Backward compatibility is there for a reason. Hundreds of thousands of people already rely on what's in place.
        • by DarkDust ( 239124 )

          Yeah, then we'll redesign the Traffic system, too. But wait, so many people rely on the Traffic system working the way it does already -- and the number of intersections/roads to change would be immense. I guess it's just not going to happen. Kind of like your suggestion. Backward compatibility is there for a reason. Hundreds of thousands of people already rely on what's in place.

          Well, UNIX is also backwards compatible, at least on the source level. And if you don't use assembler or bit-modifying C code

          • By the end of the year you will be able to run .Net applications natively on IA64 and AMD64 systems.
            By the time Longhorn comes out most new applications will be written for .Net to take advantage of the new OS features.
            NT itself was originally designed to be highly portable, though that may have been corrupted somewhat in subsequent generations.
            The transition to a different architecture will be slow but I think it will happen over the next decade or so.
      • The problem is that Intel has screwed up everything "better" than the 8086 line. Both the i432 and i860 were beautiful designs that didn't work in practice. The i860 never had a chance, and the Itanium is running the risk of joining the i432 and i860 lines as neat-but-dumb ideas.

        The biggest problem is that working around all of these things only needs to be done twice. Once for Windows, once for Linux, and everybody's happy. With mass-market chipsets the way they are, they've been copy-and-pasting the

      • It's not hard at all to make a motherboard (and hence a whole PC) which is completely standards-compliant and backwards-compatible in every meaningful and useful way, but which has no legacy busses or ports. The ISA bus has been internalized on most modern motherboards for some time. Getting rid of parallel, serial, and PS/2 mouse/keyboard would be an easy next step. Floppy drive busses/controllers, as well as the floppy drive itself, can also take a hike. Replace it with an optional IDE LS-120 drive fo
    • Re:PC Connector Soup (Score:3, Informative)

      by Komarosu ( 538875 )
      Why can't I buy a motherboard without a serial port, a parallel port, two PS/2 ports, and a line-in audio port?

      You can, it's called an ABIT legacy-free motherboard [tech-report.com] :)
      • This is the same ABIT who recently brought out a motherboard with a tube amplifier on board, for that authentic sound from their onboard audio. A real honest-to-god vacuum tube.

    • Because you keep buying!
    • You're thinking of the advantages of centrally standardized PCs, but none of the disadvantages.

      Dump the serial port? Then what do I plug my external modem into? Or do you expect me to buy a new one just so your precious sensibilities aren't offended? (just one example...)

      If Intel (or Microsoft, or anyone else) were in charge of what hardware I could use, I would be royally screwed. I've got an Intel motherboard that considers USB 1.1 to be "legacy" hardware. Legacy! If Intel had its way, I couldn't upgra
      • Real men have 8-port Comtrol serial cards (with a few extra 4-port cards in the closet, just in case). My only gripe is, I can't find a 4(+)-port ECP parallel port card.

        Though in all honesty, I wouldn't cry if PS/2 mouse/kb ports died a horrible, ugly death...
      • Dump the serial port? Then what do I plug my external modem into? Or do you expect me to buy a new one just so your precious sensibilities aren't offended?

        A reasonable interpretation of your parent post would be that the poster would like legacy-free boards to be available in addition to the current legacy-encumbered crap we get now. I don't think he said anything about not selling serial ports at all anymore.

        Personally, I agree with him. It's been years since I last used a serial or parallel port. It's
      • Dump the serial port? Then what do I plug my external modem into?

        You buy a used Portmaster 2-er for $31 on eBay and connect 30 modems to it.

      • Dump the serial port? Then what do I plug my external modem into? Or do you expect me to buy a new one just so your precious sensibilities aren't offended? (just one example...)

        Well, you could buy a $5 serial port card. If you think about it, it makes sense to do that rather than give serial ports to everyone, even though 99% of them aren't going to need it.

        With the push to make computers smaller and more attractive, getting rid of ugly oldschool connectors that almost nobody uses makes sense.

    • Why can't I buy a motherboards without a serial port, a parallel port, two ps/2 ports, and a line-in audio port? Why do motherboards come with built in video, but not bluetooth and wireless networking?

      In short, mass production/commoditisation. Motherboards have for a long time been conformant to first the AT standard, then ATX, and now new formats are emerging. These specify the form factor and the standard connectors, and where they are located. This enables any motherboard to be used with any case, an

  • by dbirchall ( 191839 ) on Thursday January 29, 2004 @12:46PM (#8124529) Journal
    I bought a (NEC/)Mitsubishi DiamondPoint NM56LCD panel last July at OfficeMax. 15", D-SUB _and_ DVI inputs. Why that one, instead of some cheaper Planar POS at WalMerde?

    Well, I needed the D-SUB, 'cos I knew I'd be hooking it to an iBook, and all consumer-market Apple products come with VGA out, rather than DVI.

    But I also planned to use it as the second monitor on a Power Mac G5 months later - and all current professional-market Apple products come with DVI out (at least - the Radeon 9800 in my G5 has an ADC connector for an Apple Cinema Display, and a DVI out for whatever else I want).

    Folks who say it's "high-end" are pretty much right. It's something the UXGA (1600x1200) and WUXGA (1920x1200 like my Cinema) folks have a lot more use for than the 1280x1024 folks. Right now, that's largely the pro market still.

    When I can walk into WalMerde and see even a single DVI connector, then I'll know it's achieving mass-market penetration.

    • Does DVI-D even support WUXGA? I vaguely recall the DVI spec ending at 1600x1200, which is (in theory) one reason Apple is going with ADC.
      • According to http://www.ddwg.org/dvi.html single-link DVI supports up to 1920x1080 @ 60Hz; dual-link DVI supports up to 2048x1536 @ 60Hz. Since WUXGA is 1920x1200, the non-Apple monitors that support it (Sony has a 23" one, same panel as Apple's, for about $100 or so more with more inputs) must be using dual-link DVI, I guess. Of course if you want to run an IBM T220, you'll need some other connection entirely... ;)
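
        Whether WUXGA actually needs dual-link comes down to how much blanking the timing carries, since the required pixel clock is just total pixels times refresh rate measured against the 165 MHz-per-link ceiling. A quick sketch, using the commonly published CVT-style totals for 1920x1200 @ 60 Hz (these exact figures are assumptions for illustration, not from the DVI spec):

          # 1920x1200 @ 60 Hz: single link or dual link?
          SINGLE_LINK_MAX_MHZ = 165.0

          timings = {
              "conventional blanking": (2592, 1245),  # CVT-style totals
              "reduced blanking":      (2080, 1235),  # CVT-RB-style totals
          }

          for name, (htotal, vtotal) in timings.items():
              clock_mhz = htotal * vtotal * 60 / 1e6
              link = "single link" if clock_mhz <= SINGLE_LINK_MAX_MHZ else "dual link"
              print(f"1920x1200 @ 60 Hz, {name}: ~{clock_mhz:.0f} MHz -> {link}")

        With CRT-style blanking that's roughly 194 MHz, which does need dual-link; with reduced blanking it drops to about 154 MHz and squeaks onto a single link, so a WUXGA panel doesn't necessarily have to be dual-link.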
  • As has already been mentioned, dual inputs for DVI and D-SUB are becoming common on the better LCD panels. In many cases having D-SUB as an option allows for solutions to problems that are good enough and a lot less expensive.

    Whenever I get around to buying an LCD monitor it will definitely have both, but that won't be until I can buy an LCD that is larger than my 19" CRT for less than I paid (about $400 at the time).
  • dsub is cheaper (Score:3, Insightful)

    by beegle ( 9689 ) on Thursday January 29, 2004 @01:03PM (#8124725) Homepage
    DVI connectors, cables, and designs still cost more than VGA -- both for the consumer and for the producer. On top of that, there's a chicken-and-egg problem where people need DVI to VGA converters for compatibility with existing equipment.

    If my video card and your video card have the same chipset, but you use VGA and I use DVI, my card'll cost a little more. If I include a DVI to VGA adapter, it'll cost even more. Since our cards have the same chips, most people will buy your card.
  • Where are the KVMs? (Score:3, Interesting)

    by h3 ( 27424 ) on Thursday January 29, 2004 @01:23PM (#8124921) Homepage Journal
    What I wanna know is where are the DVI-based KVM switches? I was recently in the market for one and couldn't find any. A 4-port DVI+USB would've been my ideal, but alas such a thing doesn't exist afaik and that's why I'm still using VGA.

    -h3

    • Belkin makes a model [belkin.com], but it ain't cheap. MSRP is $325, and it comes with no cables. The cables are $80 each. So if you want to connect 4 PCs, that's a whopping $645.
      • Ouch! I should've been more specific, as I certainly couldn't afford to spend that on my home systems. In fact, that would've cost *more* than 2 of the systems I'd have connected to it :p.

        I wound up with a VGA+USB one from Iogear for about $110 including cables...

        -h3
    • StarTech [startech.com] may have what you're looking for. DVI, USB, dual-display (analog), etc.

      This has not been a paid advertisement for any company.

    • Newegg [newegg.com] has a couple. Unfortunately they don't switch the sound like my current Belkin model does.

      My new 20" LCD monitor should be here today and it will have a DVI input. I'm debating the purchase of a DVI KVM because I run dual monitors and switch both of them with two KVMs. I'm going to try out the new beast without DVI and see how it looks before I drop the cash for a new KVM.
  • by Gadzinka ( 256729 ) <rrw@hell.pl> on Thursday January 29, 2004 @01:25PM (#8124937) Journal
    Why has DVI been a relative failure in the market?

    I was under the impression that the specs for the digital part of the DVI interface didn't let it show, e.g., 1600x1200 resolution at any sensible refresh rate. I distantly recall reading some years ago about plans for some sort of HR-DVI that would address this issue, but never heard about it again.

    Could someone who knows the exact specs correct me?

    Robert
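
    For what it's worth, the arithmetic suggests 1600x1200 does fit, but only just, and only at 60 Hz. A back-of-the-envelope check, assuming the standard VESA totals for this mode family (2160 x 1250 including blanking) and the 165 MHz single-link pixel-clock ceiling:

      # 1600x1200 over single-link DVI at various refresh rates.
      SINGLE_LINK_MAX_MHZ = 165.0
      HTOTAL, VTOTAL = 2160, 1250  # standard VESA totals for 1600x1200

      for hz in (60, 75, 85):
          clock_mhz = HTOTAL * VTOTAL * hz / 1e6
          verdict = "fits" if clock_mhz <= SINGLE_LINK_MAX_MHZ else "exceeds"
          print(f"1600x1200 @ {hz} Hz: ~{clock_mhz:.0f} MHz ({verdict} single-link DVI)")

    That's about 162 MHz at 60 Hz (just under the limit), but over 200 MHz at CRT-style refresh rates, which is probably where the "no sensible refresh rate" impression comes from; an LCD only needs 60 Hz, and dual-link DVI doubles the headroom beyond that.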
  • Apparently HM Customs in the UK are about to increase import duty on DVI monitors, thus further pushing up the price.
  • by Anonymous Coward on Thursday January 29, 2004 @03:21PM (#8126374)
    To see where the industry is going, take a look at Apple. The technologies Apple uses today will be the mainstream technologies a few years down the road in the PC universe.

    Cases in point: the mouse, the graphical user interface, 32-bit processors, color displays (8-bit), true-color displays (24-bit+), CD drives in every computer, USB.

    Apple didn't invent any of it, they were just one of the earliest adopters. But these technologies are now used in almost all PCs you can buy today.

    Apple today: digital-only display connectors (DVI, mini-DVI, ADC) on pro systems, CD-R/DVD drives (every system but one), 64-bit processors (Power Mac/Xserve lines), wireless networking...

    The smart bet is that all these technologies will be commonplace in the PC industry one day.
    • Also... (Score:3, Informative)

      by sbszine ( 633428 )
      Great post. Add FireWire and widescreen displays to that list.
    • Yeah, that's pretty easy to do when you control the hardware, the software and the market for both.

      In the PC world it's somewhat harder to include some piece of hardware that hardly anyone's going to use when your competitor can leave it out and charge $50 less for effectively the same machine.

  • by Tom7 ( 102298 )
    most LCDs still have only an analog input

    What? An LCD with analog input is entirely missing the point. I know they exist, but I don't think they're the majority.
    • Go take a look at LCDs on the market. Especially in the low end, the vast majority have only VGA input. It's exceedingly retarded, but whatchagonnado.
    • What? An LCD with analog input is entirely missing the point. I know they exist, but I don't think they're the majority.

      I wouldn't know about the majority, but the LCD I bought a few months ago has only an analogue input. This is a Philips 150B4, 15" at 1024*768 - maybe not exactly high end, but it fits nicely on the desk. The model above in that range, a 17", does have a DVI input.

      The annoying thing is that the graphics card in my main machine, an ATi Radeon something-or-other, has a DVI output (only),

  • I suspect the market for DVI/HDMI will be driven not by PCs but at first by HDTV monitors. My Panasonic 42" plasma has both D-Sub and DVI connectors. DVD players with DVI/HDMI outputs are becoming more numerous. It won't be long before PC manufacturers feel the need to catch up in the name of 'convergence'.
  • There is a huge installed base of working VGA monitors with the 15-pin sub-D connection. My Multiscan 500PS is a big old expensive beautiful monitor and it's not going anywhere. As it happens my new-ish video card has VGA, DVI and S-Video and can output on any two at a time. I have my Sony monitor and a Philips TV hooked up at the moment. It also comes with a DVI-to-VGA adapter, which was used to try and debug what turned out to be a documentation error.

    At work I'm getting dual-head cards in all the new

  • DVD playback (Score:5, Interesting)

    by Yrd ( 253300 ) on Thursday January 29, 2004 @06:25PM (#8128603) Homepage
    I know this is nuts, but Macrovision-protected DVDs don't play on my Windows box when my LCD is plugged into the DVI output on my graphics card.

    Don't you just love Microsoft? The problem, they say, is some failure to initialise analogue copy protection. I assume a Mac will play the same disc over a digital monitor line, so all we can say here is that Windows is poo. But until that kind of thing works, DVI isn't going to work for the mass market.

    Or is there just something horribly horribly wrong with my system?

    Not that it's much of a problem, I just watch those DVDs with Xine :-)
