Why Hasn't the DVI Interface Replaced D-Sub?
nic1m asks: "When DVI connectors started appearing on video cards I thought they were a smart replacement for the old D-Sub analog connector because DVI can support both digital and analog displays. With LCDs rapidly gaining market share I would have expected DVI to replace D-Sub by now. Almost the opposite seems to be happening, however. Many video cards still lack DVI, most LCDs still have only an analog input, and motherboard-based graphics never have DVI. Why has DVI been a relative failure in the market?"
Simple (Score:5, Informative)
Most good Flat Panel displays (Hitachi, Sony, etc 17" and up) do support DVI - but DVI on Analog CRTs doesn't make much sense.
Re:Simple (Score:4, Interesting)
Part of the problem is that, in fact, many LCDs do NOT come with DVI connections. You say "most good flat panel displays" do, though that's not quite accurate. You mention Sony 17" and up... well, this 17" Sony doesn't use DVI [sonystyle.com], nor does this 19" Sony [sonystyle.com]. Or how about this 24" Samsung [samsung.com], which includes connections for D-Sub, S-Video, RCA, Component (x2), Coax, and Scart (but no DVI) and will set you back $3-4k.
The fact is, contrary to popular belief, the majority of LCDs still do not come with DVI, whether budget or high-end. I learned this over Christmas when I had to shop for an LCD for my mother. Sadly, if you want a DVI connection, you often pay MORE than for the identical model that uses a D-Sub connection.
Which brings us back to the original post... WHY is this? Doesn't DVI on a video card or LCD mean the hardware can skip a DAC, which you would think would cut costs? Not to mention that DVI delivers better quality to an LCD than D-Sub does... you would think monitor manufacturers, at least, would jump at making their hardware perform better while saving money.
Hopefully someone will have some insightful knowledge to clue us in on this seemingly backwards situation.
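To make the cost and quality argument concrete, here's a rough sketch of the two signal chains being compared. This is my own illustration, not anything from a spec; the point is that an analog-input LCD converts the image twice, while DVI keeps it digital end to end.

```python
# Sketch of the two signal chains (my own illustration): the DAC and
# ADC stages are the extra hardware, extra cost, and extra chances
# for the image to degrade.
analog_path = [
    "framebuffer (digital)",
    "RAMDAC on video card (digital -> analog)",
    "VGA cable (analog)",
    "ADC in the LCD (analog -> digital)",
    "panel (digital)",
]
dvi_path = [
    "framebuffer (digital)",
    "TMDS encoder (digital)",
    "DVI cable (digital)",
    "TMDS decoder (digital)",
    "panel (digital)",
]

def conversions(path):
    """Count the digital<->analog conversion stages in a path."""
    return sum("->" in stage for stage in path)

print(conversions(analog_path))  # 2 conversions for the D-Sub LCD
print(conversions(dvi_path))     # 0 conversions for the DVI LCD
```

Each conversion stage is both a chip that costs money and a place where the image can pick up noise, which is exactly the poster's puzzle: the all-digital path should be cheaper *and* look better.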
Re:Simple (Score:2)
Re:Simple (Score:2)
Re:Simple (Score:1)
Re:Simple (Score:2)
Re:Simple (Score:1)
The bottled water your parent is talking about, however, is the bottled distilled water you buy at the supermarket.
I believe it comes in 10 or 20 L bottles, and they're cheap... much cheaper than gasoline by volume.
Re:Simple (Score:3, Insightful)
It's the monitors that need backward compatibility. Unless I'm
Re:Simple (Score:2)
To solve that problem, you have to have both a digital and an analog input circuit in the monitor.
Re:Simple (Score:3, Informative)
Re:Simple (Score:5, Informative)
Wrong. You've been misled. DVI at half bandwidth can travel 30 feet on a good cable (half bandwidth is currently how computer data and HDTV are transmitted via DVI); DVI at full bandwidth travels 15 feet. And a limit of 1024x768? Not true at all. I'm running two 30' DVI cables: one to a projector carrying 720p (1280x720) and one to an LCD display at 1280x1024.
DVIgear.com's salesmen are good, aren't they?
Re:Simple (Score:3, Informative)
Picture looks great (and a pretty cool setup too as long as you don't need to change CDs...)
Re:Simple (Score:3, Interesting)
Re:Simple (Score:2)
Re:Simple (Score:2, Informative)
I think it does. The DVI spec allows for analog signals; check the pinouts for the DVI connector. Apple only has DVI/ADC, and they include a VGA adapter for the ADC. I also own a DVI-to-VGA adapter and it works just fine.
DVI should replace VGA.
(ADC is an Apple-only connector that is basically DVI with USB and power pins included.)
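For reference, the DVI spec defines several connector variants, and only the ones that carry the analog pins work with a cheap passive VGA adapter. A small sketch (the variant list reflects the DVI 1.0 spec as I understand it; the helper function is my own illustration):

```python
# DVI connector variants: (number of digital TMDS links, carries analog RGB?).
# DVI-I carries analog on the four pins around the flat blade, which is
# what a passive DVI-to-VGA adapter reroutes.
DVI_VARIANTS = {
    "DVI-D single link": (1, False),
    "DVI-D dual link":   (2, False),
    "DVI-I single link": (1, True),
    "DVI-I dual link":   (2, True),
    "DVI-A":             (0, True),   # analog only
}

def passive_vga_adapter_works(variant):
    """A passive adapter just reroutes the analog pins; it cannot
    convert TMDS to analog, so the port must be DVI-I or DVI-A."""
    _links, has_analog = DVI_VARIANTS[variant]
    return has_analog

print(passive_vga_adapter_works("DVI-I single link"))  # True
print(passive_vga_adapter_works("DVI-D single link"))  # False
```

This is why the bundled DVI-to-VGA dongles work at all: video cards ship DVI-I ports, so the RAMDAC output is already on the connector and the dongle costs almost nothing.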
Re:Simple (Score:2)
Re:Simple - 6' on Analog high res... (Score:2)
Dell has not been totally NON-DVI (Score:3, Insightful)
So I guess the question should be: why has DVI been so slow to penetrate the low-end/mainstream/low-cost market? I imagine the DVI connector is a more complicated part and more expensive to produce, especially if you're like Dell and using weird connectors that require the extra expense of dongles. Additionally, if all you include is a DVI connector, you have to bundle a VGA adapter, another item that slowly pecks away at the bottom line.
Let's face it, for most hardware manufacturers, what's cheaper than simply using the old tried-and-true D-sub VGA connector?
Re:Dell has not been totally NON-DVI (Score:4, Informative)
Manufacturers are catering to the lowest common denominator: the "good enough" theory in action. This is also why the market is being flooded with cheap 42" plasma displays that have only 864x480 (i.e., non-HDTV) resolution, often without DVI. Most people just want the sexy thin screen, but don't care, or don't realize how low the resolution is and what they're missing out on.
DVI isn't necessarily in the domain of high-end, but you have to look a little harder to get it.
Re:Dell has not been totally NON-DVI (Score:2)
Most people probably aren't missing out on anything. I recently bought a 42" HD monitor, but I don't have any HD video to watch on it. I mostly bought it because I mainly use my TV for watching movies and I prefer 16:9. I doubt I would notice much difference between 1920x1080 and 864x480 in my typical usage, except perhaps that my SD cable signal might not look quite as crappy.
Re:Dell has not been totally NON-DVI (Score:2)
Definitely. I have two of their 20" LCD monitors on my desk at work. They're pretty nice displays: bright and crisp. I like the Samsung I have at home a little bit more because it has slightly better contrast, but the Dells are quite nice. The ones we have at work have 4 (!) inputs: DVI, D-Sub, S-video, and composite. It looks great with the DVI input but is a bit fuzzy if you use the D-Sub input, even at the same resolution.
That's odd... (Score:2)
When I bought it about a year ago, I paid $530ish for it.
The cheapest non-Dell 17" LCD with DVI I could find was over $600.
So I paid $70 less, got DVI, and got an extra inch of diagonal...
HDMI (Score:2, Informative)
http://www.hdmi.com
Re:HDMI (Score:3, Informative)
HDMI = DRM = Patented != Fair Use Right (Score:3, Funny)
"Windows has detected you want to connect a high resolution display to your computer. Your current Windows license doesn't allow displays over 1024x768. If you wish to upgrade, please insert appendage you wish to pay with..."
Re:HDMI = DRM = Patented != Fair Use Right (Score:2)
Sorry, I don't think this is an issue, because it is already possible to prevent non-DRM users from setting their display resolution above a certain size. I mean, you do have to configure Windows to set the display resolution; if they don't want to support that resolution, it doesn't have to be available.
Re:HDMI = DRM = Patented != Fair Use Right (Score:1)
But I do see how HDMI could enable an unscrupulous scumbag^H^H^H^H^H^H^H company to sell you a high-res screen, then sell you "licenses" for the different resolutions. Kinda like IBM does with crippled SMP mainframes. That kind of business is dishonest in my book.
Kinda like buying a V8 car with four plug wires unplugged and a locked hood. The dealer demands "licenses" to connect the other four and threatens you with the DMCA if you try.
So does DVI (Score:2)
Re:So does DVI (Score:2)
Re:HDMI = DRM = Patented != Fair Use Right (Score:1)
DVI does NOT. HDCP, which is on top of DVI, does.
Analog good enough for now (Score:3, Insightful)
Re:Analog good enough for now (Score:2)
People are cheap (Score:1)
Re:People are cheap (Score:1)
Re:People are cheap (Score:1)
Because people are cheap and don't care (Score:4, Insightful)
Even 10 years ago, higher-end monitors and cards came with BNC connectors. Why? Because the VGA connector isn't meant to handle high-res graphics: you start getting crosstalk between pins, and that shows up as visual artifacts. IBM designed the standard for 640x480x256 colors. It wasn't meant to scale to high-res 32-bit color.
Consumers, however, won't spend an extra $50-$100 to get the better visuals. I'm all DVI, and the quality difference is substantial: sharper text, no ghosting, more vivid colors. Generally easier installs, too.
At any rate, you're dealing with a consumer base that chose VHS, wouldn't spend for SCSI, and won't spring for a Mac. They don't know the difference, and they don't _want_ to know the difference. And the marketplace has responded.
Jonathan
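A back-of-the-envelope calculation supports the crosstalk point. The numbers below are approximations of my own: the ~30% blanking overhead and the half-pixel-clock rule of thumb for analog video bandwidth are assumptions, not exact VESA timings.

```python
def vga_bandwidth_mhz(width, height, refresh_hz, blanking=1.3):
    """Approximate analog bandwidth each RGB line must carry, in MHz.
    Pixel clock ~ active pixels * refresh * blanking overhead;
    analog bandwidth ~ half the pixel clock (rule of thumb)."""
    pixel_clock = width * height * refresh_hz * blanking
    return pixel_clock / 2 / 1e6

# The mode VGA was designed around vs. a high-res CRT mode:
print(f"640x480@60:   ~{vga_bandwidth_mhz(640, 480, 60):.0f} MHz")
print(f"1600x1200@85: ~{vga_bandwidth_mhz(1600, 1200, 85):.0f} MHz")
```

Roughly 12 MHz versus over 100 MHz per color channel: at the high end, pin-to-pin crosstalk and impedance mismatches in the little HD-15 connector start to show, which is why high-end gear moved each channel onto its own shielded BNC coax.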
Justify? (Score:2)
Don't be such a snob. I'm sure you're someone who can hear the difference between FLAC and WAV.
PS: Your post carries no extra weight if you 'sign' your name to it.
Re:Justify? (Score:2)
It isn't a status thing - it is a quality thing. Just because you can't see it doesn't mean that it doesn't exist.
Jonathan
Re:Because people are cheap and don't care (Score:2)
But if you are talking D-Sub vs. DVI on an LCD, you could easily tell the difference. And on my nice Sony 22-inch monitor, the picture is far better than on a 19-inch NEC LCD.
You really need to see an LCD/monitor in action. Plus, higher-end LCDs have 3 connections, 2 DVI and 1 D-Sub, which really rocks if you do
People are conservative ... (Score:2)
Why would I want to throw out the monitor or buy an overpriced adaptor just so I could use some new-fangled connector?
D-Sub will continue until the last D-Sub-using monitor is junked. Because, after all, there is no real reason to junk it; the part's trivial in cost, and people are used to it. Getting rid of it is just one more reason for people to decide not to buy what
Re:People are conservative ... (Score:1)
I've always since said 'fuck' to p
Re:People are conservative ... (Score:2)
and now we have ADC... DVI+power+USB.... oooohhh!!!!
happily, an ADC -> DVI adapter is only about $15 (even if DVI -> ADC adapters are hella expensive)
Re:People are conservative ... (Score:2)
It's quick and simple to convert VGA or DVI to ADC... [the other way around is tough, though, due to Apple using USB to control the monitors] There's no reason for everybody not to use the best and sell a $10 retail dongle to "dumb" it down.
Patents (Score:2)
There's no reason for everybody not to use the best and sell a $10 retail dongle to "dumb" it down
Other than patents that Apple has every right to refuse to license?
Re:Patents (Score:2)
You may be right that they don't want to give up their "edge" for Macs... but I thought that the original DVI implementation spec supported the USB uplink [i.e. Apple's version].
Re:People are conservative ... (Score:2)
I think its too soon to be crying out its dead (Score:4, Insightful)
My video card has a DVI connector and a standard D-Sub on it, but my 19" LG has no DVI connection, so I have yet to use it. Until displays start featuring DVI, there's not much point having it on the computer; likewise, there's not much point making displays that use DVI without many systems supporting it. I would almost say that each one is waiting on the other
Also, you need to look at the upgrade cycle. Not everyone is a computer geek; not everyone has the latest graphics cards and computer gear. When new technology gets released, it takes a while to penetrate and become commonplace.
Re:I think its too soon to be crying out its dead (Score:2)
Plus, the reason for the lack of DVI inputs on most models is simple (and it's quite strange that a Hungarian-born guy has to tell this to the Yankees).
They sell what you buy!
If all of us started to buy panels that have DVI connectors, the manufacturers would start churning out panels with DVI...
Re:I think its too soon to be crying out its dead (Score:2)
1) See what Apple does
2) Copy it
3) Profit!
The examples are endless: the WIMP user interface, funky cases, USB for mouse and keyboard, FireWire, no floppy drives, etc., etc.
Thank god for that company. If it wasn't for them, we'd all still be using ISA cards, PS/2 connectors, and D-sub 15s!
-mike (who's never owned an Apple in his life)
Re:I think its too soon to be crying out its dead (Score:2)
PC Connector Soup (Score:4, Informative)
Why can't I buy a motherboard without a serial port, a parallel port, two PS/2 ports, and a line-in audio port? Why do motherboards come with built-in video, but not Bluetooth and wireless networking?
Why isn't there a standard for external power supplies, instead of having a blasting-hot power supply inside the temperature-sensitive case while a half-dozen wall-warts hang off my power strip driving all my peripherals?
In short, why are PC compatibles such heaps of shit?
Re:PC Connector Soup (Score:1)
It's called 'open standards' and 'not every piece designed as one unified whole by a single vendor.'
Do you carp and whine that everybody should have one of those shit VCR/TV combos, too, because of the lack of dangling signal cords?
Re:PC Connector Soup (Score:5, Insightful)
In short, why are PC compatibles such heaps of shit?
In short, because they still try to be compatible with a 20-year-old machine that was a quick hack intended to be replaced by something better... but it never was replaced, because the quick hack gained too much market share.
The funny thing is that not only was the IBM PC itself intended as an interim solution, but the processor (the 8086) was as well! Intel wanted to do something better but felt it had to react to competition, and thus quickly made the 8086 just to have something.
And then people began to build more and more storeys onto this shaky foundation (PCI, AGP, ACPI, APIC, and the most famous, the A20 gate, just to name a few extensions), and now we have an architecture so horrible, complicated, and full of unnecessary stuff that it's a real wonder most PCs run as well as they do...
I've been saying this for years: it's time to start from scratch and cut that damn backward compatibility. But Windows only runs on Intel systems; that's a problem worth another discussion. If we started from scratch and threw the 20-year-old dirt overboard, not only would computers be faster, they would also be cheaper and more reliable (because implementors wouldn't have to reproduce all those warts and bugs that some software now depends on).
Re:PC Connector Soup (Score:1)
Re:PC Connector Soup (Score:3, Insightful)
Yeah, then we'll redesign the traffic system, too. But wait: so many people rely on the traffic system working the way it does already, and the number of intersections/roads to change would be immense. I guess it's just not going to happen. Kind of like your suggestion. Backward compatibility is there for a reason: hundreds of thousands of people already rely on what's in place.
Well, UNIX is also backwards compatible, at least on the source level. And if you don't use assembler or bit-modifying C code
Re:PC Connector Soup (Score:2)
By the time Longhorn comes out most new applications will be written for
NT itself was originally designed to be highly portable, though that may have been corrupted somewhat in subsequent generations.
The transition to a different architecture will be slow but I think it will happen over the next decade or so.
Re:PC Connector Soup (Score:2)
The biggest problem is that working around all of these things only needs to be done twice. Once for Windows, once for Linux, and everybody's happy. With mass-market chipsets the way they are, they've been copy-and-pasting the
Re:PC Connector Soup (Score:2)
It's not hard at all to make a motherboard (and hence a whole PC) which is completely standards-compliant and backwards-compatible in every meaningful and useful way, but which has no legacy buses or ports. The ISA bus has been internalized on most modern motherboards for some time. Getting rid of parallel, serial, and PS/2 mouse/keyboard ports would be an easy next step. Floppy drive buses/controllers, as well as the floppy drive itself, can also take a hike. Replace it with an optional IDE LS-120 drive for backward floppy compatibility.
DosBox is your friend (Score:3, Funny)
Re:PC Connector Soup (Score:3, Informative)
You can; it's called an ABIT legacy-free motherboard [tech-report.com]
Re:PC Connector Soup (Score:2)
Answer (Score:1)
Re:PC Connector Soup (Score:3, Interesting)
Dump the serial port? Then what do I plug my external modem into? Or do you expect me to buy a new one just so your precious sensibilities aren't offended? (Just one example...)
If Intel (or Microsoft, or anyone else) were in charge of what hardware I could use, I would be royally screwed. I've got an Intel motherboard that considers USB 1.1 to be "legacy" hardware. Legacy! If Intel had its way, I couldn't upgrade.
Re:PC Connector Soup (Score:2)
Though in all honesty, I wouldn't cry if PS/2 mouse/kb ports died a horrible, ugly death...
Re:PC Connector Soup (Score:3, Insightful)
A reasonable interpretation of your parent post would be that the poster would like legacy-free boards to be available in addition to the current legacy-encumbered crap we get now. I don't think he said anything about not selling serial ports at all anymore.
Personally, I agree with him. It's been years since I last used a serial or parallel port. It's
Re:PC Connector Soup (Score:3, Funny)
You buy a used PortMaster 2 for $31 on eBay and connect 30 modems to it.
Re:PC Connector Soup (Score:2)
Well, you could buy a $5 serial port card. If you think about it, it makes more sense to do that than to give serial ports to everyone, when 99% of them are never going to need one.
With the push to make computers smaller and more attractive, getting rid of ugly oldschool connectors that almost nobody uses makes sense.
Re:PC Connector Soup (Score:2)
In short, mass production/commoditisation. Motherboards have for a long time conformed first to the AT standard, then ATX, and now new formats are emerging. These specify the form factor and the standard connectors, and where they are located. This enables any motherboard to be used with any case, an
DVI is getting there, but it's not mass-market yet (Score:3, Insightful)
Well, I needed the D-SUB, 'cos I knew I'd be hooking it to an iBook, and all consumer-market Apple products come with VGA out, rather than DVI.
But I also planned to use it as the second monitor on a Power Mac G5 months later - and all current professional-market Apple products come with DVI out (at least; the Radeon 9800 in my G5 has an ADC connector for an Apple Cinema Display, and a DVI out for whatever else I want).
Folks who say it's "high-end" are pretty much right. It's something the UXGA (1600x1200) and WUXGA (1920x1200 like my Cinema) folks have a lot more use for than the 1280x1024 folks. Right now, that's largely the pro market still.
When I can walk into WalMerde and see even a single DVI connector, then I'll know it's achieving mass-market penetration.
Re:DVI is getting there, but it's not mass-market (Score:2)
Re:DVI is getting there, but it's not mass-market (Score:2)
Re:DVI is getting there, but it's not mass-market (Score:2)
Re:DVI is getting there, but it's not mass-market (Score:2)
Re:DVI is getting there, but it's not mass-market (Score:2)
Both inputs available for a while yet. (Score:2)
Whenever I get around to buying an LCD monitor it will definitely have both, but that won't be until I can buy an LCD that is larger than my 19" CRT for less than I paid (about $400 at the time).
dsub is cheaper (Score:3, Insightful)
If my video card and your video card have the same chipset, but you use VGA and I use DVI, my card'll cost a little more. If I include a DVI to VGA adapter, it'll cost even more. Since our cards have the same chips, most people will buy your card.
Where are the KVMs? (Score:3, Interesting)
-h3
Re:Where are the KVMs? (Score:2)
Re:Where are the KVMs? (Score:2)
I wound up with a VGA+USB one from Iogear for about $110 including cables...
-h3
Re:Where are the KVMs? (Score:2)
StarTech [startech.com] may have what you're looking for. DVI, USB, dual-display (analog), etc.
This has not been a paid advertisement for any company.
Re:Where are the KVMs? (Score:2, Informative)
My new 20" LCD monitor should be here today and it will have a DVI input. I'm debating the purchase of a DVI KVM because I run dual monitors and switch both of them with two KVMs. I'm going to try out the new beast without DVI and see how it looks before I drop the cash for a new KVM.
Technical failure == market failure. (Score:4, Interesting)
I was under the impression that the spec for the digital part of the DVI interface didn't let it drive, e.g., 1600x1200 at any sensible refresh rate. I distantly recall reading, some years ago, about plans for some sort of HR-DVI that would address this issue, but never heard about it again.
Could someone who knows the exact specs correct me?
Robert
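For what it's worth, the limit being half-remembered here is real but mode-dependent. Single-link DVI tops out at a 165 MHz pixel clock (per DVI 1.0); the ~25% blanking overhead below is my rough approximation, not an exact CVT timing, but it's close enough to show where the ceiling falls:

```python
SINGLE_LINK_MHZ = 165   # DVI 1.0 single-link pixel clock ceiling
BLANKING = 1.25         # rough active-to-total pixel overhead (assumption)

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock a display mode requires, in MHz."""
    return width * height * refresh_hz * BLANKING / 1e6

for w, h, hz in [(1280, 1024, 60), (1600, 1200, 60), (1600, 1200, 85)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits single link" if clk <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz, {verdict}")
```

So 1600x1200 squeaks through at 60 Hz, which is all a fixed-pixel LCD needs, while 85 Hz (or 1920x1200 and up) requires dual-link DVI, which may be the "HR-DVI" the poster vaguely remembers.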
Re:Technical failure == market failure. (Score:2, Informative)
I don't know if the spec has been formalized for it, but most nvidia and ATI cards can support something like 100Hz 1920x1200 now.
This was really more of a problem a year or two ago, as anyone who's got a pricey Viewsonic
Re:Technical failure == market failure. (Score:1)
So exactly how often do you expect me to upgrade my video, and who's going to pay for it?
Re:Technical failure == market failure. (Score:3, Interesting)
Re:Technical failure == market failure. (Score:2)
Re:Technical failure == market failure. (Score:2)
Import Duty (Score:2)
Give it 5 more years... (Score:5, Insightful)
Cases in point: The Mouse, The Graphical User Interface, 32-bit processors, Color Displays (8-bit), True Color Displays (24-bit+), CD drives in every computer, USB.
Apple didn't invent any of it, they were just one of the earliest adopters. But these technologies are now used in almost all PCs you can buy today.
Apple Today: Digital only display connectors (DVI, mini-DVI, ADC) (pro systems), CD-R/DVD drives (every system but 1), 64-bit processors (Powermac/Xserve lines), wireless networking...
The smart bet is that all these technologies will be commonplace in the PC industry one day.
Also... (Score:3, Informative)
Re:Give it 5 more years... (Score:2)
In the PC world it's somewhat harder to include some piece of hardware that hardly anyone's going to use when your competitor can leave it out and charge $50 less for effectively the same machine.
What? (Score:2)
What? An LCD with analog input is entirely missing the point. I know they exist, but I don't think they're the majority.
Unfortunately, they are (Score:2)
Re:Unfortunately, they are (Score:2)
Re:What? (Score:2)
What? An LCD with analog input is entirely missing the point. I know they exist, but I don't think they're the majority.
I wouldn't know about the majority, but the LCD I bought a few months ago has only an analogue input. This is a Philips 150B4, 15" at 1024*768 - maybe not exactly high end, but it fits nicely on the desk. The model above in that range, a 17", does have a DVI input.
The annoying thing is that the graphics card in my main machine, an ATi Radeon something-or-other, has a DVI output (only),
Still coming (Score:1)
Probably because monitors last longer than PCs (Score:2)
At work I'm getting dual-head cards in all the new
DVD playback (Score:5, Interesting)
Don't you just love Microsoft? The problem, they say, is some failure to initialise analogue copy protection. I assume a Mac will play the same disc over a digital monitor line, so all we can say here is that Windows is poo. But until that kind of thing works, DVI isn't going to work for the mass market.
Or is there just something horribly horribly wrong with my system?
Not that it's much of a problem, I just watch those DVDs with Xine
Re:DVD playback (Score:2)
Re:DVD playback (Score:2)