Why is the HDMI->DVI image sharper than VGA?

20

4

I have a Dell U2312HM monitor connected to a Dell Latitude E7440 laptop. When I connect them via laptop -> HDMI cable -> HDMI-DVI adaptor -> monitor (the monitor doesn't have an HDMI socket), the image is much sharper than with laptop -> miniDisplayPort-VGA adaptor -> VGA cable -> monitor. The difference is difficult to capture with a camera, but see my attempt at it below. I tried playing with the brightness, contrast and sharpness settings, but I can't get the same image quality. The resolution is 1920x1080, and I'm using Ubuntu 14.04.

VGA:

[photo of the screen connected via VGA]

HDMI:

[photo of the screen connected via HDMI]

Why is the quality different? Is it intrinsic to these standards or should I suspect a faulty VGA cable or mDP-VGA adaptor?

alkamid

Posted 2015-04-10T09:17:20.327

Reputation: 329

6

You're probably playing with the wrong settings. Use a program (or web site) that can generate a test pattern and adjust the clock and phase settings.

– David Schwartz – 2015-04-10T09:52:02.930

11

Keep in mind that both HDMI and DVI utilize the same signal/protocol (TMDS), so the first option is bit-perfect (lossless).

– None – 2015-04-10T16:06:20.777

Obvious solution: get an HDMI monitor – Steven Penny – 2015-04-11T01:09:25.543

Most modern VGA outputs are lowish-quality. That’s because nobody cares anymore. Oh, and you did try the Auto button when using VGA, right? – Daniel B – 2015-04-12T00:07:30.050

To me, the VGA sample is, if anything, slightly sharper than the HDMI sample, so your whole point fails. There is of course a substantial difference in brightness between the samples. – kreemoweet – 2015-04-12T01:26:43.270

1

@StevenPenny - How would that help? He already has a sharper image using an HDMI-DVI adapter. As the monitor already has a DVI input, why would an HDMI input be necessary? Please expand the reasoning behind your comment. – Greenonline – 2015-04-12T06:53:36.770

Answers

44

VGA is the only analog signal among the ones mentioned above, so that by itself already explains the difference. Using the adapter can worsen your situation further.
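
To picture the analog round trip, here is a minimal sketch in Python (the noise figure and the crude quantisation are assumptions for illustration, not the monitor's actual circuitry): the laptop's DAC turns each 8-bit channel value into a voltage in the 0 to 0.7 V range, the cable adds noise, and the monitor's ADC quantises it back, whereas the digital path simply passes the bits through unchanged.

```python
import random

def vga_round_trip(value, noise_mv=5.0):
    """Illustrative only: 8-bit level -> 0..0.7 V -> noisy cable -> back to 8 bits."""
    volts = value / 255 * 0.7                          # DAC: digital level to analog voltage
    volts += random.gauss(0, noise_mv / 1000)          # noise picked up along the cable
    return max(0, min(255, round(volts / 0.7 * 255)))  # ADC inside the monitor

def dvi_round_trip(value):
    """Digital link: the bits arrive exactly as sent (link errors ignored here)."""
    return value

levels = [10, 100, 200, 255]
print([vga_round_trip(v) for v in levels])  # values wander a little on every run
print([dvi_round_trip(v) for v in levels])  # bit-perfect
```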

Some further reading: http://www.digitaltrends.com/computing/hdmi-vs-dvi-vs-displayport-vs-vga/

Máté Juhász

Posted 2015-04-10T09:17:20.327

Reputation: 16 807

2

One big point to make here is that almost all VGA cables are trash. You can get relatively decent image quality with VGA, but the cable would need to be about 3x thicker than what they usually bundle with displays/TVs. – analytik – 2015-04-13T06:18:41.937

@analytik I will just go ahead and say that you are almost wrong. For the cable to be thicker, there must be a cable first... They just ship the TVs and (sometimes) the screens with only the power cable, and even then you are already a lucky person. If they could, they would sell them without any cable! – Ismael Miguel – 2015-04-13T09:26:10.913

@IsmaelMiguel well, I guess that depends on where you live, so we both are wrong in a way. For all the CRT and LCD screens I've bought and sold in 3 different European countries, the habit is to package a D-SUB cable, even when the screen supports DVI/HDMI/DisplayPort. Although I admit I haven't sold many LCD TVs, so the custom there might be different. – analytik – 2015-04-17T08:00:52.987

@analytik I live in Portugal and usually the CRT screens had the cable attached to them and it was rare to see a screen with a detachable cable. Sometimes, the LCD screens come with a very low-quality VGA cable. But LCD TVs and Plasma TVs and others only come with the power cable/adapter. Sometimes, the power cable comes attached and can't be removed. But that's rare. – Ismael Miguel – 2015-04-17T08:31:21.923

@IsmaelMiguel, you're right - my friend who worked in IT retail said that LCD screens usually come with 1 or 2 cables (D-Sub/DVI), but TVs almost never do. – analytik – 2015-04-17T11:04:49.847

@analytik I've never seen an LCD come with a DVI cable. But it would be great, since they are damn expensive! What I see a lot is DVD players with SCART or component cables, and sometimes HDMI. – Ismael Miguel – 2015-04-17T11:29:00.240

12

Assuming brightness, contrast and sharpness are the same in both cases, there could be two other reasons why text is sharper with DVI/HDMI:

The first has already been stated: VGA is analog, so it will need to go through an analog-to-digital conversion inside the monitor, which will theoretically degrade image quality.

Secondly, assuming you are using Windows, there is a technique called ClearType (developed by Microsoft) which improves the appearance of text by manipulating the sub-pixels of an LCD monitor. VGA was developed with CRT monitors in mind, where the notion of a sub-pixel is not the same. Because ClearType requires an LCD screen, and because the VGA standard doesn't tell the host the specifications of the display, ClearType would be disabled with a VGA connection.
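
As a toy illustration of the sub-pixel idea (the stripe order and values below are assumptions for the example, not how ClearType is actually implemented), treating each pixel's R, G and B stripes as separately addressable columns gives three times as many horizontal positions at which an edge can land:

```python
# Toy sketch of sub-pixel positioning on an LCD with R,G,B stripes.
# Whole-pixel rendering can only place an edge on a pixel boundary;
# addressing individual stripes allows placements in steps of 1/3 pixel.

def whole_pixel_edge(width, edge_px):
    """Black-to-white edge placed on a whole-pixel boundary."""
    return [(0, 0, 0) if x < edge_px else (255, 255, 255) for x in range(width)]

def subpixel_edge(width, edge_subpx):
    """Same edge, positioned in units of 1/3 pixel by lighting individual stripes."""
    row = []
    for x in range(width):
        stripes = [0 if 3 * x + i < edge_subpx else 255 for i in range(3)]  # R, G, B
        row.append(tuple(stripes))
    return row

print(whole_pixel_edge(4, 2))  # edge can only fall at x = 0, 1, 2, ...
print(subpixel_edge(4, 7))     # edge falls 2 1/3 pixels in: one pixel is only partly lit
```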

Source: I remember hearing about ClearType from one of its creators on a podcast for This().Developers().Life(), IIRC, but http://en.wikipedia.org/wiki/ClearType also supports my theory. Also, HDMI is backward compatible with DVI, and DVI supports Extended Display Identification Data (EDID).

youngwt

Posted 2015-04-10T09:17:20.327

Reputation: 129

4"...and DVI supports Electronic Display Identification (EDID)" so does VGA. I couldn't find out whether there is a field in EDID which actually identifies the display type though; do you have this information? – Random832 – 2015-04-10T14:15:32.647

7

The others make some good points, but the main reason is an obvious clock and phase mismatch. VGA is analog and is subject to interference and to mismatches between the analog sending and receiving sides. Normally one would use a pattern like this:

http://www.lagom.nl/lcd-test/clock_phase.php

And adjust the clock and phase of the monitor to get the best match and the sharpest picture. However, since the signal is analog, these adjustments may drift over time, so ideally you should just use a digital signal.
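
A rough sketch of why the phase matters (a crude linear model with made-up values, not how a real ADC behaves): if the monitor samples the analog waveform half a pixel off, neighbouring pixels get averaged together and fine detail turns to grey.

```python
# Crude model: the analog voltage ramps linearly between neighbouring pixel
# values, and the monitor's ADC samples it at some phase offset.

def analog_waveform(pixels):
    def v(t):                       # t measured in pixel periods
        i = min(int(t), len(pixels) - 2)
        frac = t - i
        return pixels[i] * (1 - frac) + pixels[i + 1] * frac
    return v

pixels = [0, 255, 0, 255, 0, 255]   # alternating black/white columns
v = analog_waveform(pixels)

on_centre = [round(v(i)) for i in range(5)]        # correct phase
half_off  = [round(v(i + 0.5)) for i in range(5)]  # half-pixel phase error

print(on_centre)  # [0, 255, 0, 255, 0] -- crisp columns
print(half_off)   # [128, 128, 128, 128, 128] -- everything smears to grey
```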

Jarrod Christman

Posted 2015-04-10T09:17:20.327

Reputation: 216

Nice link!! :-) – Peque – 2015-05-04T14:20:47.793

5

There are a few answers pointing out digital vs. analog, which is correct. But that does not answer the why. A few mentioned translation layers, and this is sort of true too: a mainstream A/D conversion can cause a loss in fidelity, but you'd have to measure it, as it is hard to see the differences with the naked eye. With a cheap conversion, all bets are off.

So why is digital better than analog?

An analog RGB signal (such as VGA) encodes the picture in the amplitude of the signal (0.7 volts peak-to-peak in the case of VGA). Like all signals, it picks up noise which, if large enough, will cause the levels to be translated incorrectly.

Reflections in the cable (impedance mismatches) are actually the biggest downfall of an analog video signal. They introduce additional noise and get worse with longer (or cheaper) cables, and higher video resolutions make the problem worse. Interestingly, you should not be able to see any difference in an 800x600 signal unless the VGA cable is too long.

How does a digital signal avoid those pitfalls? Well, for one, the exact level is no longer relevant: the receiver only has to decide between a one and a zero. DVI-D/HDMI also uses differential signalling, along with a transition-minimised, DC-balanced encoding (TMDS), to make sure the ones and zeros arrive intact. There is also additional conditioning applied to a digital signal that would not be practical to add to an analog video signal.
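
A simplified sketch of the difference (illustrative voltages only; real TMDS is far more involved than this): noise directly shifts a received analog level, while a differential receiver only looks at the difference between its two wires, so noise that hits both wires equally cancels out and the bit is recovered exactly.

```python
import random

def receive_analog(level_volts, noise_mv=20.0):
    """Single-ended analog: any noise on the wire shifts the received level."""
    return level_volts + random.gauss(0, noise_mv / 1000)

def receive_differential_bit(bit, noise_mv=20.0):
    """Differential pair: common-mode noise hits both wires and cancels in the difference."""
    d_plus, d_minus = (0.5, 0.0) if bit else (0.0, 0.5)
    common = random.gauss(0, noise_mv / 1000)
    return 1 if (d_plus + common) - (d_minus + common) > 0 else 0

print(receive_analog(0.35))                                # 0.35 V plus whatever noise crept in
print([receive_differential_bit(b) for b in [1, 0, 1, 1]]) # the bits come back exactly
```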

Sorry for the soap box guys, but thems the facts.

Zameru

Posted 2015-04-10T09:17:20.327

Reputation: 51

3

Another issue is that a lot of VGA cables are junk. If the VGA cable is less than 1/4" thick, you will probably notice ghosting on larger monitors (the higher the resolution, the more likely the ghosting). I've even noticed ghosting on the attached VGA cable of some 19" CRT monitors. My VGA cables are about 1/3" thick and it really helps with the sharpness (thicker wires, more shielding).
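
A toy model of that ghosting (the delay and reflection strength are made-up numbers, not measurements): an impedance mismatch reflects part of the signal, which arrives slightly late and shows up as a faint, offset copy of every sharp edge.

```python
# Toy sketch: a poorly matched cable reflects a fraction of the signal,
# which lands a couple of pixels later as a dim echo ("ghost") of each edge.

def add_ghost(scanline, delay_px=2, strength=0.2):
    out = []
    for x, v in enumerate(scanline):
        ghost = scanline[x - delay_px] if x >= delay_px else 0
        out.append(round(min(255, v + strength * ghost)))
    return out

scanline = [0, 0, 255, 255, 0, 0, 0, 0]  # one bright bar on a dark background
print(add_ghost(scanline))               # [0, 0, 255, 255, 51, 51, 0, 0] -- a dim echo trails the bar
```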

matt

Posted 2015-04-10T09:17:20.327

Reputation: 31

1

You can't really tell anything for sure from the diameter of the cable. Heavier wires and more shielding could well be bigger, or bigger could mean just thicker insulation, or smaller could be better by using thinner but effective shielding and thinner insulation. – fixer1234 – 2015-04-11T01:45:40.003

2

HDMI and DVI are very similar. Both are digital; the main differences are that HDMI supports audio and HDCP. When switching between HDMI and DVI, there is little actual conversion; it is mostly a matter of connecting the matching pins on the connectors. DisplayPort is a digital protocol, but VGA is analog, and the converter likely introduces small flaws and a reduction in sharpness into the signal. The monitor probably converts the signal back to digital, incurring another conversion that adversely affects the quality.

Overall, computer->HDMI->DVI->monitor has fewer actual conversions than computer->DisplayPort->VGA->monitor and never converts the signal to analog, giving a sharper image.

Josh The Geek

Posted 2015-04-10T09:17:20.327

Reputation: 223

You can have HDCP with DVI, too. – Carsten S – 2015-04-12T12:27:15.513

1

HDMI and DVI are actually one and the same (at least in your context). DVI connectors are really just HDMI connectors with a different pinout (the rows of pins), plus VGA connectors with a different pinout (the pins arranged around a cross shape). If you look at your HDMI-to-DVI converter, you'll probably notice that the cross-shaped part is missing.

So you are comparing an HDMI image with a VGA image, and, as @Josh pointed out, one that involves an actual conversion.

There actually is a difference between HDMI and DVI, but it's not relevant to you. HDMI can carry additional signals, including the encryption signal for content protection, sound, Ethernet, etc.

Kevin Keane

Posted 2015-04-10T09:17:20.327

Reputation: 419