A few answers have pointed out that this is digital versus analog, which is correct, but that doesn't answer the why. A few mentioned translation layers, and that's sort of true too: a mainstream A/D conversion can cause a loss of fidelity, but you'd have to measure it, since it's hard to see the difference with the naked eye. Use a cheap converter and all bets are off.
So why is digital better than analog?
An analog RGB signal (such as VGA) uses the amplitude of the signal (0.7 volts peak-to-peak in the case of VGA) to represent each colour level. Like any signal, it picks up noise, and if that noise is large enough the levels will be decoded incorrectly.
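To make that concrete, here is a minimal Python sketch of the idea. The linear 8-bit-to-0.7 V mapping and the 10 mV of noise are made-up illustrative numbers, not measurements of any real cable:

    # Toy model of one analog VGA colour channel (illustrative numbers only).
    import random

    V_MAX = 0.7          # VGA full-scale amplitude, volts peak-to-peak
    NOISE_RMS = 0.01     # assume ~10 mV of noise picked up along the way

    def encode(level):
        """Source side: map an 8-bit colour level (0-255) onto 0..0.7 V."""
        return level / 255 * V_MAX

    def decode(voltage):
        """Monitor side: map the received voltage back to an 8-bit level."""
        clamped = max(0.0, min(V_MAX, voltage))
        return round(clamped / V_MAX * 255)

    random.seed(1)
    sent = 128                                   # mid-grey
    received = decode(encode(sent) + random.gauss(0, NOISE_RMS))
    print(sent, received)   # typically off by a few counts, i.e. a slightly different shade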
Reflections in the cable (from impedance mismatches) are actually the biggest downfall of an analog video signal. They introduce additional noise and get worse with longer (or cheaper) cables. Higher video resolutions also make the problem worse, because the faster pixel clock leaves less margin for noise. Interestingly, you should not be able to see any difference in an 800x600 signal unless the VGA cable is too long.
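For a rough sense of why resolution matters, compare the time available per pixel at a few common modes. The pixel clocks below are approximate standard figures and are only meant to show the trend:

    # Higher resolution = faster pixel clock = less time per pixel, so
    # reflections and noise have less chance to settle before the next sample.
    timings_mhz = {
        "800x600@60":   40.0,    # ~40 MHz (VESA)
        "1280x1024@60": 108.0,   # ~108 MHz
        "1920x1080@60": 148.5,   # ~148.5 MHz
    }
    for mode, clock in timings_mhz.items():
        print(f"{mode}: {1000 / clock:.1f} ns per pixel")
    # 800x600 allows ~25 ns per pixel; 1080p only ~6.7 ns.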
How does a digital signal avoid those pitfalls? For one, the exact level is no longer relevant. DVI-D/HDMI also use differential signalling (TMDS) with transition-minimized, DC-balanced encoding, so the ones and zeros arrive exactly as they were sent. On top of that, a digital link can have additional signal conditioning applied that is not practical to add to an analog video signal.
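Here is the same toy noise model applied to a differential digital link. This is a deliberate simplification, not real TMDS encoding, and the 0.5 V swing is another assumed number; the point is only that the receiver decides the sign of D+ minus D-, so the same noise that shifted the analog level leaves the bits untouched:

    # Simplified differential link (not real TMDS): the receiver only checks the
    # sign of (D+ - D-), so the noise from the analog example has no effect.
    import random

    SWING = 0.5          # assumed differential swing, volts
    NOISE_RMS = 0.01     # same ~10 mV of noise as the analog example

    def send_bit(bit):
        dp = SWING / 2 if bit else -SWING / 2
        dn = -dp
        common = random.gauss(0, NOISE_RMS)      # noise coupled onto both wires cancels out
        return (dp + common + random.gauss(0, NOISE_RMS),
                dn + common + random.gauss(0, NOISE_RMS))

    def receive_bit(dp, dn):
        return 1 if dp - dn > 0 else 0

    random.seed(1)
    bits = [random.getrandbits(1) for _ in range(10_000)]
    errors = sum(receive_bit(*send_bit(b)) != b for b in bits)
    print(f"bit errors: {errors} / {len(bits)}")  # expect 0 at this noise level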
Sorry for the soapbox, guys, but them's the facts.
You're probably playing with the wrong settings. Use a program (or web site) that can generate a test pattern and adjust the clock and phase settings.
– David Schwartz – 2015-04-10T09:52:02.930
Keep in mind that both HDMI and DVI utilize the same signal/protocol, TMDS, so the first option is bit-perfect (lossless).
– None – 2015-04-10T16:06:20.777
Obvious solution: get an HDMI monitor – Steven Penny – 2015-04-11T01:09:25.543
Most modern VGA outputs are lowish-quality. That’s because nobody cares anymore. Oh, and you did try the Auto button when using VGA, right? – Daniel B – 2015-04-12T00:07:30.050
To me, the VGA sample is, if anything, slightly sharper than the HDMI sample, so your whole point fails. There is of course a substantial difference in brightness between the samples. – kreemoweet – 2015-04-12T01:26:43.270
@StevenPenny - How would that help? He already has a sharper image using an HDMI-DVI adapter. As the monitor already has a DVI input, why would an HDMI input be necessary? Please expand the reasoning behind your comment. – Greenonline – 2015-04-12T06:53:36.770