Two identical external monitors, one through HDMI and the other through VGA. Text on the VGA one looks blurry

I've tried a lot of things, but I can't seem to get it working correctly. Below is my setup.

The laptop is a Dell Latitude E6535, which has an NVIDIA NVS 5200M video chip.

I have two external AOC i2367Fh 23-inch IPS LED monitors. One is connected to the laptop through HDMI and the other through VGA.

The problem is that text and the overall image look perfect and sharp on whichever of the two I connect through HDMI, while the VGA one just does not seem right: the text is not crisp, and it's just blurry enough that you can't quite tell it's blurry, but you feel something is wrong with it. I asked my wife to look at both monitors without telling her there was an issue, and she said the same thing.

Now, I have made sure it is not the cable, because, as I said, if I connect the monitor that looks odd through HDMI instead of VGA (I switch which monitor uses which cable), it looks fine and the other one looks bad. I've also tried two different VGA cables.

When I go to the NVIDIA settings, only one of the monitors can use the NVS 5200M chip, while the other one takes the Intel HD Graphics 4000 adapter. I think that depends on which one I make my main display, but making the VGA monitor my main display (even though it then uses the NVS 5200M) does not fix the issue; it still looks bad.

The resolution that I'm using is the native one for the monitors, 1920 x 1080.

I already tried tuning ClearType, but that did not fix it either.

Any ideas are welcome. Thanks.

EDIT:

Thank you everyone for the responses/suggestions. It seems that in order to get it working I will need either a docking station or an adapter. I'm considering the following:

LAST UPDATE:

The USB-to-HDMI adapter did NOT work for me at all. I went with the docking station and a DP-to-HDMI adapter instead, and it worked flawlessly: crisp image and text on both external monitors.

silverCORE

Posted 2014-03-25T07:09:10.880

Reputation: 665

You're suffering from seeing the digital and VGA displays side by side. I've never seen that where the VGA monitor didn't look at least slightly inferior by comparison. Getting a better VGA cable isn't going to help. All of the posters here pointing out that HDMI is digital/VGA is analog are 100% correct. The digital display will be crisper. The USB docking station idea is a good one, especially if you can use USB 3.0. Or you could get a USB video adapter (can I say DisplayLink here?), but a docking station with DisplayLink technology might give you more utility. – Craig – 2014-03-25T12:42:44.917

@Craig I was using a pair of NEC 2090s (high-end 1600x1200 LCDs) side by side for almost 2 years before discovering that my computer/monitor had decided to use the analog link in the DVI-I cable for one of them instead of the digital link. NEC's top line of monitors has a very large price premium to cover using top-quality parts everywhere. Even knowing one 2090 was on analog, I couldn't see any quality difference between the two. For most consumer-grade monitors the difference is much more visible. I have immediately noticed cases where 1 of 2 Dell 1280x1024 monitors was analog. – Dan is Fiddling by Firelight – 2014-03-25T13:13:22.697

@DanNeely, you may have a good point, there. Where I've been seeing this most lately is in "consumer" grade 1080p LCD/LED monitors (e.g., not $800 IPS panels). Some of these displays are beautiful, actually, but when you feed two identical ones side by side with VGA and digital, the difference is hard to miss. Unplug the digital one, and the VGA display looks perfectly fine all by itself. :) – Craig – 2014-03-25T15:49:38.270

@DanNeely, also, are you really sure that monitor was using the analog lines in the DVI-I cable? Most of the monitors with DVI ports only have DVI-D ports on them and you can't even plug a DVI-I cable into them (you can plug male DVI-D into female DVI-I, but you physically cannot plug male DVI-I into female DVI-D). The analog lines are in those cables to let you use a DVI to VGA adapter and feed an analog port from a video card that puts an analog signal out the same port as the digital signal... – Craig – 2014-03-25T15:54:18.203

Another thing you may notice is that the VGA connection will look much better with ClearType turned off. – JamesRyan – 2014-03-25T17:44:16.773

Have you tried to calibrate Clear Type settings in Windows Control Panel? – Nazar554 – 2014-03-25T20:24:54.470

@Craig yes. It was a DVI-I port on the monitor. And I discovered the problem when I noticed that one monitor didn't offer the same on screen menu as the second. For some unknown reason when I needed longer cables I bought DVI-I instead of the -D variant. The 2090 has one port of each type. http://www.necdisplay.com/p/desktop-monitors/lcd2090uxi-bk-1

– Dan is Fiddling by Firelight – 2014-03-25T20:59:38.313

@JamesRyan I did try turning ClearType off, but it looked horrible. – silverCORE – 2014-03-25T21:09:35.350

@Nazar554 I did, and it improved a little, but it is still noticeable; the text is just blurry enough at the edges that it gives me a headache after looking at it for a couple of hours. – silverCORE – 2014-03-25T21:10:17.060

@DanNeely, I've accidentally done exactly the same thing before, ordering online. I've had a couple of 12 foot DVI-I cables sitting on a shelf here waiting for an opportunity to be useful for about a year and a half, now. Those are spendy monitors, though, not too surprising that they'd do a better job adjusting to different input signals, I guess. :-) – Craig – 2014-03-25T21:34:22.647

@silverCORE it might be worthwhile trying to use old-school anti-aliasing instead of ClearType on the VGA connection, and see how it looks. No promises. http://superuser.com/questions/367230/how-to-turn-off-cleartype-and-use-whole-pixel-anti-aliasing-in-windows-7

– Craig – 2014-03-25T21:41:04.890

@Craig thanks for the tip, Craig. I will try that when I get home in a bit... I also ended up ordering the following adapter from Amazon, thank God for Prime. http://www.amazon.com/gp/product/B00BPEV1XK/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1 If anti-aliasing works correctly I can just return the adapter; if not, I hope the adapter works. The last option would be the dock. I'll update the question once I get a working solution.

– silverCORE – 2014-03-25T22:03:08.487

At the risk of starting a flame war... it is possible to get a good image through VGA. Bear in mind that the whole chain is important: your graphics card's DAC, connector, cable, connector, and the monitor's electronics. It gets harder at high resolutions. A blurry image indicates a poor driver (PC side) or high capacitance (cable), so a shorter cable should help, up to a point. HDMI is not perfect either; the difference is how errors manifest: with VGA, a blurry or shaky image; with HDMI, flickering broken pixels, loss of sound, or stop-and-go video. – Dima Tisnek – 2014-03-26T10:32:35.163

http://www.amazon.com/review/R3E22LM949S2RH/ref=cm_cr_pr_perm?ie=UTF8&ASIN=B009V8F700&linkCode=&nodeID=&tag= A user complains specifically about the VGA input on this monitor. I don't know if this reviewer is trustworthy, but it is reasonable to expect that the manufacturer sacrificed something to reach this price point. They could have decided that VGA is only useful for legacy applications, like analogue TV/DVD/VHS... – Dima Tisnek – 2014-03-26T10:39:06.993

Thank you everyone for the comments, ideas, suggestions, etc. – silverCORE – 2014-03-29T18:39:44.980

ClearType uses subpixel antialiasing; it requires a digital connection to work because it is not possible to get 1-to-1 pixel mapping with an analogue signal. Text will look jagged if you have turned off all antialiasing. – JamesRyan – 2014-04-08T11:17:27.947
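
Editor's note: for anyone who wants to flip between ClearType and standard whole-pixel antialiasing without clicking through the tuner each time, here is a minimal sketch, assuming Python on the Windows box. It uses the documented FontSmoothing/FontSmoothingType values under HKCU\Control Panel\Desktop; signing out and back in (or restarting applications) is needed for the change to take effect everywhere.

    import winreg

    # FontSmoothingType: 2 = ClearType (subpixel), 1 = standard (whole-pixel).
    DESKTOP_KEY = r"Control Panel\Desktop"

    def set_font_smoothing(use_cleartype):
        """Enable font smoothing and pick the antialiasing style."""
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, DESKTOP_KEY, 0,
                            winreg.KEY_SET_VALUE) as key:
            # "2" enables font smoothing; "0" would disable it entirely.
            winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "2")
            winreg.SetValueEx(key, "FontSmoothingType", 0, winreg.REG_DWORD,
                              2 if use_cleartype else 1)

    set_font_smoothing(False)  # try whole-pixel AA while on the VGA screen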

Answers

Basically, HDMI is digital and VGA is analogue.

There are a few solutions:

1) Buy a docking station which provides access to 2x HDMI or 1x HDMI + 1x DVI.

2) Use an on-board DVI instead of the VGA.

3) Buy an HDMI -> VGA converter. The first VGA will no longer seem blurry in comparison with the HDMI.

JB_STWUK

Posted 2014-03-25T07:09:10.880

Reputation: 190

The last solution is absolutely brilliant :) – gronostaj – 2015-07-27T12:31:13.320

It just sounds silly that if someone doesn't have a laptop with HDMI, they will be stuck with bad-looking text. How good are the adapters? I only have VGA and HDMI, no DVI, on my work laptop. I was told maybe a USB-to-HDMI adapter. Are those good enough? – silverCORE – 2014-03-25T21:17:28.103

@silverCORE The USB to DVI and USB to HDMI adapters are totally digital. The quality is as good as a straight DVI/HDMI connection. DVI and HDMI are the same thing, electrically. Just get whichever one works for the hardware you have, and you can buy simple passive adapters (cheap) if you need to. If you have USB 3.0 you could play fairly high frame-rate games through one of these adapters. USB 2.0 is obviously slower, but for anything that isn't really video intensive USB 2.0 adapters work just fine. – Craig – 2014-03-25T21:39:30.957

VGA is analog. HDMI is digital. Meaning: the digital output of your computer is converted to an analog VGA signal, and that analog signal is converted back to a digital signal by your monitor. These conversions depend on the quality of the cable, the connectors, and especially the analog/digital converter components within your graphics card and the monitor. They can be very good, but never perfect; some data is always lost or changed. At low resolutions, this difference is not noticeable. At higher resolutions, it is. And with the analog use case being uncommon nowadays, you can expect vendors to use cheaper/worse A/D converters in current hardware, not better ones. See also: http://www.brighthub.com/computing/hardware/articles/23769.aspx
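
Editor's note: to put a number on "at higher resolutions, it is", here is a back-of-the-envelope sketch, assuming the standard CEA-861 timing for 1080p60 (the full raster, including blanking, is 2200 x 1125 pixels). It shows how little time the monitor's ADC has to sample each pixel:

    # Why 1920x1080 @ 60 Hz is demanding over an analog link:
    # the full 1080p60 raster, including blanking, is 2200 x 1125 pixels.
    h_total, v_total, refresh = 2200, 1125, 60

    pixel_clock_hz = h_total * v_total * refresh   # 148,500,000 (148.5 MHz)
    pixel_time_ns = 1e9 / pixel_clock_hz           # ~6.7 ns per pixel

    print(f"Pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")
    print(f"Each pixel occupies ~{pixel_time_ns:.2f} ns of the analog waveform")

If the monitor's sampling phase is off by even a nanosecond or two, each sample mixes in part of the neighbouring pixel, which is exactly the faint blur described in the question.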

Matthias

Posted 2014-03-25T07:09:10.880

Reputation: 341

I see. It's really weird, though; I don't think I had ever run into a monitor that I couldn't get to display nicely. Granted, I haven't worked with hundreds of them, but at least a dozen or so, with different workstations. It's also a little strange that the text on my external monitor at work looks decent enough with the same laptop, although that monitor uses a lower resolution, 1440 x 900. – silverCORE – 2014-03-25T21:13:24.090

I have also encountered issues with two identical monitors side by side, one using VGA, one using DVI. The VGA-connected one was ever so slightly fuzzy by comparison, but I only noticed because they were side by side. Cable quality definitely contributes. – Michael12345 – 2014-03-26T03:34:57.520

Each and every VGA flat-panel display has an "Auto" button. It automatically adjusts the interpretation of the analog signal to achieve a (more or less) pixel-perfect mapping.

Activate this function while the displayed image has clearly defined outer edges (no black borders) and text is visible.

Still, 1080p is in the upper regions of what's possible (at all) with VGA and many devices nowadays have lowish-quality VGA output.

Daniel B

Posted 2014-03-25T07:09:10.880

Reputation: 40 502

Thanks for the "Auto" tip, I activated this and the monitor went from completely intolerable to at least bearable! – Kik – 2015-11-30T15:17:08.000

Virtually every time I've seen VGA and digital (DVI/HDMI/DP) displays side-by-side on the same system, the VGA side looks inferior, sometimes almost imperceptibly, but inferior nonetheless. – Craig – 2014-03-25T12:35:53.497

@Craig Yeah, that depends a lot on the graphics card and the display. A Full-HD CRT display can still look much better over VGA than a typical LCD over HDMI (different sub-pixel layout, proper color gamut, no need for gamma (re)correction...). Using VGA with LCDs is silly - it's a last-chance fallback, not a worthy interface :D – Luaan – 2014-03-25T13:13:05.110

High quality CRTs did do a nice job of smoothing those pixels. R.I.P., LCD... :-) The things I don't miss about big CRTs are the space they take, the weight, and the 140-watt power draw. :-) – Craig – 2014-03-25T16:00:33.653

Hi Daniel. It is pretty weird. My monitors do have an Auto button, but when I click it it does something else, not an auto adjustment. The monitors also came with a program that can do some monitor configurations on-screen vs physically pushing the buttons, but the Auto seems to be unavailable for some reason. – silverCORE – 2014-03-25T21:07:50.340

Could you elaborate on what "something else" is? :) – Daniel B – 2014-03-25T21:49:26.260

As others have said, VGA is an analog signal but the pixels in a flat panel display are digital. The monitor has to know where in the VGA analog waveform to sample the signal to convert it to digital. The auto adjust feature on flat panel displays attempts to guess the best timing for that sampling. But if the timing isn't perfect, you will get a blurry image.

To give your monitor the best chance for the auto adjust to pick the correct timing values, you need to display an image with lots of high-contrast transitions. My go-to for this type of image is a single-pixel black/white checkerboard. And this is the place I always go to get that checkerboard: http://techmind.org/lcd/phasing.html

tl;dr: Go to http://techmind.org/lcd/phasing.html, maximize your browser window, and press your monitor's auto adjust button.
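
Editor's note: if that site is ever unreachable, the pattern is trivial to generate locally. A minimal sketch, assuming Python with Pillow installed (pip install Pillow); the resolution and output filename are placeholders for this particular setup:

    # Generate a single-pixel black/white checkerboard at the panel's
    # native resolution, then display it full-screen and press Auto.
    from PIL import Image

    WIDTH, HEIGHT = 1920, 1080  # the monitors' native resolution

    img = Image.new("L", (WIDTH, HEIGHT))
    img.putdata([255 * ((x + y) & 1)
                 for y in range(HEIGHT)
                 for x in range(WIDTH)])
    img.save("checkerboard.png")

Open the image in a full-screen viewer at 100% zoom (any scaling destroys the single-pixel pattern), then run the monitor's auto-adjust.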

longneck

Posted 2014-03-25T07:09:10.880

Reputation: 372

I went to the phasing web page and it flickers while I move the browser window. My screen is an internal laptop display connected via LVDS – Suici Doga – 2017-09-19T14:23:25.807

So you have a completely different problem from this question? Got it. Ask your own question. – longneck – 2017-09-19T14:25:42.333

+1 for the link. I had only auto-adjusted my LCD connected by VGA using whatever windows I had open at the time. When I went to that link, it was dancing and shimmering all over the place. Activating the auto-adjust feature resulted in a nearly perfect checkerboard, and text especially in my terminals looks better (although obviously still not quite as good as the adjacent, same model of LCD connected by HDMI). Also - the LVDS "problem" does not seem like a problem; I see the same, probably just due to interpolation/animation while moving & not worth worrying about unless it happens when still – underscore_d – 2018-10-28T20:33:12.300

+1 Works like a charm! – mrbm – 2019-03-14T16:49:27.737

I suspect the lower quality you see is due to the analogue nature of the VGA signal. The higher the bandwidth you use (higher resolutions), the worse it becomes.

I see one reliable but not exactly cheap solution: a docking station. I checked briefly; the one for the 6540 comes with dual DP and dual dual-link DVI, so make sure you have sufficient ports before you buy. Then, with both screens connected digitally, you should see crisp text on both. You might still see a difference in colour, though.

TheUser1024

Posted 2014-03-25T07:09:10.880

Reputation: 2 823

This is what I use and it works like a charm. You can get DP (DisplayPort) to HDMI adapters too if you want to go all HDMI. – Bradley Forney – 2014-03-25T12:41:13.823

@TheUser1024 Unfortunately it seems like that is what I will have to go for. To be honest, I was already considering it, to avoid having to plug in 2 monitors and a keyboard when I get home from work; I guess I just hate the idea of the dock not being useful to me whenever I get a new computer from work. It's also $130 =S – silverCORE – 2014-03-25T21:14:58.960

@silverCORE: Have you asked your employer to buy it? Or is it not for doing work at home? It might also be worth asking (if they won't buy it) whether you can buy it through your employer. They might be getting a worthwhile discount. – TheUser1024 – 2014-03-25T21:23:36.177

I think the following may be of help to you.

I've actually had the same problem myself (I obviously can't tell if the root cause is the same, but the symptoms were very similar). Here are my specs, my findings, and my trick to work around it:

Specs: I'm using 2x identical Samsung monitors, a custom-built desktop unit, and a Lenovo laptop (no docking station). The Lenovo laptop has a Mini DisplayPort and a VGA port. Both computers go through a dual-screen video switch, which allows me to switch between the two computers while still using both monitors with each.

Symptoms: When I switch to the laptop, the monitor plugged in through VGA (VGA -> switch -> VGA) gets a slight blur, which is pretty aggravating when you're typing, reading, etc. The problem never happens to the other monitor AND, more importantly, also doesn't happen to that same monitor when I switch to my desktop, which sends its output through DVI (DVI -> switch -> VGA), thus isolating the problem to the laptop side (the monitor, switch, and pretty much the cable are fine).

Findings: Initially, I realized that when switching to the laptop, Windows discovered the monitors differently (one had a factory name, and one was just recognized as a generic monitor). I saw that along with those discovery settings, the refresh rate was different (60 Hz on the correct output, 59 Hz on the other one). So I set the second one to 60 Hz, and that seemed to fix the problem temporarily. A few days later, the problem showed up again, and this time the frequency was still correct, sitting at 60 Hz. So I ended up getting a correct picture again by doing a new discovery run, clicking "Detect" (both screens go black for a second and then the display comes back).
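
Editor's note: if you'd rather script the 59-vs-60 Hz check than dig through the display dialog each time, here is a rough sketch, assuming Python with pywin32 installed (pip install pywin32). The display index 0 is a placeholder; enumerate until you find the adapter the VGA monitor hangs off.

    # Read the current mode of a display and force 60 Hz if Windows
    # has silently fallen back to 59 Hz.
    import win32api
    import win32con

    # Index 0 is a placeholder; try other indices for other outputs.
    device = win32api.EnumDisplayDevices(None, 0).DeviceName  # e.g. \\.\DISPLAY1
    mode = win32api.EnumDisplaySettings(device, win32con.ENUM_CURRENT_SETTINGS)
    print(device, mode.PelsWidth, "x", mode.PelsHeight,
          "@", mode.DisplayFrequency, "Hz")

    if mode.DisplayFrequency != 60:
        mode.DisplayFrequency = 60
        mode.Fields = win32con.DM_DISPLAYFREQUENCY
        win32api.ChangeDisplaySettingsEx(device, mode, 0)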

Current trick: This is a very weird bug, possibly in the display output/drivers/hardware of the laptop, which I can't control anyway because it's a company laptop. All I know is, a straight-up discovery run doesn't always discover thoroughly, so to do the trick, I unplug the VGA output, hit "Detect" again, and then plug the VGA cable back in. That seems to force the "deep" discovery and fix my problem temporarily.

Obviously, if anyone can pinpoint the root cause and fix the problem long term, please write something here! All I know is, all of the answers above are pure speculation and hence, largely irrelevant.

nairod

Posted 2014-03-25T07:09:10.880

Reputation: 11

"all of the answers above are pure speculation and hence, largely irrelevant." Seriously?? You think the other answers, which use simple logic and are trivially demonstrable by (a) understanding the basic theory and (b) checking 2 of the same monitor side-by-side with analogue vs. digital connections... are "pure speculation" and "largely irrelevant", by which you presumably mean less likely to be accurate for the OP & other future readers, when compared to... your anecdote based on a weird problem with your OS/hardware combination, which probably won't happen for many other users? Hilarious – underscore_d – 2018-10-28T20:39:56.967