I've tried a lot of things but I can't seem to get it working correctly. Below is my setup.
The laptop is a Dell Latitude E6535, which has an NVIDIA NVS 5200M video chip.
I have two external AOC i2367Fh 23-inch IPS LED monitors. One is connected to the laptop through HDMI and the other through VGA.
The problem is that text and the overall image look perfect and sharp on whichever of the two I connect through HDMI, while the VGA one just does not seem right: the text is not crisp, and it's just blurry enough that you cannot tell it's blurry, but you feel there's something wrong with it. I asked my wife to look at both monitors without telling her there was an issue, and she said the same thing.
Now, I have made sure it is not the cable, because as I said, if I connect the monitor that looks odd through HDMI instead of VGA (I swap which monitor uses which cable), then it looks fine and the other one looks bad. I've also tried two different VGA cables.
When I go to the NVIDIA settings, only one of the monitors can use the NVS 5200M chip, while the other one takes the Intel HD Graphics 4000 adapter. I think that depends on which one I make my main display, but making the VGA monitor my main display (even though it would then use the NVS 5200M) does not fix the issue. It still looks bad.
The resolution that I'm using is the native one for the monitors, 1920 x 1080.
I already tried tuning ClearType, but that did not fix it either.
Any ideas are welcome. Thanks.
EDIT:
Thank you everyone for the responses/suggestions. It seems that in order to get it working I will need either a docking station or an adapter. I'm considering the following:
LAST UPDATE:
The USB-to-HDMI adapter did NOT work for me at all. I just went with the docking station and a DP-to-HDMI adapter, and it worked flawlessly. Crisp image and text on both external monitors.
You're suffering from seeing the digital and VGA displays side by side. I've never seen a case where the VGA monitor didn't look at least slightly inferior by comparison. Getting a better VGA cable isn't going to help. All of the posters here pointing out that HDMI is digital and VGA is analog are 100% correct: the digital display will be crisper. The USB docking station idea is a good one, especially if you can use USB 3.0. Or you could get a USB video adapter (can I say DisplayLink here?), but a docking station with DisplayLink technology might give you more utility. – Craig – 2014-03-25T12:42:44.917
@Craig I was using a pair of NEC 2090's (high-end 1600x1200 LCDs) side by side for almost two years before discovering that my computer/monitor had decided to use the analog link in the DVI-I cable for one of them instead of the digital link. NEC's top line of monitors carries a large price premium to cover using top-quality parts everywhere. Even knowing one 2090 was on analog, I couldn't see any quality difference between the two. For most consumer-grade monitors the difference is much more visible. I have immediately noticed cases where one of two Dell 1280x1024 monitors was analog. – Dan is Fiddling by Firelight – 2014-03-25T13:13:22.697
@DanNeely, you may have a good point there. Where I've been seeing this most lately is in "consumer" grade 1080p LCD/LED monitors (e.g., not $800 IPS panels). Some of these displays are actually beautiful, but when you feed two identical ones side by side with VGA and digital, the difference is hard to miss. Unplug the digital one, and the VGA display looks perfectly fine all by itself. :) – Craig – 2014-03-25T15:49:38.270
@DanNeely, also, are you really sure that monitor was using the analog lines in the DVI-I cable? Most of the monitors with DVI ports only have DVI-D ports on them and you can't even plug a DVI-I cable into them (you can plug male DVI-D into female DVI-I, but you physically cannot plug male DVI-I into female DVI-D). The analog lines are in those cables to let you use a DVI to VGA adapter and feed an analog port from a video card that puts an analog signal out the same port as the digital signal... – Craig – 2014-03-25T15:54:18.203
Another thing you may notice is that the VGA connection will look much better with cleartype turned off. – JamesRyan – 2014-03-25T17:44:16.773
Have you tried calibrating the ClearType settings in the Windows Control Panel? – Nazar554 – 2014-03-25T20:24:54.470
@Craig yes. It was a DVI-I port on the monitor. And I discovered the problem when I noticed that one monitor didn't offer the same on-screen menu as the second. For some unknown reason, when I needed longer cables I bought DVI-I instead of the -D variant. The 2090 has one port of each type. http://www.necdisplay.com/p/desktop-monitors/lcd2090uxi-bk-1 – Dan is Fiddling by Firelight – 2014-03-25T20:59:38.313
@JamesRyan I did try turning ClearType off, but it looked horrible. – silverCORE – 2014-03-25T21:09:35.350
@Nazar554 I did, and it improved a little, but it is still noticeable. The text is just blurry enough at the edges that it gives me a headache after looking at it for a couple of hours. – silverCORE – 2014-03-25T21:10:17.060
@DanNeely, I've accidentally done exactly the same thing before, ordering online. I've had a couple of 12 foot DVI-I cables sitting on a shelf here waiting for an opportunity to be useful for about a year and a half, now. Those are spendy monitors, though, not too surprising that they'd do a better job adjusting to different input signals, I guess. :-) – Craig – 2014-03-25T21:34:22.647
@silverCORE it might be worthwhile trying old-school anti-aliasing instead of ClearType on the VGA connection, and seeing how it looks. No promises. http://superuser.com/questions/367230/how-to-turn-off-cleartype-and-use-whole-pixel-anti-aliasing-in-windows-7 – Craig – 2014-03-25T21:41:04.890
@Craig thanks for the tip. I will try that when I get home in a bit. I also ended up ordering the following adapter from Amazon, thank God for Prime: http://www.amazon.com/gp/product/B00BPEV1XK/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1 If anti-aliasing works correctly I can just return the adapter; if not, I hope the adapter works. The last option would be the dock. I'll update the question once I get a working solution. – silverCORE – 2014-03-25T22:03:08.487
At the risk of starting a flame war... it is possible to get a good image through VGA. Bear in mind that the whole chain matters: your graphics card's DAC, connector, cable, connector, and the monitor's electronics. It gets harder at high resolutions. A blurry image indicates a poor driver (PC side) or high capacitance (cable), so a shorter cable should help, up to a point. HDMI is not perfect either; the difference is in how errors manifest. VGA gives a blurry or shaky image; HDMI gives flickering broken pixels, loss of sound, or a stop-and-go image. – Dima Tisnek – 2014-03-26T10:32:35.163
http://www.amazon.com/review/R3E22LM949S2RH/ref=cm_cr_pr_perm?ie=UTF8&ASIN=B009V8F700&linkCode=&nodeID=&tag= A user complains specifically about the VGA input on this monitor. I don't know if this reviewer is trustworthy, but it is reasonable to expect that the manufacturer sacrificed something to reach this price point. They could have decided that VGA is only useful for legacy applications, like analog TV/DVD/VHS... – Dima Tisnek – 2014-03-26T10:39:06.993
Thank you everyone for the comments, ideas, suggestions, etc. – silverCORE – 2014-03-29T18:39:44.980
ClearType uses subpixel anti-aliasing; it requires a digital connection to work because it is not possible to get 1-to-1 pixel mapping with an analog signal. Text will look jagged if you have turned off all anti-aliasing. – JamesRyan – 2014-04-08T11:17:27.947
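To illustrate JamesRyan's point about 1-to-1 pixel mapping, here is a toy model (an assumption-laden sketch, not actual ClearType internals): ClearType places glyph edges at individual R/G/B subpixels, and an analog link that samples the signal with even a small phase error blends neighboring subpixels, smearing the edge.

```python
# Toy 1-D model of why subpixel rendering needs exact pixel mapping.
# (Hypothetical simplification; real ClearType and VGA sampling are
# far more complex.)

# A sharp black-to-white edge encoded as per-subpixel intensities:
edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]

def resample(signal, phase=0.5):
    """Model an analog link's sampling misalignment with linear
    interpolation: each output sample blends two adjacent inputs."""
    out = []
    for i in range(len(signal) - 1):
        out.append(signal[i] * (1 - phase) + signal[i + 1] * phase)
    out.append(signal[-1])  # last sample has no right neighbor
    return out

# Over a digital link, every subpixel value reaches the panel exactly.
# Over the modeled analog link, the edge picks up an intermediate value:
blurred = resample(edge)
print(blurred)  # [0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
```

The once-crisp transition now spans an extra subpixel; on a real panel that half-lit subpixel belongs to a single color channel, which is why a misaligned VGA signal produces the faint fringing and "can't quite tell it's blurry" softness described in the question.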