4K Monitor with Higher PPI than the 27" iMac, but More Pixelated than the iMac


I'm looking to understand exactly why a monitor with a higher number of pixels per inch (albeit only slightly higher) than the standard 27" iMac cannot produce an image that is at least as sharp as the image on the iMac.

The monitor in question is the Philips Brilliance 4K Display BDM4065UC/00. It's a 39.56" UHD screen with a native resolution of 3840 x 2160, which works out to a PPI of around 111.

Compare that to the standard 27" Apple Display/iMac, which, at a native resolution of 2560 x 1440, has a PPI of 109.
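For reference, those PPI figures follow directly from the pixel counts and diagonal sizes quoted above (a quick sketch):

```python
import math

# Quick check of the PPI numbers above: diagonal pixel count divided by the
# diagonal screen size in inches.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 39.56), 1))  # Philips BDM4065UC: ~111.4
print(round(ppi(2560, 1440, 27.0), 1))   # 27" iMac / Apple Display: ~108.8
```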

I understand these two displays are built by different manufacturers and use different technologies. However, as someone who doesn't know much about the internals of displays, I'm struggling to understand why the quality and crispness of the image on my 40" Philips monitor cannot match that of the iMac's, despite the higher PPI.

And by quality of image, I'm not limiting this to text rendering alone. OS X does allow changing the level of font smoothing by overwriting the AppleFontSmoothing defaults, but no amount of fiddling with the values resolves the issue, and besides, the issue isn't just about text anyway (an image containing text shows it too, for example). Everything simply appears more pixelated.

I'm using a Mini DisplayPort to Mini DisplayPort cable, running at full resolution and a 60Hz refresh rate.

Can anyone shed some light please? Can anything be done about it?

More Info

I've found that when connecting over HDMI instead of Mini DisplayPort, the rendering improves. Unfortunately HDMI only runs at 30Hz, so that's not a solution. The difference may be hard to spot when looking at individual pixels, but it makes a huge difference in readability during normal use. Open the pictures below at full size in two different tabs and flick between them; you should be able to see the difference. Can you see the hues around the edges of the text? I think that's what's messing with my eyes.

I guess that's the subpixel antialiasing not working so well over Mini DisplayPort. I set AppleFontSmoothing to 0 to turn off subpixel antialiasing, but it's not much of an improvement.
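For reference, this is roughly how I've been flipping that setting (a sketch that shells out to the standard `defaults` tool; applications have to be relaunched before a change is visible):

```python
import subprocess

# Sketch: toggling the global AppleFontSmoothing default via the defaults CLI.
# 0 turns font smoothing off; 1-3 select increasingly heavy smoothing.

def set_font_smoothing(level: int) -> None:
    subprocess.run(
        ["defaults", "write", "-g", "AppleFontSmoothing", "-int", str(level)],
        check=True,
    )

def reset_font_smoothing() -> None:
    # Remove the override so OS X falls back to its default behaviour.
    subprocess.run(
        ["defaults", "delete", "-g", "AppleFontSmoothing"],
        check=True,
    )

if __name__ == "__main__":
    set_font_smoothing(0)  # what I tried: smoothing off entirely
```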

I've tried 3 different DP cables, all with the same result. Any more ideas and suggestions?

HDMI - AppleFontSmoothing: 1

MiniDP - AppleFontSmoothing: 1

Enlarged Screenshot - AppleFontSmoothing: 1

MiniDP - AppleFontSmoothing: 0

Merott

Posted 2015-10-23T17:58:12.923

Reputation: 141

Just checking: you've connected the monitor to a Mac running OS X? May be handy to tag your question with the OS in question, as resolution and DPI configuration issues (and solutions) will vary by OS. – CBHacking – 2015-10-23T18:26:21.517

Yes, my primary machine is a 2015 Retina MacBook Pro, and that's the only one I'm concerned about. I've also tried connecting to a Windows PC, and the result still isn't as impressive as I'd expect from an Apple Display, which is practically the same panel as the iMac's. – Merott – 2015-10-23T18:45:30.340

Not sure what image you are referring to with regards to quality, but Macs have various "scaling" options, some of which use the native resolution but basically pretend to be a lower resolution and upscale it to native. What this means is that any raster imagery will be stretched (or resampled) to up to 2x its original pixel size. If you set the screen to 1920x1080 and then set it to native, and everything stays the same size relative to the physical size of the monitor frame, then you are upscaling. Upscaling = quality loss. – Yorik – 2015-10-23T20:38:58.340
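As a small illustration of the upscaling described above (a sketch; the resolutions are the ones mentioned in this thread and the helper name is just for illustration): a "looks like" resolution gets stretched to the panel's native grid, and any non-integer factor forces resampling.

```python
# Sketch: how much a "looks like" (scaled) resolution is stretched to reach
# the panel's native pixel grid; non-integer factors must be resampled.

NATIVE = (3840, 2160)  # Philips BDM4065UC native resolution

def upscale_factor(looks_like, native=NATIVE):
    return native[0] / looks_like[0], native[1] / looks_like[1]

for looks_like in [(3840, 2160), (2560, 1440), (1920, 1080)]:
    fx, fy = upscale_factor(looks_like)
    integer = fx.is_integer() and fy.is_integer()
    print(f"{looks_like} -> {NATIVE}: {fx:g}x / {fy:g}x "
          f"({'integer factor' if integer else 'non-integer factor, resampled'})")
```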

For all practical purposes, 109 vs. 111 is the same resolution. Any differences in appearance relate to how the material is rendered. – fixer1234 – 2015-10-23T20:57:29.277

Just added more info. – Merott – 2015-11-17T21:23:37.837

There are occasional reports of issues with Macs and third-party 4K displays. I wonder if their scaling is somehow optimised for their own displays. – Journeyman Geek – 2015-11-18T00:38:08.287

FYI: if you want 4K at 60Hz, I think you need to be using DisplayPort. – Bennett Yeo – 2015-11-18T01:49:11.190

Answers


Well, I don't have enough rep to comment, therefore I'll put this in an answer.

Looking at the monitor spec page, make sure you don't have any of the funky image modes on. I'm sure you knew that already, but it's good to check.

Also, like others have said, make sure OS X isn't doing any scaling. If you can't turn it off completely, there are tools out there to enable full resolution on Retina displays. Compare the two at full resolution over Mini DisplayPort; MiniDP is going to be the only interface that can carry 4K at 60Hz from the rMBP.

Perhaps the monitor's panel just doesn't like running at 60Hz? That could be causing artifacts. Make sure there are no overdrive presets enabled, or if they're already off, maybe try enabling them. Basically, poke at the monitor's settings.

Lastly, it might be a difference in panel type. Your new monitor is a VA panel, which isn't very color accurate, while your rMBP has an IPS display, which generally offers better color accuracy.

Pixel Perfect

Posted 2015-10-23T17:58:12.923

Reputation: 123

The funky image modes make the colours terrible, though sharpness is unaffected. There is no scaling at all; I'm running at the full 3840x2160 resolution. I just tried MiniDP 1.1, which only runs at 30Hz, and the image improved slightly, although HDMI still looks better. To be honest, I don't think it's down to the colour inaccuracy of the display, because again the HDMI image is better and the difference is apparent when comparing pixel for pixel. Thank you for the thoughts. – Merott – 2015-11-18T00:12:08.303