Why is an image blurrier in Windows than in Linux (Ubuntu)?


While viewing an image on Facebook in Google Chrome, I was very surprised to notice that it looks better (sharper) on Linux than on Windows, as you can see below.

Is this a well-known issue? How and why is it happening?

(I'm hoping for a somewhat more elaborate/informational answer than "because they use different interpolation algorithms"... e.g. the obvious points to touch on would be "what", "why", "how", etc.)

Here is a smaller Windows example, and here is a smaller Linux example.

Linux: Image in Linux

Windows: Image in Windows

Original: Original image
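To make the premise concrete, here is a hypothetical sketch (not Chrome's actual code; the real per-platform resampling kernels aren't established by this question) of how the choice of interpolation algorithm alone changes perceived sharpness. It downscales a 1-D row containing a hard black/white edge by 4x in two ways, nearest-neighbour (pick one source sample per block) versus box averaging (average the covered samples), and counts the in-between grey samples that a viewer would see as blur:

```python
# Hypothetical illustration: downscale a 1-D row with a hard edge 4x using
# two different interpolation algorithms, then count "blurred" grey samples.

row = [0] * 14 + [255] * 18  # hard edge between samples 13 and 14

def nearest_downscale(samples, factor):
    # Keep the first sample of each block: the edge stays hard.
    return [samples[i * factor] for i in range(len(samples) // factor)]

def box_downscale(samples, factor):
    # Average each block: any block straddling the edge becomes grey.
    return [sum(samples[i * factor:(i + 1) * factor]) // factor
            for i in range(len(samples) // factor)]

def transition_pixels(samples):
    # Samples that are neither fully black nor fully white.
    return sum(1 for v in samples if 0 < v < 255)

print(transition_pixels(nearest_downscale(row, 4)))  # 0 -> edge preserved
print(transition_pixels(box_downscale(row, 4)))      # 1 -> edge smeared
```

If the two renderers differ only in which resampling filter they pick, this is the kind of difference that shows up when zooming into edges, as in the comparison images above.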

user541686

Posted 2016-01-31T07:36:16.033


I just opened both the Windows one and the Linux one in two tabs and flicking between them, they look completely identical on my machine. If you look at your screenshots in the same OS do they look different to you? – Jonno – 2016-01-31T07:40:41.960

@Jonno: Yes they do. Try zooming in a lot and comparing smaller portions, such as this (Windows) against this (Linux).

– user541686 – 2016-01-31T07:43:59.740

Yes, more obvious in those ones. Afraid I don't know why (Graphics driver differences perhaps?) but might be worth adding those images to your question as well – Jonno – 2016-01-31T07:45:29.730

Because, OSS... that's why. – krowe – 2016-01-31T08:17:10.680

There's a slight difference in settings. The Windows image is slightly sharper because the contrast is slightly higher. Could be the screen drivers or the software used on each OS for rendering. The last (original) image is slightly larger than the screenshots. If there's any interpolation going on (either in fitting the image to the window or capturing the screen for these images), that could affect them differently on each OS. The what, why, and how are differences in default settings in the software, which result in the rendering differences. – fixer1234 – 2016-01-31T08:19:31.737

@fixer1234: "There's a slight difference in settings." ... "Could be the screen drivers or the software used on each OS for rendering." ... "The what, why, and how are differences in default settings in the software, which result in the rendering differences."... well, obviously, given that the hardware didn't change across the images, I think we've already figured out that it must be caused by a difference in the software... I'm not sure I understand where you're going. – user541686 – 2016-01-31T11:25:19.013

It might also be worth mentioning this is not a contrast issue. It really is a sharpness issue, and it's easier to tell on some images than others, or with some image sizes than others. I'm just not sure what the precise cause is. (Yes as I've mentioned in the question I can also figure that it's probably the interpolation algorithm that's different, but I'd like a more specific answer than that... see the question.) – user541686 – 2016-01-31T11:34:08.410

I looked at the pixel level at a number of areas containing some recognizable detail (like structures and fireworks trails rather than waves). What I saw was that the Windows images were slightly sharper than the Linux images, not the other way around (if Linux looks sharper to you, the appearance may be changing going from your screen to the screenshot images you posted). Sharpness is the ability to distinguish edges. In this case, the edges are more apparent because the Windows images have slightly more contrast between the colors on each side of the edges. (cont'd) – fixer1234 – 2016-01-31T17:37:18.887

This appears to be a contrast setting rather than a sharpening setting because sharpening tends to cause a halo effect, where contrast is increased mostly in the immediate area of the edges. Different video drivers and image software have slightly different default settings for things like gamma, contrast, and other characteristics that affect appearance. If there is any size adjustment involved, the most noticeable effect of the interpolation used to accomplish that is degradation of edges and small details, which translates to loss of sharpness. (cont'd) – fixer1234 – 2016-01-31T17:38:22.127

My point was that at a reasonable level of explanation, the differences are due to rendering settings (the images are being displayed with different adjustments), and possibly interpolation. It isn't clear where you want to go beyond that with your "what, why, and how" question. Do you want to know how your brain interprets visual information? Do you want to get into details of how the rendering engines work? – fixer1234 – 2016-01-31T17:38:50.137
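The contrast-versus-sharpening distinction drawn in the comments above can be sketched in a few lines (values and kernel are illustrative, not the actual settings of either OS's renderer): a global contrast boost rescales every sample, flat regions included, while an unsharp mask adds overshoot ("halo") only near edges.

```python
# Hedged 1-D sketch: global contrast boost vs unsharp-mask sharpening.

row = [64] * 8 + [192] * 8  # soft grey step edge

def contrast(samples, gain):
    # Global contrast: push every sample away from mid-grey (128).
    return [round(128 + gain * (v - 128)) for v in samples]

def blur3(samples):
    # 3-tap box blur (edge samples clamped at the borders).
    n = len(samples)
    return [round((samples[max(i - 1, 0)] + samples[i] + samples[min(i + 1, n - 1)]) / 3)
            for i in range(n)]

def unsharp(samples, amount):
    # Unsharp mask: original + amount * (original - blurred).
    return [round(v + amount * (v - b)) for v, b in zip(samples, blur3(samples))]

boosted = contrast(row, 1.5)
sharpened = unsharp(row, 1.0)
# Contrast changes flat regions too; unsharp only deviates near the edge.
print(boosted[0], sharpened[0])        # 32 64
print(min(sharpened), max(sharpened))  # 21 235 -> overshoot past 64..192 = halo
```

The overshoot past the original 64..192 range is the halo fixer1234 describes; a pure contrast change never produces it, which is one way to tell the two effects apart at the pixel level.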

@fixer1234: Whoops, I'd mixed up the labels apparently (noticed it by the font below) -- fixed those, thanks for pointing this out. In terms of sharpness vs. contrast -- aren't halos only an issue when you're explicitly sharpening (e.g. deconvolving)? In an interpolation context, I don't really expect halos. Regarding the details, what I'm trying to understand is e.g.: is this actually a driver thing or an OS thing? What are the two interpolation algorithms involved (at the very least their names)? In what way are they different (maybe pros/cons)? etc. – user541686 – 2016-01-31T18:31:46.150

Let us continue this discussion in chat.

– fixer1234 – 2016-01-31T19:00:38.973

No answers