Why is the display resolution not exactly 16:9 or 4:3?


Most displays are advertised with either a 16:9 or a 4:3 aspect ratio. However, if you compare the resolution with the advertised ratio, it is often neither of the two.

For example, the resolution of my notebook display is 1366x768.
But 1366/768 = 683/384 ≠ 688/387 = 16/9.
Another common resolution is 1920x1200, i.e. 1920/1200 = 8/5.

But for some resolutions it's correct:

  • 1024/768 = 4/3
  • 800/600 = 4/3
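
Whether a resolution really matches an advertised ratio can be checked by reducing width:height with the greatest common divisor. A quick Python sketch (the helper name `reduced_ratio` is mine, just for illustration):

```python
from math import gcd

def reduced_ratio(width, height):
    """Reduce width:height to lowest terms."""
    g = gcd(width, height)
    return width // g, height // g

print(reduced_ratio(1366, 768))   # (683, 384) -- not 16:9
print(reduced_ratio(1920, 1200))  # (8, 5), i.e. 16:10
print(reduced_ratio(1024, 768))   # (4, 3)
print(reduced_ratio(800, 600))    # (4, 3)
```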

Is there a technical or user-experience reason for this? Why do displays have ratios other than the ones they are advertised with?

(I assume that every pixel is a perfect square. Is this assumption wrong?)

Martin Thoma

Posted 2013-01-06T09:52:33.993

Reputation: 2 705

I've already given some techniques in my answer – phuclv – 2018-07-09T08:09:19.343

@Thomas: never heard that. Can you substantiate it? Can you give an example? Or maybe you mean a multiple of a power of two, which can somewhat speed up address computation (shifts instead of division), but not rendering algorithms themselves. – Harry Cover – 2018-07-09T09:05:25.157

Are you sure that the pixels are square? You might actually have the exact ratio but not realise it... – Toby Speight – 2019-04-09T16:52:52.540

<irony>In love and in advertising lying is not only allowed, but expected...</irony>; – lexu – 2013-01-06T10:41:43.120

I would say 1366 x 768 is close enough to 16:9. To be 16:9 exactly, it would have to be 1365 1/3 x 768 or 1366 x 768 3/8. – Bavi_H – 2013-01-06T20:39:10.487

or simply 1360 x 765 which would be exactly 16:9 – Martin Thoma – 2013-01-07T09:45:18.900

@moose Which is a horrible choice because the height is not an even number. This breaks a lot of applications, in particular graphics rendering, where many techniques, both software and hardware, implicitly require the height in pixels to be a multiple of two. – Thomas – 2013-01-10T04:04:05.990

@Thomas I didn't know that. This is the kind of answer I expected to get with this question. Can you give me an example of such a technique? – Martin Thoma – 2013-01-10T08:50:17.587

Answers

21

Not every display resolution has to be 16:9 or 4:3.

My laptop and my TV have the well-known 16:9 ratio.
My regular display is 16:10, at least it is marketed as 16:10; the image below lists it as 8:5. The broken screen that still sits on top of the locker behind me has a 5:4 resolution.

The image below shows most of the standard resolutions that are available.


I actually like 16:10 more than 16:9 and would pay a fair amount more to get one of these instead. That is personal opinion, but it should show you why there are not just two but many more standards to choose from.
Why do I like it so much? Not all movies are 16:9; there are a lot of 4:3 shows out there.
When playing games, I like having a bit more vertical space for menus, HUDs, etc.
This of course comes down to personal preference. Preferences differ between individuals, and so do displays.

Why are displays marketed as 16:9 if they are not?
If this is done knowingly, I'd call it a scam.

Baarn

Posted 2013-01-06T09:52:33.993

Reputation: 6 096

This doesn't explain why the resolution is not exactly 16:9 or 4:3 – phuclv – 2014-08-09T19:32:50.533

Great & comprehensive answer. Re: your comment "marketed as 16:10, however the image below has them as 8:5", those are, obviously, the same ratio. I reckon calling it 16:10 is market-speak to make it sound similar to (and comparable to) the common 16:9 aspect ratio. – yosh m – 2013-01-06T17:11:27.073

-1: Using the picture of CRT phosphor dots to say "pixels aren't squares" is misleading. Pixels refers to the rectangular grid of picture elements in the computer video memory. This doesn't always line up with the color dots on the screen. A CRT monitor doesn't line up pixels to phosphor dots in any particular way, so on a CRT, a pixel is usually not the same as a trio of phosphor dots. An LCD screen is able to line up a pixel with a trio of color elements exactly (when you use the LCD's native resolution). Subpixel methods only work well on LCD screens because of this. ... – Bavi_H – 2013-01-07T00:07:25.947

... I drew some white pixels on a black background in Paint, then photographed them on a CRT at its best resolution, then an LCD at its best resolution: image. I don't know how to take macro pictures well, so the LCD color elements are blurred together, but you should be able to get the idea. ...

– Bavi_H – 2013-01-07T00:08:05.323

... I believe Moose was asking if pixels are always square or if they are sometimes non-square rectangles. My understanding is that in modern computer resolutions, pixels are square. In very old computer resolutions (like CGA 320 x 200), the image fills a 4:3 monitor, so the pixels are sometimes non-square rectangles. I think some TV formats use non-square rectangular pixels (for example, see the footnotes in the resolution diagram). – Bavi_H – 2013-01-07T00:08:49.253

@Bavi_H He asked if they are perfect squares; concluding from the images, I think they are not, just nearly square. Even in your photos they are not squares (LCD). However, I have no idea if the ordering of the colors plays another role here and makes them look like squares in the end. The video memory itself has no rectangular grid, as the logical ordering in no way reflects its physical order, or vice versa. – Baarn – 2013-01-07T08:00:56.443

Pixels are a conceptual layer different from the physical color dots or stripes on the screen. Think of pixels as an ideal rectangular grid of colors projected onto the physical dots or stripes on the screen. On an LCD screen, 1 pixel maps to 1 square of red, green, and blue stripes. Someone might look at your picture and think, On a CRT screen, 1 pixel maps to 1 triangle of red, green, and blue dots, but that's wrong. To answer Moose's question about square pixels, it isn't useful to look at the physical dots or stripes; Moose's question is about the ideal rectangular grid of colors. ... – Bavi_H – 2013-01-08T07:16:36.993

... When you want to find the aspect ratio of full-screen image, the calculation aspect ratio = (width in pixels) / (height in pixels) only works if the horizontal pixel-to-pixel distance is the same as the vertical pixel-to-pixel distance. For modern computer resolutions this is true. We usually just shorten this and say the pixels are square. ... – Bavi_H – 2013-01-08T07:16:54.813

... In some video formats, the horizontal pixel-to-pixel distance is different than the vertical pixel-to-pixel distance. For example, in 320 x 200 CGA, 320/200 = 1.6, but the image actually fills a 4:3 (1.333...) area. You can think of the pixels as tall rectangles in this case. – Bavi_H – 2013-01-08T07:17:13.787

@Bavi_H so what should I write instead? (Actually you can put an edit in if you like…) – Baarn – 2013-01-08T07:35:10.830

I rolled back the revision that suggested the picture of the phosphor dots and LCD segments illustrated non-square pixels, and removed my downvote. – Bavi_H – 2013-01-10T03:05:12.787

14

An exact ratio can only be obtained if the height is divisible by the denominator of the aspect ratio you want (so that the width comes out as an integer). 768 isn't divisible by 9, so there is no integer 16:9 resolution with that height. So why wasn't 1360x765 chosen instead?

Because the dimensions of display resolutions tend to be a power of 2 (or a multiple of as large a power of 2 as possible), presumably because powers of 2 work better for a binary computer:

  • 2D image formats as well as video codecs process images in blocks rather than pixel-by-pixel or line-by-line. The block sizes are almost always powers of 2, like 8x8, 16x16, or less frequently 4x8, 8x16 or 4x16, because they're easier to arrange in memory and more suitable for the CPU's SIMD unit. That's why you see blocky artifacts when viewing a low-quality image or video file.
  • 3D graphics renderers often use a technique called mipmapping that involves using images with sizes that are powers of 2 of each other, to increase rendering speed and reduce aliasing artifacts. If you're interested, check out How does Mipmapping improve performance?
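
The convenience of powers of 2 is easy to see with mipmaps: each level is exactly half the previous one, so a power-of-2 size halves cleanly all the way down to 1, while an odd size needs rounding at every level. A small Python sketch (the helper `mip_chain` is illustrative only):

```python
def mip_chain(size):
    """Successive mipmap level sizes, halving down to 1."""
    levels = [size]
    while size > 1:
        size //= 2  # integer halving, as mip levels do
        levels.append(size)
    return levels

print(mip_chain(256))  # [256, 128, 64, 32, 16, 8, 4, 2, 1] -- exact halving
print(mip_chain(765))  # [765, 382, 191, 95, ...] -- rounding needed at every level
```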

So regardless of the graphics type, using powers of 2 eases the job of the encoder/decoder and/or the GPU/CPU. Images with a side length that is not a multiple of the block size will have that side rounded up (as you'll see later with the example of 1920x1080), and you end up wasting some memory at the edges for storing those dummy pixels. Transforming such odd-sized images also introduces artifacts (which are sometimes unavoidable) due to the dummy values. For example, rotating odd-sized JPEGs introduces noise into the result:

Rotations where the image is not a multiple of 8 or 16, which value depends upon the chroma subsampling, are not lossless. Rotating such an image causes the blocks to be recomputed which results in loss of quality.[17]

https://en.wikipedia.org/wiki/JPEG#Lossless_editing
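
The rounding described above can be sketched in a few lines of Python (`pad_to_block` is an illustrative helper; 16 is the common macroblock size):

```python
def pad_to_block(size, block=16):
    """Round a dimension up to the next multiple of the codec block size."""
    return (size + block - 1) // block * block

print(pad_to_block(1080))  # 1088 -- why many "1080p" files are stored as 1920x1088
print(pad_to_block(768))   # 768  -- already a multiple of 16, no dummy pixels
print(pad_to_block(765))   # 768  -- 3 dummy lines of pixels would be needed
```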



Now obviously 1360:765 is precisely 16:9 as you said, but 765 is odd, so it isn't divisible by any power of 2 at all, while 768 is divisible by 256 (2⁸), making 768 the better choice for the height. Moreover, using 768 as the height has the advantage of being able to display the old 1024x768 natively, without scaling.

768 × 16/9 = 1365.333..., so rounding down gives the width closest to 16:9. However, 1365 is an odd value, so people round up to 1366x768, which is still quite close to 16:9. But 1366 is only divisible by 2, so some screen manufacturers use 1360x768 instead, since 1360 is divisible by 16, which is much better. 1360/768 = 1.7708333..., which approximates 16/9 to about two decimal places, and that's close enough. 1360x768 also has the bonus that its framebuffer fits nicely inside 1 MiB of RAM (whereas 1366x768's doesn't). 1344x768, another less commonly used resolution, is also divisible by 16.
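
The memory argument can be checked with a quick sketch (assuming 8 bits per pixel, so a 1 MiB chip holds 1024 × 1024 bytes):

```python
MIB = 1024 * 1024  # bytes in 1 MiB (an 8-Mbit memory chip)

for w, h in [(1366, 768), (1360, 768), (1344, 768)]:
    size = w * h  # framebuffer size in bytes at 8 bits per pixel
    fits = "fits" if size <= MIB else "does not fit"
    print(f"{w}x{h}: {size} bytes, {fits} in 1 MiB (width % 16 = {w % 16})")
```

1366 × 768 = 1,049,088 bytes, just over the 1,048,576 bytes of 1 MiB, while 1360 × 768 = 1,044,480 bytes squeezes in with room to spare.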

WXGA can also refer to a 1360×768 resolution (and some others that are less common), which was made to reduce costs in integrated circuits. 1366×768 8-bit pixels would take just above 1 MiB to be stored (1024.5 KiB), so that would not fit into an 8-Mbit memory chip and you would have to have a 16-Mbit memory chip just to store a few pixels. That is why something a bit lower than 1366 was chosen. Why 1360? Because you can divide it by 8 (or even 16), which is far simpler to handle when processing graphics (and can lead to optimized algorithms).

Why Does the 1366×768 Screen Resolution Exist?

Many 12MP cameras have an effective resolution of 4000x3000, and when shooting in 16:9, instead of using the resolution 4000x2250, which is exactly 16:9, they use 4000x2248, because 2248 is divisible by 8 (the common block size in many video codecs) while 2250 is only divisible by 2.

Some Kodak cameras use 4000x2256 too, since 2256 is divisible by 16, and 4000/2256 still approximates 16/9 to about two decimal places. If shooting in 3:2 they'll use 4000x2664, not 4000x2667 or 4000x2666, which are closer to 3:2, for the same reason.
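
The trade-off in those camera resolutions is easy to verify numerically (a quick sketch):

```python
# Candidate heights for a 4000-pixel-wide, roughly 16:9 frame
for h in (2250, 2248, 2256):
    ratio = 4000 / h
    print(f"4000x{h}: ratio {ratio:.4f} (16:9 = {16/9:.4f}), h % 8 = {h % 8}")
```

2250 hits 16:9 exactly but isn't divisible by 8; 2248 and 2256 are, while still matching 16:9 to about two decimal places.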

And this is true for other resolutions too. You won't find any image resolutions with odd dimensions; most are divisible by at least 4, or better, 8. The full-HD resolution, 1920x1080, has a height that isn't divisible by 16, so many codecs round it up to 1920x1088, with 8 dummy lines of pixels, and crop back down when displaying or after processing. But sometimes it isn't cropped, which is why you can find many 1920x1088 videos on the net. Some files are reported as 1080 but are actually 1088 inside.

You may also find the option to crop 1088 to 1080 in various video decoders' settings.


Back to your example: 1920/1200 = 8/5 is not strange at all, because it's the common 16:10 aspect ratio, which is close to the golden ratio. You can find it in 1280x800, 640x400, 2560x1600, 1440x900, 1680x1050... No one would advertise it as 16:9, because those are clearly 16:10.

I assume that every pixel is a perfect square. Is this assumption wrong?

That assumption is wrong. In the past, pixels were often not square but rectangular. Other pixel arrangements, like hexagonal grids, also exist, although they are not very common. See Why are pixels square?

phuclv

Posted 2013-01-06T09:52:33.993

Reputation: 14 930

0

Yeah, it's to do with manufacturing.

We already made loads of 1024x768 panels, so why not just make them wider, giving 1366x768?

I'm not sure about the other one, I haven't come across panels with that resolution.

Rich Bradshaw

Posted 2013-01-06T09:52:33.993

Reputation: 6 324