Why does 1366x768 resolution exist?


I know that there's a previous question about this, but it doesn't have any real answers despite having been viewed 12,400 times, and it has since been closed. With that in mind...

Why in the world is 1366x768 resolution a real thing? It has an aspect ratio of 683:384, which is the weirdest thing I've ever heard of while living in a 16:9 world.

All screens and resolutions I've been familiar with have been 16:9 aspect ratio. My screen, 1920x1080, is 16:9. The 720p that I'm familiar with is 1280x720, also 16:9. 4K that I'm familiar with, 3840x2160, is also 16:9. Yet 1366x768 is 683:384, a seemingly wild break from the standard.

I know there are plenty of other resolutions all over the place, but 1366x768 seems to dominate most of the mid priced laptop world and also seems unique to the laptop world. Why don't laptops use 1280x720 or something else as a standard?

meed96

Posted 2015-07-27T22:14:01.843

Question was closed 2015-08-02T20:05:11.470

FYI, 4:3 was the standard ratio for TVs and computers before HDTV overtook it. – Andy – 2015-07-27T23:57:12.317

Before asking this question, did you do the math to consider that 683:384 is ~16.008:9, so not such a "wild" break after all? Certainly not nearly as much so as 1280x800's 16:10. – Random832 – 2015-07-28T00:08:40.373

@Random832 It's not like 16:10 is that weird. 1920x1200 is a completely standard resolution for many 20-24" monitors, especially in more professional settings. – SBI – 2015-07-28T07:22:50.007

The integer ratio that you're fixated on isn't significant in any way. Expressed as a decimal, it's 1.77777... versus 1.77864583..., less than a millimeter of difference on any desktop panel. – Russell Borogove – 2015-07-28T14:15:10.597

I suppose you've never heard of 1024x600, with a 128:75 ratio (or approximately 5:3, if you will)? – Luke – 2015-08-01T18:10:53.913

Answers

According to Wikipedia:

The basis for this otherwise odd seeming resolution is similar to that of other "wide" standards – the line scan (refresh) rate of the well-established "XGA" standard (1024x768 pixels, 4:3 aspect) extended to give square pixels on the increasingly popular 16:9 widescreen display ratio without having to effect major signalling changes other than a faster pixel clock, or manufacturing changes other than extending panel width by 1/3rd. As 768 does not divide exactly into 9, the aspect ratio is not quite 16:9 – this would require a horizontal width of 1365.33 pixels. However, at only 0.05%, the resulting error is insignificant.

Citations are not provided, but it is a reasonable explanation: it's the closest to 16:9 they could get while keeping the 768-pixel vertical resolution of 1024x768, which had been widely used in the manufacture of early 4:3 LCD displays. Perhaps that helped reduce costs.
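
A quick Python sketch of that arithmetic (nothing assumed beyond the numbers quoted above):

    # Width needed for a true 16:9 frame at XGA's 768-pixel height
    height = 768
    exact_width = height * 16 / 9             # 1365.333...
    chosen_width = 1366                       # rounded up to the next even integer

    # Relative error versus a true 16:9 aspect ratio
    error = (chosen_width - exact_width) / exact_width
    print(f"exact width:  {exact_width:.2f}")     # 1365.33
    print(f"chosen width: {chosen_width}")
    print(f"aspect error: {error:.4%}")           # ~0.0488%, the ~0.05% cited above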

mtone

It also makes it easy to pillarbox 4:3 applications designed to run well at 1024x768. – Random832 – 2015-07-28T00:06:40.680

How is this the closest they could get? 1365 is closer than 1366 to 1365.33. – Kaiserludi – 2015-07-28T16:01:35.113

@Kaiserludi Odd numbers are really flaky to deal with. – chrylis -on strike- – 2015-07-28T16:15:40.433

@Kaiserludi In this case, you would want to go slightly above, not slightly below. With 1365 pixels, you would have to cut off the left or right margin of a wide-screen movie, or you would have to scale it vertically to 767 pixels. – Kevin Keane – 2015-07-29T18:00:03.367

@chrylis: Quite the opposite. Memory arrays come in multiples of mebibits. 1366x768 images are a little too big to fit. A 1365x768 resolution would be far easier to deal with. – Marcks Thomas – 2015-07-31T10:21:38.513

@MarcksThomas That's neglecting every single other aspect besides the frame buffer, which for a 24-bit depth would round up anyway. – chrylis -on strike- – 2015-07-31T21:20:08.980

What is meant by "major signalling changes"? – André Chalella – 2015-11-29T21:31:04.897

@MarcksThomas that's why many manufacturers use 1360x768 instead of 1366x768: it fits in 1 MiB of memory. – phuclv – 2018-07-09T08:12:20.913

At the time the first widescreen computer displays became popular, the usual resolution on 4:3 panels was 1024x768 (the XGA display standard). So, for simplicity and backward compatibility, the XGA resolution was kept as the basis for the WXGA resolution, so that XGA graphics could be displayed easily on WXGA screens. Extending the width while keeping the same height was also technically simpler, because you would only have to tweak the horizontal refresh rate timing. However, the standard aspect ratio for wide displays was 16:9, which isn't achievable exactly with a height of 768 pixels, so the nearest value was chosen: 1366x768.

WXGA can also refer to a 1360x768 resolution (and some other, less common ones), which was introduced to reduce costs in integrated circuits. A 1366x768 frame at 8 bits per pixel takes just over 1 MiB to store (1024.5 KiB), so it wouldn't fit in an 8 Mbit memory chip; you would have to use a 16 Mbit one just to hold a few extra pixels. That's why something slightly lower than 1366 was chosen. Why 1360? Because it is divisible by 8 (and even by 16), which is much simpler to handle when processing graphics and can enable optimized algorithms.
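
That memory argument is easy to verify; a minimal Python sketch (assuming, as above, 8 bits per pixel and a 1 MiB, i.e. 8 Mbit, VRAM chip):

    MIB = 1024 * 1024                  # bytes in 1 MiB, i.e. one 8 Mbit chip

    for width in (1366, 1360):
        frame_bytes = width * 768      # one byte per pixel at 8-bit depth
        print(f"{width}x768 = {frame_bytes} bytes "
              f"({frame_bytes / 1024:.1f} KiB), fits in 1 MiB: {frame_bytes <= MIB}")

    # 1360 is also divisible by 8 and even by 16, which keeps word- and
    # tile-aligned graphics operations simple; 1366 is not:
    print(1360 % 16, 1366 % 16)        # 0 6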

piernov

Technically I wasn't asking about 1360x768, but I'm familiar with its existence, and this would explain why both extremely similar resolutions exist. – meed96 – 2015-07-27T23:11:02.187

WXGA generally used 24-bit color (stored in 32 bits), so you'd need a 64 Mbit chip instead of a 32 Mbit one, but the logic still applies. – MSalters – 2015-07-28T10:04:34.810

I had the same question back in 2007, because my computer didn't support my TV's native resolution of 1366x768, and I found this:

WHY does 1366 x 768 exist?

This has to do with a 1 megapixel processing boundary in readily available chipsets for VRAM (video memory) and video display drivers. It's a standard memory size of importance to chip makers, and it makes for cost-effective configurations where the input/output systems are built from already-available OEM devices, so the manufacturer is basically more in the business of making flat-panel glass and handling the bezel/speaker arrangements of a large display. The basic math:

1 megapixel:

1024 x 1024 = 1048576 pixels

1366 x 768 = 1049088 pixels (a 16 by 9 image)

1280 x 720 (720p) = 921600 pixels (the 16 by 9 HD standard)

720p is just under 1 megapixel of data per screen.

If they had really wanted to make a 720p-specific display, it would be 1280 x 720 pixels, but they decided to get every last bit they could into the viewable pixel space, and that is how the 16 by 9 numbers become 1366 across and 768 vertically. In fact, 768 is a common vertical-resolution memory boundary. Why push more pixels into the glass and use 1366 x 768? Because more pixels means better image resolution.

I recommend reading the full article here:

http://hd1080i.blogspot.com.ar/2006/12/1080i-on-1366x768-resolution-problems.html
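
The pixel counts quoted above are easy to check with a quick Python loop (just restating the quote's arithmetic):

    for w, h in ((1024, 1024), (1366, 768), (1280, 720)):
        print(f"{w} x {h} = {w * h} pixels")
    # 1024 x 1024 = 1048576, 1366 x 768 = 1049088, 1280 x 720 = 921600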

Facundo Pedrazzini

And once manufactured in enormous quantities, they became cheap and reduced the price of the notebook/laptop. It's all about costs, folks. – mckenzm – 2015-07-30T01:42:47.680

768 is the sum of two powers of 2 (512 + 256), and 1366 is 16/9 times 768, rounded up to the next integer. But only the person who selected that resolution can answer the "why". Some people just like powers of 2.

Also, 768 times 1366 is just over one mebipixel (2^20), which is about 1.05 megapixel (1.05 x 10^6). 768 times 1365 is just under that, so marketing probably came into play, too.
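
Both observations are quick to confirm in Python (nothing assumed beyond the numbers above):

    import math

    print(512 + 256 == 768)            # True: 768 = 2**9 + 2**8
    print(math.ceil(16 / 9 * 768))     # 1366: 1365.33... rounded up

    MEBIPIXEL = 2 ** 20
    print(1366 * 768 - MEBIPIXEL)      # 512: just over one mebipixel
    print(1365 * 768 - MEBIPIXEL)      # -256: just under it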

Glenn Randers-Pehrson

Extremely interesting correlation with the mebibyte/pixel thing there; I'd never heard of that prefix before. – meed96 – 2015-07-27T22:59:08.157

It's not just that people "like" powers of two; it's that they're very convenient to deal with in the computer world, since a power of 2 is just a bit shift away (or an additional bit on the address bus, etc.). – Johnny – 2015-07-27T23:46:07.123

To paraphrase Johnny, powers of 2 correspond to "how many bits". If you use a number that's not a power of 2, you either need fractional bits (silly, of course) or you have hardware that's not fully usable. For example, to address 200 pixels you need 8 bits, but that wastes part of those 8 bits, because 8 bits can address 256 pixels. So some people just don't like wasting those bits and move up to 256 pixels. – slebetman – 2015-07-28T08:50:12.783

"Just over" a nice power of two is of course rather a disadvantage... – Hagen von Eitzen – 2015-07-28T09:50:05.563

@Johnny Powers of two are convenient to work with, but in most of the graphics work I have done, it is the horizontal resolution where they make the most difference. That makes 1366 a very strange number, since it doesn't divide evenly by powers of two. 1360 would have made a lot more sense and would also have avoided the drawback Hagen mentioned. – kasperd – 2015-07-28T11:03:14.363