What are the differences between watching a 1080p and a 720p version of the same media on a 720p screen?

86

22

Say I have 1280x720 pixels on my screen and the current resolution is set to 1280x720 as well. What are the differences between the 1080p version and the 720p version of the same media? Is it noticeable to an end user (a non-video expert or enthusiast)?

Clarification Edit: The question was in regard to .mkv files rather than YouTube videos.

Varaquilex

Posted 2014-10-01T20:56:09.060

Reputation: 3 620

Answers

91

Theory

The differences probably won't be noticeable to an untrained eye. The 1080p video has to be downscaled anyway. The result won't be exactly the same, though, because compression and scaling are applied in a different order.

Let's assume the original video was 1080p. In this case the 720p version was first scaled, then compressed, while the 1080p clip was first compressed server-side, then scaled on your machine. The 1080p file will obviously be bigger (otherwise it would offer higher resolution but at lower quality, ruining the visual experience and defeating the point of using a higher resolution1).

Lossy compression usually causes visual artifacts that appear as square blocks with noticeable edges when the video is paused, but that aren't visible at normal playback framerate. The 1080p file will contain more square blocks (caused by compression) than the 720p video, but those blocks will be of approximately the same size in both videos.

Doing simple math, we can calculate that the 1080p video will contain 2.25 times more such blocks, so after scaling it down to 720p those blocks will cover 2.25 times less area than in the actual 720p video. The smaller those blocks are, the better the quality of the final video, so the 1080p video will look better than the 720p video, even on a 720p screen. The resized 1080p video will appear slightly sharper than the actual 720p clip.
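
A quick back-of-the-envelope check of those numbers (just arithmetic; the 16×16 block size below is only an illustration, roughly matching an H.264 macroblock):

px_1080 = 1920 * 1080
px_720 = 1280 * 720
print(px_1080 / px_720)   # 2.25: the 1080p frame has 2.25x the pixels, hence ~2.25x the blocks

scale = 1280 / 1920       # 2/3 linear factor when downscaling 1080p to 720p
print(16 * scale)         # a 16x16 block shrinks to ~10.7 px across after downscaling,
                          # i.e. it covers 2.25x less area than a 16x16 block in native 720p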

Things get a bit more complicated if the source material was larger than 1080p. The 1080p clip is first scaled down to 1080p and compressed before you play it, then scaled once again during playback. The 720p clip is scaled only once and then compressed. The intermediate scaling step in the 1080p case makes its quality slightly worse2. Compression makes the 720p clip even worse, though, so 1080p wins anyway.

One more thing: it's not only the video that is compressed, but the audio too. When people decide to use a higher bitrate1 for video compression, they often do the same for audio, so the 1080p version of the same video may also offer better sound quality than the 720p one.


1: The bitrate is the factor that decides how good the compressed video looks at the cost of file size. It's specified manually when the video is compressed and determines how much disk space can be used for every frame (or time unit) of compressed video. Higher bitrate = better quality and a bigger file. Using the same bitrate with the same framerate will produce files of (approximately) the same size, no matter what the video resolution is, but the higher the resolution, the less disk space can be spent on each pixel, so increasing the output resolution without increasing the bitrate can make the compressed video look worse than it would at a lower output resolution.

2: Try it yourself: open a photo in any editor and scale it down slightly, then again and again, and save it as PNG. Then open the original photo again and scale it to the same size in one step. The second attempt will give better results.
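
Here's a minimal sketch of that experiment in Python with Pillow (photo.jpg is a placeholder filename; the exact outcome depends on the image and the resampling filter):

from PIL import Image

SRC = "photo.jpg"  # any reasonably large photo

# Multi-step: shrink by 10% five times in a row
img = Image.open(SRC)
for _ in range(5):
    img = img.resize((int(img.width * 0.9), int(img.height * 0.9)), Image.LANCZOS)
img.save("multi_step.png")

# Single step: shrink the original straight to the same final size
Image.open(SRC).resize(img.size, Image.LANCZOS).save("single_step.png")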


Test

@Raestloz asked for actual videos for comparison in a comment. I couldn't find 1080p and 720p versions of the same video to compare, so I made my own.

I have used uncompressed frames from the "Elephant's Dream" movie (http://www.elephantsdream.org/), which are available under CC-BY 2.5. I downloaded frames 1-6000 and converted them into videos using ffmpeg and the following batch file:

ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 500k -an -s 1280x720 -f mp4 _720p_500k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 700k -an -s 1280x720 -f mp4 _720p_700k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 1125k -an -s 1280x720 -f mp4 _720p_1125k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 4000k -an -s 1280x720 -f mp4 _720p_4000k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 500k -an -f mp4 _1080p_500k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 700k -an -f mp4 _1080p_700k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 1125k -an -f mp4 _1080p_1125k.mp4
ffmpeg -framerate 24 -i %%05d.png -c:v libx264 -b:v 4000k -an -f mp4 _1080p_4000k.mp4
  • 24 fps
  • 1080p and 720p
  • four constant bitrates for each resolution:
    • 500 kbps
    • 700 kbps
    • 1125 kbps
    • 4000 kbps

500 kbps is low enough for compression artifacts and distortions to appear in the 720p video. 1125 kbps is the proportional bitrate per pixel for 1080p (500 × 2.25 = 1125, where 2.25 = 1920×1080 / 1280×720). 700 kbps is an intermediate bitrate to check whether using a bitrate much lower than proportional makes sense for 1080p. 4000 kbps is high enough to produce nearly lossless video at both resolutions, for comparing resized 1080p to native 720p.
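
For reference, here's roughly how those bitrates translate into a per-pixel bit budget at 24 fps (simple arithmetic, nothing codec-specific):

def bits_per_pixel(bitrate_kbps, width, height, fps=24):
    """Average bit budget per pixel per frame at the given bitrate."""
    return bitrate_kbps * 1000 / (width * height * fps)

for kbps in (500, 700, 1125, 4000):
    print(kbps, "kbps:",
          round(bits_per_pixel(kbps, 1280, 720), 3), "bpp at 720p /",
          round(bits_per_pixel(kbps, 1920, 1080), 3), "bpp at 1080p")
# 500 kbps at 720p and 1125 kbps at 1080p both work out to ~0.023 bits per pixel,
# which is what makes 1125 kbps the "proportional" bitrate.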

Then I split the videos back into single frames. Extracting all frames is slow and takes a lot of space (true story), so I recommend using ffmpeg's -r switch to extract only every 8th frame (i.e. -r 3 for a 24 fps video).
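
If you want to script that step, here's a sketch that calls ffmpeg from Python (input.mp4 and the frames directory are placeholder names; -r 3 assumes a 24 fps source as above):

import pathlib
import subprocess

pathlib.Path("frames").mkdir(exist_ok=True)
# -r 3 on a 24 fps clip keeps roughly every 8th frame (24 / 3 = 8)
subprocess.run(["ffmpeg", "-i", "input.mp4", "-r", "3", "frames/%05d.png"], check=True)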

I can't provide future-proof download links for the videos, but these steps can easily be replicated to create clips like mine. For the record, here are the output file sizes (roughly identical for both resolutions, because the bitrate is constant per second):

  • 500 kbps: 13.6 MB / 13.7 MB
  • 700 kbps: 18.8 MB / 19 MB
  • 1125 kbps: 29.8 MB / 30.2 MB
  • 4000 kbps: 105 MB / 105 MB
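
As a sanity check, the expected size is roughly bitrate × duration (6000 frames at 24 fps is 250 seconds); the real files land a little under these estimates, which is normal for single-pass encoding with just -b:v:

duration_s = 6000 / 24  # 250 seconds of video

for kbps in (500, 700, 1125, 4000):
    size_mb = kbps * 1000 * duration_s / 8 / 1e6  # bits -> bytes -> megabytes
    print(f"{kbps} kbps -> ~{size_mb:.1f} MB expected")
# ~15.6 MB, ~21.9 MB, ~35.2 MB and ~125 MB respectively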

Downloads for samples of extracted frames are available at the end of this post.

Increasing bitrate and resolution

Here's a comparison of the same region cropped from both versions of frame 2097 after scaling to 720p. Look at the fingers, the heads and the piece of equipment hanging from the ceiling: even going from 500 to 700 kbps makes a noticeable difference. Note that both images are already scaled to 720p.

Different pictures of frame 2097 side-by-side

Frame 3705. Notice the rug's edge and the cable:

Different pictures of frame 3705 side-by-side

Frame 5697. This is an example of a frame that doesn't compress very well. The 1080p 700 kbps video is less detailed than the 720p 500 kbps clip (the ear's edge). Skin details are lost in all compressed frames.

Different pictures of frame 5697 side-by-side

GIFs of all three frames, with increasing bitrate. (Unfortunately I had to use dithering because GIF is limited to a 256-color palette, so some pixels are a bit off.)

Constant bitrate, different resolutions

Inspired by @TimS.'s comment, here is the same region from frame 2097 with 720p and 1080p side by side.

Effect of resolution change without changing bitrate

For 500 kbps, 720p is a bit better than 1080p. 1080p appears sharper, but those details aren't actually present in the uncompressed image (the left guy's trousers). At 700 kbps I'd call it a draw. Finally, 1080p wins at 1125 kbps: both stills look mostly identical, but the picture on the right has more pronounced shadows (the pipes on the back wall and in the lower right part).

Very high bitrate

@Noah asked a good question in the comments: will both images look identical with high enough bitrate? Here's 720p 4000 kbps vs. 1080p 4000 kbps vs. uncompressed frame 5697:

Comparison of high-bitrate 720p and 1080p stills to uncompressed frame

Now this is pretty subjective, but here's what I can see:

  • Left edge of the ear is pixelated in 720p, but smooth in 1080p, despite identical bitrate.
  • 720p preserves cheek skin details better than 1080p.
  • Hair looks a bit sharper in 1080p.

It's scaling that starts to play a role here. One could intuitively expect 1080p to look worse than 720p on a 720p screen, because scaling always affects quality. That's not exactly true in this case, because the codec I used (H.264, though other codecs behave similarly) has some imperfections: it creates little boxes that are visible on contrasting edges. They appear in the 1080p snapshot too (see the links at the bottom), but resizing to 720p causes some details to be lost; in particular, it smooths out these boxes and improves quality.

Okay, so let's calculate the difference between the 720p (left) and 1080p (right) frames and the original frame, and stretch the contrast so it's clearly visible:

Stretched difference between compressed and original frames

This image gives an even clearer view of what is going on. Black pixels are reproduced perfectly in the compressed (and resized to 720p) frames; colored pixels are off proportionally to their intensity.

  • The cheek is much closer to the original in the 720p half, because scaling smoothed out the skin details in the right half.
  • The ear's edge isn't that close to the uncompressed pixels, but it's better in 1080p. Again, artifacts are visible in the 720p half; they would appear in unresized 1080p too, but scaling smoothed them out with quite good results.
  • The hair seems better in 720p because it's closer to black, but in reality it looks like random noise. 1080p, on the other hand, has its distortions lining up with the hair edges, so it actually emphasizes the hair lines. It's probably the magic of scaling again: the "noise" increases when scaling, but it also starts to make sense.
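
For anyone who wants to reproduce such difference images, here's a minimal sketch with Pillow (filenames are placeholders, and the ×8 contrast stretch is an arbitrary choice, not necessarily what was used above):

from PIL import Image, ImageChops

TARGET = (1280, 720)

# Bring both the uncompressed reference frame and a compressed frame to 720p
original = Image.open("original_5697.png").convert("RGB").resize(TARGET)
compressed = Image.open("compressed_5697.png").convert("RGB").resize(TARGET)

# Per-pixel absolute difference: black = identical, brighter = further off
diff = ImageChops.difference(original, compressed)

# Crude contrast stretch so that small differences become visible
diff.point(lambda v: min(255, v * 8)).save("diff_5697.png")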

Disclaimer

This test is purely synthetic and doesn't prove that real-life 1080p video looks better than 720p when played on a smaller screen. However, it shows a strong relationship between video bitrate and the quality of video resized to the screen size. We can safely assume that a 1080p video will have a higher bitrate than its 720p counterpart, so it will offer more detailed frames, most of the time enhancing the viewer's experience. It's not the resolution that plays the most important part, but the video bitrate, which is higher in 1080p videos.

Using an insanely high bitrate for a 720p video won't make it look better than 1080p. Post-compression downscaling can be beneficial for 1080p, because it will shape compression noise and smooth out artifacts. Increasing the bitrate doesn't compensate for the lack of extra pixels to work with, because lossy codecs aren't perfect.

In rare cases (very detailed scenes) higher resolution, higher bitrate videos may actually look worse.

What's the difference between this artificial test and real-life video?

  • I have assumed at least a 40% higher bitrate for 1080p than for 720p. Looking at the results, I guess 20% would be enough to notice a quality improvement, but I haven't tested it. A proportional increase in bitrate provides much better results, even if the lower resolution matches what the screen uses, but it's unlikely to be used in real life. (Still, it's proportional, @JamesRyan.)
  • Real-life videos usually use variable bitrate (VBR). I went with 1-pass constant bitrate (CBR), hoping that it would make all the unpleasant compression side effects more obvious.
  • Different codecs can react in different ways. This test was conducted using the popular H.264 codec.

Once again: I don't claim that this post proves anything. My test is based on an artificially-made video; YMMV with realistic examples. Still, the theory is probably true; there's nothing to suggest it's wrong (except for the scaling issue, which the test deals with).

Concluding, in most cases a 1080p video will look better than a 720p video, no matter what the screen resolution is.


Downloads:

gronostaj

Posted 2014-10-01T20:56:09.060

Reputation: 33 047

I'd agree on the potentially higher audio bitrate, but I'm not sure about the image quality. A sharper image requires more pixels to actually make it sharper; displaying an HD image on an SD display will not make any difference since the pixels needed to actually make the image sharper just aren't there – Raestloz – 2014-10-02T04:58:41.417

@Raestloz For uncompressed video or lossless codecs that's true, but lossy compression creates those square blocks I talk about, which actually act like big pixels. Scaling 1080p video down to 720p will make them smaller than in actual 720p video. (assuming that bitrate per pixel is approximately identical in both files) – gronostaj – 2014-10-02T06:28:38.943

That'd depend on the way the 720p video was produced I guess, .mkv is merely a container after all. Is there a short video where this phenomenon (blocks visibility on actual 720p vs 1080p scaled down) can be seen? My monitor is actually 1280x1024 so I can test it out (btw I don't have 1080p videos, I figured it'd be just useless for my monitor) – Raestloz – 2014-10-02T07:45:28.677

@Raestloz I've added synthetic tests with my interpretation of results. – gronostaj – 2014-10-02T21:21:33.383

I see a lot of wrong answers when it comes to questions like this, but your answer is excellent. Of some note is that lossy compression does not always create "blocks" -- the exact artifacting depends on many factors, but based on the knowledge shown in your answer, I figure this was an intentional simplification for clarity. – Charles Burns – 2014-10-02T22:17:05.303

Is the scaling down of 1080p media more CPU-intensive than playing 720p natively? – AStopher – 2014-10-03T07:25:52.227

@zyboxenterprises Surely it will be more _PU-intensive, the question is which processor will handle it. Playing 1080p video alone is already more demanding and scaling to 720p adds even more computational overhead. I believe it's handled by GPU, because video decoding is offloaded to GPU (on Windows it's usually done with DXVA, I think). I'm not absolutely sure about that, though. – gronostaj – 2014-10-03T08:02:18.480

Does this imply that a 720p video with a "high enough" bitrate will be virtually indistinguishable from a 1080p video with a comparably scaled bitrate? (e.g. is there some sort of bitrate "butter zone" that equalizes both formats?) – Noah – 2014-10-03T18:01:54.397

Would also be interesting to see what a 720p 2250 bitrate version of that same frame looks like. I'd expect it to be better than either of the current ones. – Tim S. – 2014-10-03T19:10:03.090

Your bitrates are not even remotely proportional or realistic. Basically you have fabricated evidence adding an artificial ceiling on the 720p quality to support your original false assumption. Eg. Youtube bitrate limits allow 24% extra for 1080p, not over double! – JamesRyan – 2014-10-03T22:51:51.790

@Noah I have updated the answer with better tests. Tl;dr: no, high bitrate will not compensate for lacking pixels and codec imperfections. – gronostaj – 2014-10-04T23:41:45.163

@TimS. Interesting idea, I've checked if you're right. In my opinion 1080p looks a bit better with high bitrate, but it's quite subjective. I have added test frames, check what you'll see. – gronostaj – 2014-10-04T23:43:50.990

@JamesRyan If you have a reliable way of downloading YouTube videos to compare bitrates, I'll be happy to try it (downloading 1080p is impossible since DASH streaming has been disabled). Note that OP's not asking about streaming video, but regular files. I don't call my results evidence and I have explicitly said that it's not a proof, but I have emphasized the disclaimer even more to avoid confusion. – gronostaj – 2014-10-04T23:46:37.923

You can use APNG for the animations. That'll provide much better quality than the old GIFs

– phuclv – 2014-10-05T16:26:27.123

@LưuVĩnhPhúc APNG doesn't seem to work in Chrome. GIFs aren't really as bad as I expected, I think I'll stick to them for now unless somebody knows some widely-supported alternative. – gronostaj – 2014-10-05T20:26:19.507

I've recently seen a talk at a conference which showed that on 4K displays, in some sequences (especially lower bit rates) users actually preferred the 1080p version over the 4K one. (Kongfeng Berger, Yao Koudota, Marcus Barkowsky and Patrick Le Callet. Subjective quality assessment comparing UHD and HD resolution in HEVC transmission chains, QoMEX 2015) – slhck – 2015-06-03T10:49:28.327

5

It depends heavily on the media.

For example, if you have a 720p display and you play a 1080p YouTube video, I can see the difference when things move. It's just that tiny bit more detailed. Then again, someone with less trained eyes probably won't see a difference.

Aside from that, 1080p requires more rendering power to display, and the quality increase, if it's there at all, is just not worth it. My recommendation would be to go for 720p no matter what.

LPChip

Posted 2014-10-01T20:56:09.060

Reputation: 42 190

Show people the same image and say one is technically better, and there will always be people with 'superior eyesight' who claim to see the improvement. These are the same people who can hear the 'clarity' of gold plated digital audio cables. The fact is the scaled-down 1080p version will actually be slightly worse than the native 720p version. – JamesRyan – 2014-10-02T10:21:36.587

@JamesRyan It is much harder to see with a still image than it is with a moving frame. Notice how I said: when things move. – LPChip – 2014-10-02T11:32:15.680

1

@JamesRyan The quality loss from scaling is dwarfed by the quality loss from compression. A 1080p YouTube video uses a significantly higher bitrate than a 720p YouTube video, and this results in a higher quality video, regardless of scaling. Ideally you'd use a 720p video with a high bitrate for 720p screens, but YouTube doesn't deliver such videos, so 1080p is the best there is. If you want to see artefacts I recommend using a Starling video https://www.youtube.com/watch?v=M1Q-EbX6dso There are still some visible artefacts in 1080p, but it is far better than 720p, even when scaled to a quite small window.

– aaaaaaaaaaaa – 2014-10-03T18:47:54.960

@eBusiness A 1080p video has more pixels to encode than a 720p one; the bitrate of 1080p on YouTube is LOWER per pixel than 720p! And then you are throwing the additional information away. – JamesRyan – 2014-10-03T22:43:38.640

@JamesRyan There is largely no information in a YouTube 1080p video that has pixel accuracy in the first place; the colour channel is only half resolution and the whole thing is heavily lossy-compressed, so the information lost by downscaling is almost purely compression noise. A good quality DVD, on the other hand, has a low number of pixels but a very high quality per pixel; downscaling that would lead to significant quality loss. – aaaaaaaaaaaa – 2014-10-04T08:19:59.193

@eBusiness a DVD is not 1080p, you are talking nonsense – JamesRyan – 2014-10-04T13:33:51.670

1

  1. The 1080p video would have to be downscaled to fit the 720p screen size. The 720p video would have no scaling applied.

  2. Which version of the video would look better really depends more on the compression level of each video. A 2 GB 720p video would be less compressed and less blocky than a 2 GB 1080p video, as long as the same video settings and the same video codec are used. The 1080p video needs more data for each frame, so more compression artifacts would be visible.
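
As a rough illustration of that tradeoff (a sketch assuming a 2-hour runtime and 24 fps, neither of which is stated above):

size_bits = 2 * 8e9               # 2 GB expressed in bits
duration_s = 2 * 3600             # assumed 2-hour runtime
bitrate = size_bits / duration_s  # ~2.2 Mbit/s regardless of resolution

for w, h in ((1280, 720), (1920, 1080)):
    print(w, "x", h, round(bitrate / (w * h * 24), 3), "bits per pixel per frame")
# At the same file size, 720p gets about 2.25x the per-pixel budget of 1080p.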

nscan

Posted 2014-10-01T20:56:09.060

Reputation: 11

0

The result depends on the resizing algorithm, but the default algorithm in most situations will produce a sharper image. So a 1080p video may look better on a 720p screen, provided that the 720p is the original version and was not scaled down from 1080p or a higher resolution.

Scaling down has another advantage: some algorithms may use subpixel rendering, which increases the effective resolution a bit.


In fact, scaling up and then down is how many antialiasing algorithms work, and it's also how fractional DPI scaling works in OSes like iOS and macOS. For example, at a 150% scale factor the interface is rendered at twice the logical resolution and then scaled down to the physical resolution of the panel, resulting in an effective 3/2 scaling ratio.

  • To have an effective resolution of 1920×1200, macOS renders the screen at 3840×2400 and then scales it down to the panel's real resolution of 2880×1800
  • The Plus models of the iPhone 6-8 use 300% scaling, so the interface is rendered at 1242×2208 and then downsampled to the panel's 1920×1080
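
A small sketch of that arithmetic (assuming the Plus iPhones' logical size of 736×414 points, which isn't mentioned above; the other numbers come from the examples):

def fractional_scaling(logical, physical, render_factor):
    """Render at an integer multiple of the logical size, then downscale to the panel."""
    render = (logical[0] * render_factor, logical[1] * render_factor)
    downscale = physical[0] / render[0]
    effective = physical[0] / logical[0]
    return render, round(downscale, 2), round(effective, 2)

print(fractional_scaling((1920, 1200), (2880, 1800), 2))  # ((3840, 2400), 0.75, 1.5)
print(fractional_scaling((736, 414), (1920, 1080), 3))    # ((2208, 1242), 0.87, 2.61)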

Further reading:

phuclv

Posted 2014-10-01T20:56:09.060

Reputation: 14 930

0

Viewing 1080p video on a 720p screen may cause the quality of the video to be distorted as the player works out which pixels to drop. Chances are it won't be noticeable, but there really is no benefit to 1080p videos over 720p when using a 720p screen.

Caraxian

Posted 2014-10-01T20:56:09.060

Reputation: 121

I'd be really surprised if any decent player would actually cut instead of blend pixels. And you'd need one psychedelic video source to notice distortions of blending. – KillianDS – 2014-10-03T10:15:15.707

0

Depends on your hardware and whether or not it does upscaling/downscaling. Theoretically, 1080 lines compressed down to a set/monitor that can only display 720 lines (that's what 1080/720 refers to: the number of lines in your picture) should improve the image. More is better, right? However, let's look at a 40" TV for instance. 720 lines on a 40" TV comes to 18 lines (or pixels) per inch. 1080 lines on a 40" TV comes to 27 lines per inch. Can your eye tell the difference with an object that small? Probably not. Also, you have to take into account how the video file was processed (you mentioned MKV, which is a video format created on a computer and is therefore dependent on the quality of the video card) and whether or not top-quality hardware/software was used.

So, to answer the question, most people will not notice much difference between the two videos, if at all.

Johnny Bones

Posted 2014-10-01T20:56:09.060

Reputation: 131

Actually, a 16:9 screen with 40" diagonal will be about 34.9" x 19.6", thus 1080 would be about 56 ppi, and 720 about 37 ppi. – e100 – 2014-10-03T16:48:24.120
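
For the record, a quick sketch of that pixel-density calculation (pure geometry for a 16:9 panel with a 40-inch diagonal):

import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch for a panel with the given resolution and diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(round(ppi(1280, 720, 40)))   # ~37 ppi for 720p on a 40" screen
print(round(ppi(1920, 1080, 40)))  # ~55 ppi for 1080p on a 40" screen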

-10

Personally I think the whole 1080 and 720 thing was a screwup in multiple ways. First, 1920x1080 is a 16:9 aspect ratio. Why not make it 2048x1024, so that it's a perfect 2:1 aspect ratio, and because it's well known that computers love powers of 2? For smaller TVs, such as 21" or even smaller, they could just toss away every other pixel in the 2048x1024 signal, thus making an effective 1024x512 resolution, which is plenty for a small TV like one on a kitchen counter.

I also hate that frames with a lot of motion tend to be very blurred, as that is not HD. Real HD would have freeze frames of motion shots just as sharp as non-motion frames, but they are not even close in quality.

But to answer your original question, it depends on many factors. Using still images as a basis for moving images, there are competing factors: starting with a higher resolution and then resizing down usually yields a better quality image, as noise (such as in a blue sky) is cleaned up. The quality is especially good if the downsized image is an exact 50% resolution drop in each dimension, for example taking a 2048x1536 still image (3 MP) and downsizing it to 1024x768 (3/4 MP). The other competing factor is that the more stages of processing there are (in an attempt to save storage space), the worse the image quality usually gets. For example, a JPG originally saved at 2048x1536 in a camera will have artifacts, and when it's resized down and resaved as a smaller JPG, more artifacts will be introduced.

David

Posted 2014-10-01T20:56:09.060

Reputation: 109

Computers wouldn't benefit from a 2:1 aspect ratio at all. Users' productivity and comfort could be hurt, though, because with such a wide screen you'd have to sit either so close that you have to rotate your head all the time or so far away that the screen covers a very small area of your vertical vision. Dropping every second line of pixels is possible with any resolution, and that trick has been used for years in analog TV to fake higher resolutions than are actually transmitted; it's called interlacing. – gronostaj – 2014-10-02T06:41:12.627

Video has to be blurred because that's how human eyes work: they see motion blur. That's why you need only 25 fps for recorded video to feel smooth, but computer games feel choppy below 40 fps: computers don't generate motion blur, but perfectly sharp still images. – gronostaj – 2014-10-02T06:44:00.030

I disagree and feel computers would benefit a lot from a 2:1 aspect ratio and it is not much wider than the existing 16:9. We would basically go from 16:9 to 16:8 aspect ratio. Instead of blurring the motion shots using a slow framerate such as 24 or 25 fps, why not "unblur" them and use a higher framerate such as 50 or 60 fps? Since the quicker frames would be more similar to previous frames, the compression of them should be better, thus balancing out the bandwidth requirements. Oh and my idea of dropping every other pixel is not just for scanlines but in the other dimension too. – David – 2014-10-02T11:51:05.733

Cameras record blurred video. High-framerate cameras are more expensive and the effects aren't as good as expected (e.g. The Hobbit was shot at double the framerate and many people experienced unpleasant effects when watching it). Movies aren't meant to be watched as stills, and de-blurring dynamic movement would require more expensive gear and possibly improvements in currently used codecs and compression algorithms. It would impact movie production costs and wouldn't provide any benefits for viewers. – gronostaj – 2014-10-02T12:10:11.423

2048x1024 has four times as many pixels as 1024x512; you can't just "toss away every other pixel" to go from one to the other. – Air – 2014-10-02T16:26:31.647

16:9 is in squares, David. 16:9 translates roughly to 4:3 aspect ratio (square root of 16/square root of 9) since pixels are square. 16:8 makes no sense. Please RTFM before you answer a question like this. – Johnny Bones – 2014-10-02T18:06:19.310

4

@JohnnyBones You are wrong, where did you come up with that? 16:9 is about 1.78 horizontal pixels for every vertical one while 4:3 is roughly 1.33. They are not similar at all. Take a look at the following image http://g-images.amazon.com/images/G/01/askville/2525729_2946560_mywrite/aspectratio.jpg Also read this http://en.wikipedia.org/wiki/Aspect_ratio_%28image%29

– 2013Asker – 2014-10-02T19:31:42.520

16:9 is a compromise between standard 4:3 television and ~21:9 anamorphic widescreen movies, so that when watching 4:3 or 21:9 content on a 16:9 screen, the amount of letterboxing or pillarboxing required is minimized – chiliNUT – 2014-10-05T02:52:20.233

Real HD to me would be such that any freeze frame would look very sharp, not blurred. The bitrate would have to be high enough to not appear "choppy". What is the point of HD if only the scenes with very little (or no) motion appear to be as sharp as a still image? Another annoying thing is the compression "creeping". I see this when watching a closeup of the grass like from a football game. The grass tends to "move" when it is just that the compression artifacts are changing, making it appear to move. Not a very well thought out technology in my opinion. So easy to spot flaws. – David – 2015-05-24T09:54:51.507