What sense does it make for "sharpness" to be adjustable on a monitor?

93

15

Modern monitors often have a "sharpness" setting.

But I don't really understand how it makes sense for such a setting to exist.

The software asks the monitor to display a particular pattern of 32-bit RGB values, right?
i.e. The OS might ask the monitor every frame to display a particular 1920×1080×32 bitmap.

But adjusting "sharpness" means letting nearby pixel values affect each other, which would seem to imply that the input is no longer being faithfully represented, meaning the monitor would no longer be displaying what it is asked to display, which doesn't make sense. So I just don't see where this leaves any logical room for sharpness adjustment.

Where exactly does the degree of freedom for adjusting sharpness come from?

user541686

Posted 2018-01-04T21:35:25.973

Reputation: 21 330

30Funny, last time I saw sharpness was on a CRT. Back then it let you control the timing between red, green, and blue pulses to line them up appropriately. If it was slightly too fast or too slow, the three phosphors would be kind of smeared. If you lined them up just right, the image was as sharp as possible. Obviously that doesn’t apply at all to modern displays. – Todd Wilcox – 2018-01-05T04:31:06.530

8You're commenting on a lot of answers that they don't answer your question; it might help if you said why they don't. It's worth noting that the monitor is not required by law to display what the software asks it to. It could, for example, change every third pixel to green, or invert them. So the reason that the monitor can "fuzzify" images is that enough customers wanted it; it could just as easily stretch the image, but no-one wanted that, so they don't. So there are only 2 types of answer: how is this technically done, or why do people want this? – Richard Tingle – 2018-01-05T10:25:14.783

You have two questions : "What sense does it make...?" and "Where exactly does the degree of freedom...". I think that is causing confusion – Christian Palmer – 2018-01-05T10:31:17.170

3@RichardTingle: I suppose the answer to "why do people want this" could potentially also answer my question, yeah. Though I'm more trying to understand what sense this makes from a technical perspective. To give you an example, it actually makes sense to me to have a sharpness parameter for displaying RAW photos, because the mapping from the sensor data to pixels is inherently an under-determined problem (i.e. it doesn't have a unique solution) with degrees of freedom like sharpness, noise level, etc. But bitmaps are pixels, so where is the room for adjusting things like "sharpness"? – user541686 – 2018-01-05T10:32:49.273

@ChristianPalmer: Possibly? They're the same question to me though. See here as to how/why.

– user541686 – 2018-01-05T10:35:43.357

2@ToddWilcox: For what it's worth, at least HP EliteDisplay monitors (LCD) have a "Sharpness" setting, and it works even with a digital input (i.e. not VGA). – sleske – 2018-01-05T11:35:42.700

2I might add that I have had to adjust "Sharpness" to compensate for users who have eyeglasses or contacts in some cases. – PhasedOut – 2018-01-05T19:13:40.500

Digital cameras also let you adjust the "sharpness" of the image. Assuming the camera is capable of taking an in-focus image, why would this be needed? User preference... – JPhi1618 – 2018-01-05T20:42:34.197

1Most modern displays with analog inputs can adjust the clock and phase automatically to ensure that the image is sharp, and I assume this is what you're referring to. If it's not an analog input, then it's almost certainly some form of post-processing. – bwDraco – 2018-01-06T18:52:09.283

Another topic that should not be forgotten here is accessibility. You might not care about being able to set sharpness/brightness/contrast but there are certainly people who require being able to set it to make sure that they can see the image correctly. – Bobby – 2018-01-07T10:02:38.493

Too much sharpness can be tiring to the eyes, especially if you have sensitive eyes. Personally, I soften the appearance most of the time; maybe because I have pretty sensitive eyes (able to see light flicker at frequencies invisible to many/most other people), and softening might be a way to reduce sensory load. – phresnel – 2018-01-08T10:34:47.703

One-word answer: legacy. Longer: CRTs needed the setting, now it's a matter of aesthetics. – Chuck Adams – 2018-01-09T23:28:38.623

Answers

81

Per https://www.cnet.com/uk/how-to/turn-down-your-tv-sharpness-control/ , "sharpness" on an LCD is part of the post-processing.

Even leaving out rescaling/upsampling (e.g. if you try to display an SD signal on an HD monitor), and the complexities of colour calibration, the monitor does not always display the image as given. This is an unfortunate side effect of marketing.

Monitor manufacturers like to distinguish their product from other products. From their point of view, if you feed the same signal to their monitor and a cheaper competitor, and it looks identical, this is bad. They want you to prefer their monitor. So there's a bunch of tricks; usually out of the box the brightness and contrast are wound right up beyond what is sensible. The "sharpness" is another trick. How do you make your picture look sharper than a competitor that is displaying the picture exactly as sent? Cheat.

The "sharpness" filter is effectively that used in Photoshop and similar programs. It enhances the edges so they catch the eye.

pjc50

Posted 2018-01-04T21:35:25.973

Reputation: 5 786

7+1 thanks, I think you're getting pretty close to answering the heart of my question! Based on your answer it seems the direct answer would be "no, it doesn't make any scientific sense, but they don't care and do it anyway because it helps with marketing". – user541686 – 2018-01-05T10:11:15.530

8Most LCDs I've seen only allow changing the sharpness for analogue signals, which makes perfect sense, of course. Connect the display through DVI/HDMI, and you get no sharpness controls (in fact, you lose most of the signal controls, since they only really make sense for non-perfect signals). It's also common in LCDs that are hybrid monitor/television displays - since TVs have those features (for better or worse), so do these displays just to keep feature parity. If you think it's just pointless marketing, try watching a VHS on a display that assumes a pixel-perfect signal :) – Luaan – 2018-01-05T13:08:48.890

3It's not just marketing. The "pixels" aren't in straight lines, nor are they single units like the name pixel would suggest. – HackSlash – 2018-01-05T17:17:53.360

2@HackSlash: How do you know the pixels aren't in straight lines? Have you seen my monitor? – user541686 – 2018-01-06T05:43:57.573

26The monitor can never display the image as given, and it's not because of manufacturing. Colors, brightness, blacks, resolutions - all vary from monitor to monitor. A digital signal might say, 'light pixel x at 50% brightness.' What is 50% brightness? Or what is 100 out of 200 brightness? Absolutely depends on the monitor hardware and programming. What's blue? or Yellow? That's why there are adjustments to begin with. So a user can make the picture look the way they think it should look or to enjoy. The idea that a signal is represented 100% accurately on any monitor is ridiculous. – Appleoddity – 2018-01-06T16:55:26.280

1@Mehrdad Well as soon as the image has a different resolution than the monitor it can't be represented "as is". Also, different monitors will reproduce colours differently (because of different technology, quality of the panel, production variance, etc.), so there have to be options to calibrate the monitor. – Jannik Pitt – 2018-01-07T21:19:11.657

"no, it doesn't make any scientific sense, but they don't care and do it anyway because it helps with marketing" Well said. There is SO many questions that same sentence answers... (: – xDaizu – 2018-01-08T11:59:42.747

4Some of the comments are technically correct (like "The monitor can never display the image as given"), but the answer is correct that what's going on is filtering (& that it's done for the sake of deception). Moreover, it's filtering with some negative coefficients, which inherently introduces hideous artifacts; once you realize this you'll never be able to look at it again. :-) – R.. GitHub STOP HELPING ICE – 2018-01-08T18:46:40.150

I don't need to see your monitor to know that you have multiple colored phosphors inside each "pixel". It is unfortunate that this answer was accepted because it is wrong. – HackSlash – 2018-01-15T17:00:05.433

@HackSlash there are plenty of non-CRT monitors that don't have phosphors and yet offer a sharpness adjustment. – pjc50 – 2018-01-15T20:45:58.987

@pjc50 Can you cite one example? – HackSlash – 2018-01-15T21:09:47.053

LCDs aren't phosphor-based (unless you count the white LED backlight, which doesn't make sense). The rarer OLED aren't phosphor-based either. Your answer re: subpixel rendering - isn't this normally done on the computer side, where it's actually adjustable? – pjc50 – 2018-01-16T13:09:58.393

67

Original Question: Where exactly does the degree of freedom for adjusting sharpness come from?

Sharpness is directly related to the type of signal and content you are viewing. Movies typically look better when sharpness is turned down and the pixels are allowed to blur together a bit. On the other hand, a computer display would want high sharpness for clear text and sharp images. Video games are another example where higher sharpness is better. Low-quality TV signals can also be enhanced with sharpness controls.

Since monitors can be used to display a computer screen, a movie, or virtually any other video source, sharpness is still a useful setting.

https://www.crutchfield.com/S-biPv1sIlyXG/learn/learningcenter/home/tv_signalquality.html

EDIT: The OP has indicated in comments that this does not answer the question.

OP: Where in the problem is there room for any adjustment? Like if I tell you x = 1 and y = 2, and then say "oh, and I want x - y = 3". That makes no sense.

The process of converting a live image/video to electrical analog/digital signals, transmitting over some medium, and recreating that image on a display device is NEVER a 1 to 1 process.

Signal noise, compression loss, manufacturing and equipment variations, cabling/signal type, and other factors come into play. All the adjustments on a monitor are designed to work together to give the end user the highest quality viewing experience - according to the end user. The interpretation is entirely subjective.

OP: This answer does not answer the question of why have the viewer adjust the sharpness when this is already defined by the content creator (be it Spielberg or Excel).

If we are to follow this logic, then why do monitors need or have ANY adjustments at all? The answer is that what we see on the screen is not a 100% accurate representation of the original data.

Appleoddity

Posted 2018-01-04T21:35:25.973

Reputation: 9 360

13This doesn't answer my question at all... I wasn't asking what sharpness is. – user541686 – 2018-01-05T08:54:45.823

5@Martijn: That wasn't my question... – user541686 – 2018-01-05T09:19:14.033

13@Mehrdad Can you rephrase your question, then? It seems multiple answerers (and their voters) do not understand it the way you do. I also took it as "why do LCDs support sharpness?" (which is answered by this one), or perhaps "how do LCDs support sharpness?" (which is answered by Aganju). If your question is neither of those, you should clarify it. – Angew is no longer proud of SO – 2018-01-05T09:37:33.177

@Angew: I'm unsure how to rephrase "Where exactly does the degree of freedom for adjusting sharpness come from?"... it is about as clear as I can think of how to make it. When you read that sentence do you really find it equivalent to "Why do LCDs support sharpness?"? – user541686 – 2018-01-05T09:43:56.207

3@Mehrdad I actually found it closer to "how do LCDs support sharpness", which was explained by Aganju but still rejected by you. – Angew is no longer proud of SO – 2018-01-05T09:46:33.507

2@Angew: That is most definitely not what I am asking. I am asking what sense it makes to adjust sharpness when the monitor is already fully specifying the image via the RGB values. Where in the problem is there room for any adjustment? Like if I tell you x = 1 and y = 2, and then say "oh, and I want x - y = 3". That makes no sense. You don't get to arbitrarily control x - y when you already specified x and y. It just makes zero sense. It's the same problem I'm trying to understand here. Where does that flexibility to specify sharpness (analogous to x - y above) come from? That's my question. – user541686 – 2018-01-05T09:55:37.710

@Mehrdad So are you asking why the sharpness is done in the HW of the monitor, instead of some SW (or potentially HW) setting in the computer before it sends the signal through the cable? – Angew is no longer proud of SO – 2018-01-05T09:58:10.947

@Angew: Not exactly, but at least that's a bit closer to what I'm asking... – user541686 – 2018-01-05T09:59:11.637

It's effectively the same as the photoshop "sharpen" filter - see my answer. – pjc50 – 2018-01-05T10:08:55.123

But in the case of "Movies typically look better when sharpness is turned down", why isn't the file itself de-sharpened? Since it's a software effect anyway (albeit software on the monitor) – Richard Tingle – 2018-01-05T10:30:49.427

@RichardTingle Also my question. Sharpness should be determined by the source. I imagine the sharpness in a grey action packed scene to be set lower in post production than a pan of a sunlit Grand Canyon. This answer does not answer the question of why have the viewer adjust the sharpness when this is already defined by the content creator (be it Spielberg or Excel). – LVDV – 2018-01-05T10:52:14.360

How does this in any way shape or form answer the question? – Frisbetarian – 2018-01-05T13:04:56.790

2@Mehrdad your question has been answered. What you’re refusing to acknowledge is that monitors usually don’t display the image in the way the original signal intended it. Spielberg can’t guess what his movie will look like from one monitor to the next. Not to mention an analog signal will never be a pixel-for-pixel representation of the original signal, if that happens to come into play. You might as well have asked why the monitor has a contrast, brightness or color temperature setting. Why does the monitor have any image adjustment at all, if your logic is to make any sense? – Appleoddity – 2018-01-05T13:22:29.693

@Mehrdad because monitors are already not accurate. So by allowing the users to change some values they can avoid complaints. Appleoddity answered this correctly. – LateralTerminal – 2018-01-05T16:48:13.297

3The cable and signal on modern monitors is digital. It either gets perfect transmission or bad data. There is no possible signal loss or degradation like you have in an analog signal. Inside the monitor the digital signal is converted into something the panel understands, and that is usually digital too now. The only analog part is the brightness level of each element, and those are driven by a digital value. – HackSlash – 2018-01-05T17:08:05.550

@HackSlash - agreed, but that is only one aspect of reproducing an accurate image. And, if I am to use the OPs terminology - "Spielberg" doesn't know if you're using analog or digital. VGA is still very much in use. But, even if we are to focus on just the digital signal and the monitor - even the same exact model of monitor, side by side, may not look the same. Manufacturing variations alone cause differences. Digital signals are also compressed, maybe not from computer to screen, but from camera -> transmission medium -> screen, they most certainly are. – Appleoddity – 2018-01-05T17:14:42.897

1We are talking about sharpness in the display, after the display received a perfect instruction of the exact value of each pixel it should display. All the things you are talking about happen before the LCD panel receives the perfect image signal. Please read my answer about subpixel rendering to understand why the monitor can't just display the image it was given. The answer is that it doesn't have single pixels. It's trying to reproduce the image with subpixels. There are 3 or 4 elements behind each "pixel" in a digital monitor that are in different physical locations. That's the answer. – HackSlash – 2018-01-05T17:22:48.147

@HackSlash I'm sorry you're just wrong. 99% of monitors are NOT accurate! It's even worse when you work in the print industry it is immediately evident. Why do you think we still have to use Pantone colors for accuracy after all these years? You think we want to? – LateralTerminal – 2018-01-05T17:37:31.307

@LateralTerminal did you read my post about subpixel rendering? It explains WHY monitors are not accurate. I'm not sure what your misunderstanding is. – HackSlash – 2018-01-05T17:39:37.563

1@HackSlash ah I misread your comment totally. I upvoted your answer. That's the real answer like you said. The pictures really help explain the answer better than text does. – LateralTerminal – 2018-01-05T17:42:00.910

1Why is this answer at the top? It not only answers the wrong question, but it's not even correct (it strongly implies small signal loss affects digital output quality, which is false) – BlueRaja - Danny Pflughoeft – 2018-01-05T19:46:55.853

@BlueRaja-DannyPflughoeft that is actually not what it implies at all. Please re-read the answer for a better understanding and note in the previous paragraph I say “digital/analog” signals. And let’s not forget, if you truly understand electronics, there is no such thing as a digital piece of silicon. So yes, somewhere within the entire system, even a “digital” signal can be distorted. Your brain is not digital and your eyes don’t see 1s and 0s. – Appleoddity – 2018-01-05T21:37:49.860

@Appleoddity that is utter nonsense. Everything is analog. That makes the word "digital" completely meaningless. This answer is still wrong. We have words like digital for a reason. – HackSlash – 2018-01-15T17:02:05.633

@Appleoddity, in this last comment you're misunderstanding the nature of digital line discipline. As an oversimplified example (seems warranted here), a logical "0" (false) in TTL can be say 0v, 0.05v or 1.1v, among higher values. The variance can be from noise. But it's still a 0. Further, your assertion that we "don't see 1's and 0's" is true, and I'll further assert that we simply don't see like a CCD/CMOS array or other raster, however that is entirely a different subject. – tgm1024--Monica was mistreated – 2019-06-18T14:19:57.170

@tgm1024 you’re digging up quite an old thread my friend. Anyways, if I think back, the point I am trying to make does not conflict with your comment. I agree with it. The point is that the transistor on the actual pixel, or the liquid crystal itself has variations. Other analog circuitry responsible or processing these signals can be less than accurate in reproducing it. Yes, I expect data to arrive on each end of a digital communication unmodified. But, by the time light is produced and it reaches your retina, it is no longer digital. – Appleoddity – 2019-06-18T14:24:13.653

36

You are correct that for a perfect reproduction on the input, the monitor should simply present each pixel as it is delivered.

However, your eyes (and your brain) don't see pixels as separate entities, they see a picture formed from pixels. Depending on what it represents, a picture looks 'better' (more attractive) if parameters are intentionally 'falsified'.

Sharpness typically increases the contrast at color-change edges. For example, a letter in this text is represented by rows of pixels; one line might look (simplified) like 2-2-2-2-7-7-7-2-2-2-2, where 2 is light gray and 7 is dark gray. Increasing the 'sharpness' increases the brightness falloff at the edge, so the last 2 before the first 7 becomes even lighter (= 1), and the first 7 after the last 2 becomes even darker (= 8). Repeat for the other edge, and you get 2-2-2-1-8-7-8-1-2-2-2. This will look a lot 'sharper' to your eyes.
This is done in both dimensions, and in a somewhat more sophisticated way, but that should convey the basic idea.
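This kind of edge boost can be reproduced with the classic second-derivative "peaking" kernel (-1, 2, -1). A minimal numpy sketch, with the gain of 0.2 picked purely so that it lands on the same numbers as the example above:

    import numpy as np

    row = np.array([2, 2, 2, 2, 7, 7, 7, 2, 2, 2, 2], dtype=float)

    # Second-derivative "peaking"; edge-pad so the flat ends stay flat.
    peaking = np.convolve(np.pad(row, 1, mode="edge"), [-1, 2, -1], mode="valid")

    sharpened = row + 0.2 * peaking
    print(sharpened)   # [2. 2. 2. 1. 8. 7. 8. 1. 2. 2. 2.]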

Edit: I thought I made that clear in my answer, but the OP claims he didn’t understand it:
OP Question: ‘what sense does it make’ -> Answer: it appears sharper to your brain.
Many people want that; if you don’t care for it, don’t use it.

Aganju

Posted 2018-01-04T21:35:25.973

Reputation: 9 103

18@Mehrdad Then what is your question? You've gotten an answer for "what sense [usability wise] does it makes for sharpness to be adjustable on a monitor" from Appleoddity. You've gotten an answer to "what sense [technology wise] does it makes for sharpness to be adjustable on a monitor" from Aganju. If neither of those are your intended question, then you need to rewrite it so that your asked question matches your expected question. – Birjolaxew – 2018-01-05T10:49:06.230

6@Mehrdad From your comments your question seem to be "why are monitors allowed to change sharpness when the computer already tells them exactly what to display" - which is answered by Appleoddity. – Birjolaxew – 2018-01-05T10:55:24.653

This answer is functionally the same as the answer given by pjc50. Both are wrong. – HackSlash – 2018-01-15T17:03:20.963

33

The answer is that a pixel is not what you think it is. There is not a 1-to-1 correlation between digital pixels and physical pixels, due to "Subpixel Rendering". The way colors are displayed differs between monitors, but most LCD monitors have distinct RED, GREEN, and BLUE elements arranged in a triangle. Some additionally have a white subpixel, making a quad of elements per "pixel".

[Image: examples of LCD subpixel layouts]

Thus, not all layouts are created equal. Each particular layout may have a different "visual resolution", or modulation transfer function limit (MTFL), defined as the highest number of black and white lines that may be simultaneously rendered without visible chromatic aliasing.

Monitor drivers allow renderers to correctly adjust their geometry transform matrices in order to correctly compute the values of each color plane, and take the best profit of subpixel rendering with the lowest chromatic aliasing.

The "sharpness" on your monitor reduces the natural blending algorithm used to make lines appear to be contiguous when they are not. Turning the sharpness up will increase chromatic aliasing while producing cleaner lines. Reducing the sharpness will give you better color blending and smooth the lines that fall between the subpixel dot pitch.

For more detailed information, see this article: https://en.wikipedia.org/wiki/Subpixel_rendering
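As a rough, hypothetical illustration of the trade-off (not what any particular monitor's firmware actually does): on an RGB-stripe row, driving each subpixel directly from a subpixel-resolution signal preserves the finest detail but makes the colour channels of a pixel disagree (chromatic aliasing), while blending neighbouring subpixels first reduces the fringing at the cost of detail.

    import numpy as np

    # Hypothetical: luminance sampled at subpixel resolution, with black/white
    # stripes one subpixel wide (24 samples = 8 pixels on an RGB-stripe panel).
    subpixel_luma = np.tile([0.0, 1.0], 12)

    # "Sharp": each consecutive triple drives one pixel's R, G and B directly.
    pixels_sharp = subpixel_luma.reshape(-1, 3)   # first pixels: (0,1,0), (1,0,1) -> colour fringing

    # "Soft": blend each subpixel with its neighbours before assigning it,
    # which reduces the colour fringing at the cost of fine detail.
    blurred = np.convolve(subpixel_luma, [1/3, 1/3, 1/3], mode="same")
    pixels_soft = blurred.reshape(-1, 3)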

HackSlash

Posted 2018-01-04T21:35:25.973

Reputation: 3 174

17

You're absolutely right that setting sharpness on your monitor somewhat "distorts" the image from the pixel-accurate data as sent by the computer (or whatever is attached to the other end of the video cable). However, it allows the user to improve their visual experience if the sharpness of the pixel-accurate data being sent does not correspond to their desired sharpness in the image they're viewing.

So the monitor is in effect not doing this:

  1. Receive bitmap from cable
  2. Render bitmap
  3. Goto 1.

but this:

  1. Receive bitmap from cable
  2. Modify bitmap based on user's preferences
  3. Render bitmap
  4. Goto 1.

So the degree of freedom for adjusting sharpness is explicitly added in by the monitor manufacturer, for the purpose of improving user experience.
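In code form, the second pipeline amounts to something like the hypothetical sketch below (not any real monitor's firmware; it operates on a single colour channel for brevity). The key point is that a sharpness of 0 is a pure pass-through:

    import numpy as np

    def process_frame(frame: np.ndarray, sharpness: float) -> np.ndarray:
        """Hypothetical monitor-side post-processing of one colour channel."""
        if sharpness == 0:
            return frame                      # display exactly what was sent
        # Crude separable second-derivative peaking along rows and columns.
        k = np.array([-1.0, 2.0, -1.0])
        peaking = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, frame)
        peaking += np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, frame)
        return np.clip(frame + sharpness * peaking, 0, 255)

    frame = np.random.randint(0, 256, (1080, 1920)).astype(float)
    assert np.array_equal(process_frame(frame, 0.0), frame)   # 0 = faithful reproduction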

Angew is no longer proud of SO

Posted 2018-01-04T21:35:25.973

Reputation: 356

1Comment from slebetman on his own answer seems to confirm this as a user demand: "I just googled LCD sharpness adjustment in case there were any LCD screens had them. What I found were hilarious questions from people who wanted to adjust sharpness on LCD screens when they were first introduced" – LVDV – 2018-01-05T11:10:25.097

13

The software asks the monitor to display a particular pattern of 32-bit RGB values, right? i.e. The OS might ask the monitor every frame to display a particular 1920×1080×32 bitmap.

That's not how VGA works at all. At the monitor level there are no pixels at all.

How displays traditionally worked before the age of LCD is this:

  1. Software asks the device driver to display a bitmap image

  2. Device driver splits the image into three waveforms for R, G and B. That's right, waveforms! Exactly like audio waveforms. Now, these waveforms have a specific format, because while audio is 1-D, pictures are 2-D.

  3. The analog signals for the lines on the screen are sent to the monitor.

The monitor never sees a pixel, it only sees lines.

  4. The monitor spits out electrons moving at nearly light speed from three electron guns, and the beams are deflected by a group of controlling electromagnets, causing them to paint the entire screen.

Here is where the sharpness control comes in.

Due to manufacturing tolerances the electron beams almost never converge correctly and produce blurry pictures right off the assembly line. In the really old days it was up to you, the person who bought the monitor, to adjust the sharpness at home. Later, more modern versions of these ancient displays had an automatic adjustment process at the factory, but the sharpness adjustment still had to be built in for the process to work.

So the answer is really simple. The sharpness adjustment is there to ensure the picture on the display is sharp.
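To make the "waveforms, not pixels" point concrete, here is a rough numerical sketch (purely illustrative values) of how one scanline of pixel intensities becomes a band-limited analog signal, so the beam never sees hard pixel boundaries:

    import numpy as np

    # One scanline of 8 pixel intensities, as the driver would rasterize it.
    scanline = np.array([0, 0, 1, 0, 1, 1, 0, 0], dtype=float)

    # On the wire there are no discrete pixels, just a voltage over time:
    # oversample each pixel period...
    oversample = 16
    waveform = np.repeat(scanline, oversample)

    # ...and low-pass it to mimic the limited bandwidth of amplifier and
    # cable, which rounds off the transitions between pixel periods.
    taps = np.hanning(oversample)
    analog_line = np.convolve(waveform, taps / taps.sum(), mode="same")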

slebetman

Posted 2018-01-04T21:35:25.973

Reputation: 547

4I just googled LCD sharpness adjustment in case there were any LCD screens that had them. What I found were hilarious questions from people who wanted to adjust sharpness on LCD screens when they were first introduced – slebetman – 2018-01-05T09:20:08.250

The LCD screen I use has a sharpness parameter, which is what made me ask this question. The bit about CRT displays is nice but unfortunately doesn't seem relevant. – user541686 – 2018-01-05T09:27:35.733

@Mehrdad Is it a VGA LCD screen or pure DVI/HDMI? – slebetman – 2018-01-05T09:37:20.707

I use VGA, but I believe it also supports DVI. It's not clear to me what the use of waveforms rather than digital signals has to do with letting the user adjust sharpness. (?) – user541686 – 2018-01-05T09:47:02.560

3There are no pixels in the waveforms, only lines; a certain amount of blurring is unavoidable as the signal is transmitted. Usually it's not visible, but at HD I can usually tell the difference between a VGA signal and a HDMI/DVI/DP signal. – pjc50 – 2018-01-05T10:07:55.930

@pjc50: Right, but I'm asking about the intentional sharpening (or blurring) added by the user, not the one that occurs unavoidably during signal transmission. – user541686 – 2018-01-05T10:18:31.633

@Mehrdad It's still relevant, since you can use the settings to tweak the blurring/sharpening to be more comfortable for you in particular. Different people have different preferences, and different signal sources have different distortions. This is especially true for people with visual impairments, old people, old analogue signal sources... I kind of expected you to be using VGA, since on most displays I've seen the sharpness setting is disabled for digital signals - it makes perfect sense for VGA/composite etc. since the signal isn't anywhere near to pixel perfect. – Luaan – 2018-01-05T13:14:12.100

This answer is only true of CRT displays. – HackSlash – 2018-01-05T17:11:16.777

6

On a (digital) TV, sharpness controls a peaking filter that enhances edges. That is not so useful on a display used as a computer monitor.

In the previous century, on a high-end analog CRT monitor, sharpness may have controlled the focus voltage of the electron gun. This affects the spot size with which the picture is drawn. Set the spot size too small (too sharp) and the line structure becomes too visible. Also there may be annoying "Moiré" interference with the structure of the shadow mask. The optimum setting depends on the resolution (sample rate) of the picture, as many CRT monitors were capable of multiple resolutions without scaling (multi-sync). Set it just sharp enough.

High-end CRT TVs had Scan Velocity Modulation, where the scanning beam is slowed down around a vertical edge, and also horizontal and vertical peaking filters and perhaps a horizontal transient improvement circuit. Sharpness may have controlled any or all of these.

Sharpening in general enhances edges by making the dark side of the edge darker, the bright side brighter, and the middle of the edge steeper. A typical peaking filter calculates a 2nd order differential, in digital processing e.g. (-1,2,-1). Add a small amount of this peaking to the input signal. If you clip off the overshoots then it reduces to "transient improvement".
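A minimal numpy sketch of that description (illustrative only; a real TV clips relative to the local edge levels rather than to global black/white as done here):

    import numpy as np

    # Illustrative 8-bit luma ramp across an edge (16 = black, 235 = white).
    signal = np.array([16, 16, 30, 80, 160, 220, 235, 235], dtype=float)

    # Second-order differential peaking with the (-1, 2, -1) kernel,
    # edge-padded so the flat regions at both ends stay flat.
    peaking = np.convolve(np.pad(signal, 1, mode="edge"), [-1, 2, -1], mode="valid")

    peaked = signal + 0.5 * peaking         # steeper edge plus over/undershoots
    improved = np.clip(peaked, 16, 235)     # clip off the overshoots -> "transient improvement"
    # improved: 16, 16, 16, 65, 170, 235, 235, 235 -- a steeper edge, no ringing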

On some digital devices, the sharpness of a scaler may be controlled, e.g. in my digital satellite TV receivers. This sets the bandwidth of the polyphase filters of a scaler, which converts from a source resolution to the display resolution. Scaling cannot be perfect, it is always a compromise between artefacts and sharpness. Set it too sharp and annoying contouring and aliasing are visible.

This may be the most plausible answer to your question, but only if the monitor is scaling. It would do nothing for an unscaled 1:1 mode.

Source: 31 years of experience in signal processing for TV.

StessenJ

Posted 2018-01-04T21:35:25.973

Reputation: 241

This post is 100% correct for analog tech like CRTs. The OP is asking about sharpness on an all digital LCD display. He thinks, like you do, that "it would do nothing on an unscaled 1:1 mode" but in fact it does. See my answer about subpixel rendering in LCD displays. – HackSlash – 2018-01-05T17:15:46.343

I am not aware of any monitor applying subpixel rendering, except maybe a Pentile portable display. It is usually done by ClearType font rendering software on the PC. – StessenJ – 2018-01-06T21:19:50.540

1

This PhD thesis treats subpixel rendering very well, in Ch.3: https://pure.tue.nl/ws/files/1861238/200612229.pdf . Essentially subpixel rendering deals with a color convergence error of +/- 1/3 pixel. CRTs don't need it, it is done implicitly by the shadow mask (sampling).

– StessenJ – 2018-01-06T21:28:49.293

In that paper the process of assembling subpixels into a coherent "pixel" is called "spatial reconstruction". They call the "pixel" an "aperture" when talking about the subpixels working together. The subpixels are color phosphors and are clearly shown on page 28. These phosphors are not always used as a clear set. You could use the red phosphor from one set or any adjacent set depending on the "sharpness" of the lines. – HackSlash – 2018-01-15T17:16:41.480

5

It doesn't make sense. Or at least it doesn't on most LCD monitors. You will almost always want your "sharpness" set to 0, depending on the monitor or TV (some will blur the signal at 0, so the real unfiltered setting might be somewhere in the middle), otherwise, it will apply an edge enhancement filter, which makes the darker side of an edge darker and the lighter side lighter. This is especially noticeable on cartoons and text. Your mileage may vary, but I think it looks bad in nearly every case.

This is a lossy, irreversible filter that you will probably not want to be activated. Your computer is sending pixel-perfect data, so "sharpness" and blurring filters are generally undesirable.

Also note that the "sharpness" filter/setting is a misnomer. It is impossible to make an image sharper (i.e. give it more detail); you can only remove detail. The only way to get a sharper image is to use a higher-definition source image.

Beefster

Posted 2018-01-04T21:35:25.973

Reputation: 159

5

Sharpness settings exist on LCD panels because manufacturers think digital effects will sell more monitors and TVs. Rather than faithfully represent the input from the computer, the manufacturer gives the user options to tweak the picture to suit personal tastes, however poor those tastes may be.

"Sharpness" is relevant for analog signals (like VGA) and for CRT displays, where the signal is represented by waveforms at some point. Because analog tends to be imprecise, sharpness settings allow calibration for tolerances and compensation for imperfections in analog display output and signal transmission.

Sharpness ought to be irrelevant on LCD panels using DVI, HDMI, and other "pixel-perfect" data sources with a 1:1 resolution mapping. Yes, sharpness distorts the picture in this scenario. Displays in big-box stores often have sharpness and other digital filters cranked to extremes to appear more dramatic than the surrounding displays. Some consumers might actually want these effects because they have grown accustomed to the filters' effects or because they are trying to compensate for a poor-quality LCD panel that looks bad to the eye at native output. Sharpness might also be relevant when using a digital signal that must be resized because the source and display have different resolutions.

Overall, you probably want sharpness set to Off or 0 on a modern LCD display with a 1:1 digital signal.

http://hifi-writer.com/wpblog/?page_id=3517 and https://forums.anandtech.com/threads/why-does-an-lcd-tv-with-hdmi-input-need-a-sharpness-control.2080809/

bendodge

Posted 2018-01-04T21:35:25.973

Reputation: 131

4

Many monitors can accept a video signal that doesn't have the same resolution as the panel, and attempt to scale it as appropriate. If a monitor which is 1280 pixels wide is called upon to display an image which is 1024 pixels wide, and the source material consists of black and white stripes that are one pixel wide, the display would likely show a repeating 5-pixel pattern. On a scale of 0-4, the pattern would likely be 03214. If the black and white stripes in the original are "meaningful", showing them as above may be helpful. On the other hand, the 5-pixel repeating pattern would be a distraction which isn't present in the original. Adding some blur to the image would reduce the aliasing effects of scaling.
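For what it's worth, a small sketch (assuming a centre-aligned linear-interpolation scaler; a real monitor's scaler may well behave differently) happens to reproduce exactly that repeating 0-3-2-1-4 pattern:

    import numpy as np

    src_w, dst_w = 1024, 1280
    stripes = np.tile([0.0, 4.0], src_w // 2)     # 1-pixel stripes on a 0-4 scale

    # Centre-aligned linear interpolation, one common choice for a scaler.
    x = (np.arange(dst_w) + 0.5) * src_w / dst_w - 0.5
    i0 = np.clip(np.floor(x).astype(int), 0, src_w - 1)
    i1 = np.clip(i0 + 1, 0, src_w - 1)
    frac = x - np.floor(x)
    scaled = stripes[i0] * (1 - frac) + stripes[i1] * frac

    print(np.round(scaled[5:15]).astype(int))     # [0 3 2 1 4 0 3 2 1 4]

A slight blur smears that 0-3-2-1-4 beat back toward an even grey, which is the trade-off described above.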

supercat

Posted 2018-01-04T21:35:25.973

Reputation: 1 649

-1

Different settings are good for different content. These settings could also be changed at the source, but I, and probably many others, don't know where you can change the sharpness setting on a PC.

So there is an easy to access menu on the monitor, where sharpness can be changed.

Christian

Posted 2018-01-04T21:35:25.973

Reputation: 117