Chromatic adaptation

Chromatic adaptation is the human visual system’s ability to adjust to changes in illumination in order to preserve the appearance of object colors. It is responsible for the stable appearance of object colors despite the wide variation of light which might be reflected from an object and observed by our eyes. A chromatic adaptation transform (CAT) function emulates this important aspect of color perception in color appearance models.

An object may be viewed under various conditions. For example, it may be illuminated by sunlight, the light of a fire, or a harsh electric light. In all of these situations, human vision perceives that the object has the same color: a red apple always appears red, whether viewed at night or during the day. On the other hand, a camera with no adjustment for light may register the apple as having varying color. This feature of the visual system is called chromatic adaptation, or color constancy; when the correction occurs in a camera it is referred to as white balance.

Though the human visual system generally does maintain constant perceived color under different lighting, there are situations where the relative brightness of two different stimuli will appear reversed at different illuminance levels. For example, the bright yellow petals of flowers appear dark compared to the green leaves in dim light, while the opposite is true during the day. This is known as the Purkinje effect, and it arises because the peak sensitivity of the human eye shifts toward the blue end of the spectrum at lower light levels.

Von Kries transform

The von Kries chromatic adaptation method is a technique that is sometimes used in camera image processing. The method is to apply a gain to each of the human cone cell spectral sensitivity responses so as to keep the adapted appearance of the reference white constant. The application of Johannes von Kries's idea of adaptive gains on the three cone cell types was first explicitly applied to the problem of color constancy by Herbert E. Ives,[1][2] and the method is sometimes referred to as the Ives transform[3] or the von Kries–Ives adaptation.[4]

The von Kries coefficient rule rests on the assumption that color constancy is achieved by individually adapting the gains of the three cone responses, the gains depending on the sensory context, that is, the color history and surround. Thus, the cone responses from two radiant spectra can be matched by appropriate choice of diagonal adaptation matrices D_1 and D_2:[5]

    D_1 S f_1 = D_2 S f_2

where S is the cone sensitivity matrix and f is the spectrum of the conditioning stimulus. This leads to the von Kries transform for chromatic adaptation in LMS color space (responses of long-, medium-, and short-wavelength cone response space):

    D = D_1^{-1} D_2 = diag(L_2/L_1, M_2/M_1, S_2/S_1)

This diagonal matrix D maps cone responses, or colors, in one adaptation state to corresponding colors in another; when the adaptation state is presumed to be determined by the illuminant, this matrix is useful as an illuminant adaptation transform. The elements of the diagonal matrix D are the ratios of the cone responses (Long, Medium, Short) between the two illuminants' white points.
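
A minimal numerical sketch of this diagonal scaling, written in Python with NumPy, is shown below. The function name and the sample cone-response values are illustrative assumptions rather than values from the text; the only operation performed is the per-cone gain implied by the diagonal matrix D.

```python
import numpy as np

def von_kries_adapt_lms(lms, lms_white_src, lms_white_dst):
    """Von Kries coefficient rule: scale each cone response by the ratio of
    destination to source white-point cone responses."""
    gains = np.asarray(lms_white_dst, dtype=float) / np.asarray(lms_white_src, dtype=float)
    return gains * np.asarray(lms, dtype=float)

# Illustrative values only: a stimulus and two white points in LMS space.
lms_sample  = [0.40, 0.35, 0.25]
lms_white_1 = [0.95, 1.00, 1.05]   # white point under the first illuminant
lms_white_2 = [1.00, 1.00, 1.00]   # white point under the second illuminant
print(von_kries_adapt_lms(lms_sample, lms_white_1, lms_white_2))
```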

The more complete von Kries transform, for colors represented in XYZ or RGB color space, includes matrix transformations into and out of LMS space, with the diagonal transform D in the middle.[6]
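
As a sketch of that complete pipeline, the following Python/NumPy function converts XYZ to a cone-like space, applies the diagonal scaling, and converts back. The Bradford matrix is used here as one common choice of cone-response matrix (the text does not prescribe a particular one), and the D65 and D50 white points are the standard CIE 1931 2° observer values; swapping in another matrix, such as Hunt–Pointer–Estévez or CAT02, changes only the constant.

```python
import numpy as np

# Bradford cone-response matrix, a widely used XYZ -> cone-like transform
# (as used in ICC-profile chromatic adaptation).
M_BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def von_kries_adapt_xyz(xyz, xyz_white_src, xyz_white_dst, M=M_BRADFORD):
    """Full von Kries transform: XYZ -> cone space, diagonal scaling, -> XYZ."""
    lms       = M @ np.asarray(xyz, dtype=float)
    lms_w_src = M @ np.asarray(xyz_white_src, dtype=float)
    lms_w_dst = M @ np.asarray(xyz_white_dst, dtype=float)
    D = np.diag(lms_w_dst / lms_w_src)       # diagonal adaptation matrix
    return np.linalg.inv(M) @ D @ lms

# Adapt a color from a D65 white point to a D50 white point
# (CIE 1931 2-degree-observer white points, Y normalized to 1).
d65 = [0.95047, 1.0, 1.08883]
d50 = [0.96422, 1.0, 0.82521]
print(von_kries_adapt_xyz([0.20, 0.30, 0.40], d65, d50))
```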

CIE color appearance models

The International Commission on Illumination (CIE) has published a set of color appearance models, most of which include a chromatic adaptation function. CIE L*a*b* (CIELAB) performs a "simple" von Kries-type transform in XYZ color space,[7] while CIELUV uses a Judd-type (translational) white point adaptation.[8] The two successive comprehensive color appearance models, CIECAM97s and CIECAM02, each include a CAT function, CMCCAT97 and CAT02 respectively.[7] CAT02's predecessor[9] is a simplified version of CMCCAT97 known as CMCCAT2000.[10]
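
For concreteness, here is a hedged sketch of CAT02 under the simplifying assumption of complete adaptation (degree of adaptation D = 1), where it reduces to a von Kries-style diagonal scaling in its "sharpened" RGB space; CIECAM02's partial-adaptation factor and luminance-dependent terms are not modeled here.

```python
import numpy as np

# CAT02 "sharpened" cone-response matrix from CIECAM02.
M_CAT02 = np.array([
    [ 0.7328, 0.4296, -0.1624],
    [-0.7036, 1.6975,  0.0061],
    [ 0.0030, 0.0136,  0.9834],
])

def cat02_adapt(xyz, xyz_white_src, xyz_white_dst):
    """CAT02 assuming complete adaptation (D = 1): a von Kries-style
    diagonal scaling in CAT02 RGB space."""
    rgb       = M_CAT02 @ np.asarray(xyz, dtype=float)
    rgb_w_src = M_CAT02 @ np.asarray(xyz_white_src, dtype=float)
    rgb_w_dst = M_CAT02 @ np.asarray(xyz_white_dst, dtype=float)
    rgb_adapted = (rgb_w_dst / rgb_w_src) * rgb
    return np.linalg.inv(M_CAT02) @ rgb_adapted
```

Structurally this is the same matrix–diagonal–inverse-matrix pattern as the full von Kries transform above; the practical difference lies in the sharpened cone matrix, which was optimized against corresponding-colors datasets rather than derived from cone spectral sensitivities.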

References

  1. Ives HE (1912). "The relation between the color of the illuminant and the color of the illuminated object". Trans. Illuminat. Eng. Soc. 7: 62–72. (Reprinted in: Brill, Michael H. (1995). "The relation between the color of the illuminant and the color of the illuminated object". Color Res. Appl. 20: 70–5. doi:10.1002/col.5080200112.)
  2. Hannah E. Smithson and Qasim Zaidi (2004). "Colour constancy in context: Roles for local adaptation and levels of reference". Journal of Vision. 4 (9): 693–710. doi:10.1167/4.9.3. PMID 15493964.
  3. Hannah E. Smithson (2005). "Review. Sensory, computational and cognitive components of human color constancy". Philosophical Transactions of the Royal Society. 360 (1458): 1329–46. doi:10.1098/rstb.2005.1633. PMC 1609194. PMID 16147525.
  4. Karl R. Gegenfurtner, L. T. Sharpe (1999). Color Vision: From Genes to Perception. Cambridge University Press. ISBN 0-521-00439-X.
  5. Gaurav Sharma (2003). Digital Color Imaging Handbook. CRC Press.
  6. Erik Reinhard (2006). High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting. Morgan Kaufmann. ISBN 0-12-585263-0.
  7. Luo, Ming Ronnier (2015). "CIE Chromatic Adaptation; Comparison of von Kries, CIELAB, CMCCAT97 and CAT02". Encyclopedia of Color Science and Technology. Springer Berlin Heidelberg: 1–8. doi:10.1007/978-3-642-27851-8_321-1. ISBN 978-3-642-27851-8.
  8. Judd, Deane B. (January 1940). "Hue saturation and lightness of surface colors with chromatic illumination". JOSA. 30 (1): 2–32. doi:10.1364/JOSA.30.000002.
  9. Fernandez-Maloigne, Christine, ed. (2013). Advanced Color Image Processing and Analysis (PDF). New York, NY: Springer. p. 33. ISBN 9781441961891.
  10. Li, Changjun; Luo, M. Ronnier; Rigg, Bryan; Hunt, Robert W. G. (February 2002). "CMC 2000 chromatic adaptation transform: CMCCAT2000". Color Research & Application. 27 (1): 49–58. doi:10.1002/col.10005.
