Thinking, Fast and Slow

Thinking, Fast and Slow is a best-selling[1] book published in 2011 by Nobel Memorial Prize in Economic Sciences laureate Daniel Kahneman. It was the 2012 winner of the National Academies Communication Award for best creative work that helps the public understanding of topics in behavioral science, engineering and medicine.[2]

Thinking, Fast and Slow
Hardcover edition

  • Author: Daniel Kahneman
  • Country: United States
  • Language: English
  • Subject: Psychology
  • Genre: Non-fiction
  • Publisher: Farrar, Straus and Giroux
  • Publication date: 2011
  • Media type: Print (hardcover, paperback)
  • Pages: 499
  • ISBN: 978-0374275631
  • OCLC: 706020998

The book summarizes research that Kahneman conducted over decades, often in collaboration with Amos Tversky.[3][4] It covers all three phases of his career: his early days working on cognitive biases, his work on prospect theory, and his later work on happiness.

The central thesis is a dichotomy between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical. The book delineates rational and non-rational motivations and triggers associated with each type of thinking process, and how they complement each other, starting with Kahneman's own research on loss aversion. From framing choices to people's tendency to replace a difficult question with one that is easier to answer, the book draws on several decades of academic research to suggest that people place too much confidence in human judgement.[5]

The book also shares many insights from Kahneman's work with the Israel Defense Forces and with the various departments and collaborators that have contributed to his growth as a thinker and researcher.

Summary

Two systems

In the book's first section, Kahneman describes two different ways the brain forms thoughts:

  • System 1: Fast, automatic, frequent, emotional, stereotypic, unconscious. Examples (in order of complexity) of things System 1 can do:
    • determine that an object is at a greater distance than another
    • localize the source of a specific sound
    • complete the phrase "war and ..."
    • display disgust when seeing a gruesome image
    • solve 2+2=?
    • read text on a billboard
    • drive a car on an empty road
    • come up with a good chess move (if you're a chess master)
    • understand simple sentences
    • connect the description 'quiet and structured person with an eye for details' to a specific job
  • System 2: Slow, effortful, infrequent, logical, calculating, conscious. Examples of things System 2 can do:
    • brace yourself before the start of a sprint
    • direct your attention towards the clowns at the circus
    • direct your attention towards someone at a loud party
    • look out for the woman with the grey hair
    • dig into your memory to recognize a sound
    • sustain a faster-than-normal walking pace
    • determine the appropriateness of a particular behavior in a social setting
    • count the number of A's in a certain text
    • give someone your phone number
    • park in a tight parking space
    • determine the price/quality ratio of two washing machines
    • determine the validity of a piece of complex logical reasoning
    • solve 17 × 24

Kahneman covers a number of experiments which purport to highlight the differences between these two thought systems and how they arrive at different results even given the same inputs. Terms and concepts include coherence, attention, laziness, association, jumping to conclusions, WYSIATI (What you see is all there is), and how one forms judgments. The System 1 vs. System 2 distinction raises the question of how far human decision making is actually reasoned, with significant implications for many areas including law and market research.[6]

Heuristics and biases

The second section offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to precisely associate reasonable probabilities with outcomes. Kahneman explains this phenomenon using the theory of heuristics. Kahneman and Tversky originally covered this topic in their landmark 1974 article titled Judgment under Uncertainty: Heuristics and Biases.[7]

Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience. For example, a child who has only seen shapes with straight edges might perceive a circle, on first viewing, as an octagon rather than as a triangle, matching it to the closest straight-edged pattern already known. In a legal metaphor, a judge limited to heuristic thinking would only be able to think of similar historical cases when presented with a new dispute, rather than seeing the unique aspects of that case. In addition to offering an explanation for the statistical problem, the theory also offers an explanation for human biases.

Anchoring

The "anchoring effect" names our tendency to be influenced by irrelevant numbers: experimental subjects who were first shown higher numbers gave higher estimates, and those shown lower numbers gave lower ones.[3]

This is an important concept to keep in mind when navigating a negotiation or considering a price. As an example, most people, when asked whether Gandhi was more or less than 114 years old when he died, will provide a much larger estimate of his age at death than others who were asked whether he was more or less than 35 years old. Experiments show that our behavior is influenced, much more than we know or want, by the environment of the moment.

Availability

The availability heuristic is a mental shortcut that occurs when people make judgments about the probability of events on the basis of how easy it is to think of examples. The availability heuristic operates on the notion that, "if you can think of it, it must be important." The availability of consequences associated with an action is positively related to perceptions of the magnitude of the consequences of that action. In other words, the easier it is to recall the consequences of something, the greater we perceive these consequences to be. Sometimes, this heuristic is beneficial, but the frequencies at which events come to mind are usually not accurate reflections of the probabilities of such events in real life.[8][9]

Substitution

System 1 is prone to substituting a simpler question for a difficult one. In what Kahneman calls their "best-known and most controversial" experiment, "the Linda problem," subjects were told about an imaginary Linda: young, single, outspoken, and very bright, who, as a student, was deeply concerned with discrimination and social justice. Subjects were then asked whether it was more probable that Linda is a bank teller or that she is a bank teller and an active feminist. The overwhelming response was that "feminist bank teller" was more likely than "bank teller," violating the laws of probability (every feminist bank teller is a bank teller). In this case System 1 substituted the easier question, "Is Linda a feminist?", dropping the occupation qualifier. An alternative view is that the subjects added an unstated cultural implicature: that the other answer implied an exclusive or (xor), i.e. that Linda was not a feminist.[3]
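The probability violation behind the Linda problem follows directly from the conjunction rule: for any two events A and B, P(A and B) can never exceed P(A). A minimal sketch, with purely illustrative numbers that are not from the experiment:

```python
# Conjunction rule: P(A and B) = P(A) * P(B|A), which can never exceed P(A)
# because P(B|A) <= 1. The numbers below are made up for illustration.
def p_conjunction(p_a: float, p_b_given_a: float) -> float:
    """P(A and B) computed from P(A) and the conditional P(B|A)."""
    return p_a * p_b_given_a

p_teller = 0.05                          # hypothetical P(Linda is a bank teller)
p_both = p_conjunction(p_teller, 0.40)   # P(bank teller AND active feminist)

# "Feminist bank teller" can never be the more probable description.
assert p_both <= p_teller
```

However plausible the "feminist" detail makes the conjunction feel, the compound event is mathematically a subset of the single event, which is exactly the intuition that System 1 overrides.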

Optimism and loss aversion

Kahneman writes of a "pervasive optimistic bias", which "may well be the most significant of the cognitive biases." This bias generates the illusion of control: the belief that we have substantial control over our lives.

A natural experiment reveals the prevalence of one kind of unwarranted optimism. The planning fallacy is the tendency to overestimate benefits and underestimate costs, impelling people to take on risky projects. In 2002, American kitchen remodeling was expected on average to cost $18,658, but actually cost $38,769.[3]

To explain overconfidence, Kahneman introduces the concept he labels What You See Is All There Is (WYSIATI). This theory states that when the mind makes decisions, it deals primarily with Known Knowns, phenomena it has already observed. It rarely considers Known Unknowns, phenomena that it knows to be relevant but about which it has no information. Finally it appears oblivious to the possibility of Unknown Unknowns, unknown phenomena of unknown relevance.

He explains that humans fail to take into account complexity and that their understanding of the world consists of a small and necessarily un-representative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will mirror a past event.

Framing

Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery if told that the "survival" rate is 90 percent, while others were told that the mortality rate is 10 percent. The first framing increased acceptance, even though the two descriptions are equivalent.[10]

Sunk-cost

Rather than consider the odds that an incremental investment would produce a positive return, people tend to "throw good money after bad" and continue investing in projects with poor prospects that have already consumed significant resources. In part this is to avoid feelings of regret.[10]

Overconfidence

This section of the book is dedicated to the undue confidence in what the mind believes it knows. It suggests that people often overestimate how much they understand about the world and underestimate the role of chance in particular. This is related to the excessive certainty of hindsight, when an event appears to be understood after it has occurred or developed. Kahneman's views on overconfidence are influenced by Nassim Nicholas Taleb.[11]

Choices

In this section Kahneman returns to economics and expands on his seminal work on prospect theory. He discusses the tendency for problems to be addressed in isolation and how, when other reference points are considered, the choice of reference point (called a frame) has a disproportionate impact on the outcome. This section also offers advice on how some of the shortcomings of System 1 thinking can be avoided.

Prospect theory

Kahneman developed prospect theory, the basis for his Nobel prize, to account for choices he observed experimentally that Daniel Bernoulli's traditional utility theory could not explain.[12] According to Kahneman, utility theory makes logical assumptions of economic rationality that do not reflect people's actual choices, and does not take cognitive biases into account.

One example is that people are loss-averse: they are more likely to act to avert a loss than to achieve a gain. Another example is that the value people place on a change in probability (e.g., of winning something) depends on the reference point: people appear to place greater value on a change from 0% to 10% (going from impossibility to possibility) than from, say, 45% to 55%, and they place the greatest value of all on a change from 90% to 100% (going from possibility to certainty). This occurs despite the fact that under traditional utility theory all three changes give the same increase in utility. Consistent with loss-aversion, the order of the first and third of those is reversed when the event is presented as losing rather than winning something: there, the greatest value is placed on eliminating the probability of a loss to 0.
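Both effects can be sketched with simple functions. The functional forms and parameter values below follow Tversky and Kahneman's later (1992) cumulative formulation rather than the book itself, so treat them as an illustrative assumption:

```python
# Sketch of prospect-theory-style functions. Parameters (alpha=0.88,
# lambda=2.25, gamma=0.61) are the 1992 cumulative-prospect-theory estimates,
# used here only to illustrate the qualitative effects described above.
ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

def value(x: float) -> float:
    """S-shaped value function: concave for gains, steeper (convex) for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

def weight(p: float) -> float:
    """Inverse-S probability weighting: overweights small p, underweights large p."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

# Loss aversion: a $100 loss looms larger than a $100 gain.
assert abs(value(-100)) > value(100)

# Possibility and certainty effects: the jumps 0 -> 10% and 90% -> 100% carry
# more decision weight than the intermediate jump 45% -> 55%.
assert weight(0.10) - weight(0.0) > weight(0.55) - weight(0.45)
assert weight(1.0) - weight(0.90) > weight(0.55) - weight(0.45)
```

Under standard utility theory, all three ten-point probability jumps would matter equally; the weighting function captures why the endpoints feel special.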

After the book's publication, the Journal of Economic Literature published a thorough discussion of its take on prospect theory,[13] as well as an analysis of the four fundamental factors that it rests on.[14]

Two selves

The fifth part of the book describes recent evidence which introduces a distinction between two selves, the 'experiencing self' and 'remembering self'.

Two selves

Kahneman proposed an alternative measure that assessed pleasure or pain sampled from moment to moment, and then summed over time. Kahneman called this "experienced" well-being and attached it to a separate "self." He distinguished this from the "remembered" well-being that life-satisfaction polls had attempted to measure. He found that these two measures of happiness diverged.

Life as a story

The author's significant discovery was that the remembering self does not care about the duration of a pleasant or unpleasant experience. Instead, it retrospectively rates an experience by its peak (or valley) and by the way it ends. In Kahneman's studies of medical patients, the remembering self dominated the patient's ultimate evaluation of the experience.
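This "peak-end" pattern is often summarized as: the remembered rating approximates the average of the episode's worst moment and its final moment, with duration ignored. A minimal sketch of that summary rule (an illustrative model, not Kahneman's exact procedure):

```python
# Peak-end rule sketch: remembered rating ~ average of the peak intensity and
# the final intensity of an episode, regardless of how long it lasted.
# (Illustrative model of the finding, not the original experimental method.)
def remembered_pain(samples: list) -> float:
    """Approximate retrospective rating from moment-to-moment pain samples (0-10)."""
    return (max(samples) + samples[-1]) / 2

short_sharp_end = [2, 4, 8]          # ends at its worst moment
longer_gentle_end = [2, 4, 8, 5, 2]  # same peak, milder ending, longer duration

# Duration neglect: the longer episode, which contains strictly more total pain,
# is remembered as LESS bad because it ends mildly.
assert remembered_pain(longer_gentle_end) < remembered_pain(short_sharp_end)
```

This is the logic behind Kahneman's counterintuitive result that extending an unpleasant procedure with a milder final phase improves how it is remembered.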

"Odd as it may seem," Kahneman writes, "I am my remembering self, and the experiencing self, who does my living, is like a stranger to me."[4]

Experienced well-being

Kahneman first took up the study of well-being in the 1990s. At the time most happiness research relied on polls about life satisfaction. Having arrived at the subject from previously studying unreliable memories, the author was doubtful that the life-satisfaction question was a good indicator of happiness. He designed a question that focused instead on the well-being of the experiencing self. The author proposed that "Helen was happy in the month of March" if she spent most of her time engaged in activities that she would rather continue than stop, little time in situations that she wished to escape, and not too much time in a neutral state in which she would not prefer either continuing or stopping the activity.

Thinking about life

Kahneman suggests that focusing on a life event such as a marriage or a new car can provide a distorted illusion of its true value. This "focusing illusion" revisits earlier ideas of substituting difficult questions and WYSIATI.

Awards and honors

The book was recognized in the 2011 Los Angeles Times Book Prizes[15] and was named one of the best books of 2011 by The New York Times,[16] The Globe and Mail,[17] The Economist[18] and The Wall Street Journal.[19]

Reception

As of 2012 the book had sold over one million copies.[20] In the year of its publication, it was on the New York Times Best Seller list.[1] The book was reviewed in media including the Huffington Post,[21] The Guardian,[22] The New York Times,[3] The Financial Times,[23] The Independent,[24] Bloomberg[10] and The New York Review of Books.[25]

The book was widely reviewed in specialist journals, including the Journal of Economic Literature,[13] American Journal of Education,[26] The American Journal of Psychology,[27] Planning Theory,[28] The American Economist,[29] The Journal of Risk and Insurance,[30] The Michigan Law Review,[31] American Scientist,[32] Contemporary Sociology,[33] Science,[34] Contexts,[35] The Wilson Quarterly,[36] Technical Communication,[37] The University of Toronto Law Journal,[38] A Review of General Semantics[39] and Scientific American Mind.[40]

The book was also reviewed in the APS Observer, the magazine of the Association for Psychological Science.[41]

Replication crisis

Part of the book has been swept up in the replication crisis facing psychology and the social sciences. An analysis[42] of the studies cited in chapter 4, "The Associative Machine", found that their R-Index[43] is 14, indicating essentially no reliability. Kahneman himself responded to the study in blog comments and acknowledged the chapter's shortcomings: "I placed too much faith in underpowered studies."[44] Others have noted the irony in the fact that Kahneman made a mistake in judgment similar to the ones he studied.[45]



References

  1. "The New York Times Best Seller List – December 25, 2011" (PDF). www.hawes.com. Retrieved August 17, 2014.
  2. "Daniel Kahneman's Thinking, Fast and Slow Wins Best Book Award From Academies; Milwaukee Journal Sentinel, Slate Magazine, and WGBH/NOVA Also Take Top Prizes in Awards' 10th Year". Retrieved March 10, 2018.
  3. Holt, Jim (November 27, 2011). "Two Brains Running". The New York Times. p. 16.
  4. Daniel Kahneman (October 25, 2011). Thinking, Fast and Slow. Macmillan. ISBN 978-1-4299-6935-2. Retrieved April 8, 2012.
  5. Shaw, Tamsin (April 20, 2017). "Invisible Manipulators of Your Mind". ISSN 0028-7504. Retrieved August 10, 2020.
  6. System 1 VS System 2.
  7. Tversky, Amos; Kahneman, Daniel (1974). "Judgment under Uncertainty: Heuristics and Biases" (PDF). Science. 185 (4157): 1124. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. Archived from the original on March 18, 2012.
  8. Tversky, Amos; Kahneman, Daniel (1982). "Availability: A heuristic for judging frequency and probability" (PDF). In Kahneman, Daniel (ed.). Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press. ISBN 9780521240642.
  9. Tversky, Amos; Kahneman, Daniel (September 1973). "Availability: A heuristic for judging frequency and probability". Cognitive Psychology. 5 (2): 207–232. doi:10.1016/0010-0285(73)90033-9.(subscription required)
  10. Lowenstein, Roger (October 28, 2011). "Book Review: Thinking, Fast and Slow by Daniel Kahneman". Bloomberg.com. Retrieved May 27, 2016.
  11. Kahneman, Daniel (2011). Thinking, fast and slow. London: Penguin Books. pp. 14. ISBN 9780141033570. OCLC 781497062.
  12. Kahneman, Daniel; Tversky, Amos (March 1979). "Prospect Theory: An Analysis of Decision under Risk" (PDF). Econometrica. 47 (2): 263–291. CiteSeerX 10.1.1.407.1910. doi:10.2307/1914185. JSTOR 1914185. Archived from the original on November 17, 2014.
  13. Shleifer, Andrei (2012). "Psychologists at the Gate: A Review of Daniel Kahneman's Thinking, Fast and Slow" (PDF). Journal of Economic Literature. 50 (4): 1080–1091.
  14. Shleifer, Andrei (2012). "Psychologists at the Gate: A Review of Daniel Kahneman's Thinking, Fast and Slow" (PDF). pp. 7–9.
  15. "2011 Los Angeles Times Book Prize Winners & Finalists". Los Angeles Times. Archived from the original on April 16, 2016.
  16. "10 Best Books of 2011". The New York Times. November 30, 2011. ISSN 0362-4331. Retrieved March 10, 2018.
  17. Stein, Janice Gross; et al. "The Globe 100: The very best books of 2011". Retrieved March 10, 2018.
  18. The Economist – Books of the Year 2011.
  19. "The Best Nonfiction of 2011". Wall Street Journal. December 17, 2011.
  20. Cooper, Glenda (July 14, 2012). "Thinking, Fast and Slow: the 'landmark in social thought' going head to head with Fifty Shades of Grey". Daily Telegraph. ISSN 0307-1235. Retrieved February 17, 2018.
  21. Levine, David K. (September 22, 2012). "Thinking Fast and Slow and Poorly and Well". Huffington Post. Retrieved February 17, 2018.
  22. Strawson, Galen (December 13, 2011). "Thinking, Fast and Slow by Daniel Kahneman – review". the Guardian. Retrieved February 17, 2018.
  23. "Thinking, Fast and Slow". Financial Times. Retrieved February 17, 2018.
  24. "Thinking, Fast and Slow, By Daniel Kahneman". The Independent. November 18, 2011. Retrieved February 17, 2018.
  25. Dyson, Freeman (December 22, 2011). "How to Dispel Your Illusions". The New York Review of Books. ISSN 0028-7504. Retrieved February 17, 2018.
  26. Durr, Tony (February 1, 2014). "Thinking, Fast and Slow by Daniel Kahneman". American Journal of Education. 120 (2): 287–291. doi:10.1086/674372. ISSN 0195-6744.
  27. Krueger, Joachim I. (2012). Kahneman, Daniel (ed.). "Reviewing, Fast and Slow". The American Journal of Psychology. 125 (3): 382–385. doi:10.5406/amerjpsyc.125.3.0382. JSTOR 10.5406/amerjpsyc.125.3.0382.
  28. Baum, Howell (2013). "Review of Thinking, fast and slow". Planning Theory. 12 (4): 442–446. doi:10.1177/1473095213486667. JSTOR 26166233.
  29. Brock, John R. (2012). "Review of Thinking, Fast and Slow". The American Economist. 57 (2): 259–261. doi:10.1177/056943451205700211. JSTOR 43664727.
  30. Gardner, Lisa A. (2012). "Review of Thinking, Fast and Slow". The Journal of Risk and Insurance. 79 (4): 1143–1145. doi:10.1111/j.1539-6975.2012.01494.x. JSTOR 23354961.
  31. Stein, Alex (2013). "Are People Probabilistically Challenged?". Michigan Law Review. 111 (6): 855–875. JSTOR 23812713.
  32. Sloman, Steven (2012). "The Battle Between Intuition and Deliberation". American Scientist. 100 (1): 73–75. JSTOR 23222820.
  33. Etzioni, Amitai (2012). Kahneman, Daniel (ed.). "The End of Rationality?". Contemporary Sociology. 41 (5): 594–597. doi:10.1177/0094306112457657b. JSTOR 41722908.
  34. Sherman, Steven J. (2011). "Blink with Muscles". Science. 334 (6059): 1062–1064. Bibcode:2011Sci...334.1062S. doi:10.1126/science.1214243. JSTOR 41351778.
  35. jasper, james m. (2012). "thinking in context". Contexts. 11 (2): 70–71. doi:10.1177/1536504212446467. JSTOR 41960818.
  36. Akst, Daniel (2011). "Rushing to Judgment". The Wilson Quarterly (1976-). 35 (4): 97–98. JSTOR 41484407.
  37. Harrison, Kelly A. (2012). "Review of Thinking, Fast and Slow". Technical Communication. 59 (4): 342–343. JSTOR 43093040.
  38. Richardson, Megan Lloyd (2012). "Review of Thinking, Fast and Slow [sic, included in a set of reviews]". The University of Toronto Law Journal. 62 (3): 453–457. doi:10.1353/tlj.2012.0013. JSTOR 23263811.
  39. Vassallo, Philip (2012). "Review of Thinking, Fast and Slow". ETC: A Review of General Semantics. 69 (4): 480. JSTOR 42579224.
  40. Upson, Sandra (2012). "Cognitive Illusions". Scientific American Mind. 22 (6): 68–69. JSTOR 24943506.
  41. Bazerman, Max H. (October 21, 2011). "Review of Thinking, Fast and Slow by Daniel Kahneman". APS Observer. 24 (10).
  42. "Dr. R" (February 2, 2017). "Reconstruction of a Train Wreck: How Priming Research Went off the Rails". Replicability-Index. Retrieved April 30, 2019.
  43. "Dr. R" (January 31, 2016). "A Revised Introduction to the R-Index". Replicability-Index. Retrieved April 30, 2019.
  44. McCook, Alison (February 20, 2017). ""I placed too much faith in underpowered studies:" Nobel Prize winner admits mistakes". Retraction Watch. Retrieved April 30, 2019.
  45. Engber, Daniel (December 21, 2016). "How a Pioneer in the Science of Mistakes Ended Up Mistaken". Slate Magazine. Retrieved April 30, 2019.