Thinking, Fast and Slow

by Daniel Kahneman

Paperback, $22.00

Overview

Major New York Times bestseller
Winner of the National Academy of Sciences Best Book Award in 2012
Selected by the New York Times Book Review as one of the ten best books of 2011
A Globe and Mail Best Books of the Year 2011 Title
One of The Economist's 2011 Books of the Year
One of The Wall Street Journal's Best Nonfiction Books of the Year 2011
2013 Presidential Medal of Freedom Recipient
Kahneman's work with Amos Tversky is the subject of Michael Lewis's The Undoing Project: A Friendship That Changed Our Minds

In the international bestseller Thinking, Fast and Slow, Daniel Kahneman, the renowned psychologist and winner of the Nobel Prize in Economics, takes us on a groundbreaking tour of the mind and explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. The impact of overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the profound effect of cognitive biases on everything from playing the stock market to planning our next vacation—each of these can be understood only by knowing how the two systems shape our judgments and decisions.

Engaging the reader in a lively conversation about how we think, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and our personal lives—and how we can use different techniques to guard against the mental glitches that often get us into trouble. Winner of the National Academy of Sciences Best Book Award and the Los Angeles Times Book Prize and selected by The New York Times Book Review as one of the ten best books of 2011, Thinking, Fast and Slow is destined to be a classic.


Product Details

ISBN-13: 9780374533557
Publisher: Farrar, Straus and Giroux
Publication date: 04/02/2013
Pages: 512
Sales rank: 44
Product dimensions: 5.40(w) x 8.20(h) x 1.50(d) inches

About the Author

Daniel Kahneman is Eugene Higgins Professor of Psychology Emeritus at Princeton University and Professor of Psychology and Public Affairs Emeritus at Princeton's Woodrow Wilson School of Public and International Affairs. He received the 2002 Nobel Prize in Economic Sciences for his pioneering work with Amos Tversky on decision-making.

Read an Excerpt

THINKING, FAST AND SLOW 

 

Introduction

Every author, I suppose, has in mind a setting in which readers of his or her work could benefit from having read it. Mine is the proverbial office water-cooler, where opinions are shared and gossip is exchanged. I hope to enrich the vocabulary that people use when they talk about the judgments and choices of others, the company’s new policies, or a colleague’s investment decisions. Why be concerned with gossip? Because it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own. Questioning what we believe and want is difficult at the best of times, and especially difficult when we most need to do it, but we can benefit from the informed opinions of others. Many of us spontaneously anticipate how friends and colleagues will evaluate our choices; the quality and content of these anticipated judgments therefore matters. The expectation of intelligent gossip is a powerful motive for serious self-criticism, more powerful than New Year resolutions to improve one’s decision making at work and at home.

To be a good diagnostician, a physician needs to acquire a large set of labels for diseases, each of which binds an idea of the illness and its symptoms, possible antecedents and causes, possible developments and consequences, and possible interventions to cure or mitigate the illness. Learning medicine consists in part of learning the language of medicine. A deeper understanding of judgments and choices also requires a richer vocabulary than is available in everyday language. The hope for informed gossip is that there are distinctive patterns in the errors people make. Systematic errors are known as biases: they recur predictably in particular circumstances. When the handsome and confident speaker bounds to the stage, for example, you can anticipate that the audience will judge his comments more favorably than he deserves. The availability of a diagnostic label for this bias—the halo effect—makes it easier to anticipate, recognize, and understand.

When you are asked what you are thinking about, you can normally answer. You believe you know what goes on in your mind, which often consists of one conscious thought leading in an orderly way to another. But that is not the only way the mind works, nor indeed is it the typical way. Most impressions and thoughts arise in your conscious experience without your knowing how they got there. You cannot trace how you came to the belief that there is a lamp on the desk in front of you, or how you detected a hint of irritation in your spouse’s voice on the telephone, or how you managed to avoid a threat on the road before you became consciously aware of it. The mental work that produces impressions, intuitions, and many decisions goes on in silence in our mind.

Much of the discussion of this book is about biases of intuition. However, the focus on error does not denigrate human intelligence, any more than the attention to diseases in medical texts denies good health. Most of us are healthy most of the time, and most of our judgments and actions are appropriate most of the time. As we navigate our lives, we normally allow ourselves to be guided by impressions and feelings, and the confidence we have in our intuitive beliefs and preferences is usually justified. But not always. We are often confident when we are wrong, and an objective observer is more likely to detect our errors than we are.

So this is my aim for water-cooler conversations: improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them. In at least some cases, an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause.

 

Origins

This book presents my current understanding of judgment and decision making, which has been shaped by psychological discoveries of recent decades. However, I trace the central ideas to the lucky day in 1969 when I asked a colleague to speak as a guest to a seminar I was teaching in the Department of Psychology at the Hebrew University of Jerusalem. Amos Tversky was considered a rising star in the field of decision research—indeed, in anything he did—so I knew we would have an interesting time. Many people who knew Amos thought he was the most intelligent person they had ever met. He was brilliant, voluble, and charismatic. He was also blessed with a perfect memory for jokes and with an exceptional ability to use them to make a point. There was never a dull moment when Amos was around. He was then thirty-two; I was thirty-five.

Amos told the class about an ongoing program of research at the University of Michigan that sought to answer this question: Are people good intuitive statisticians? We already knew that people are good intuitive grammarians: at age four a child effortlessly conforms to the rules of grammar as she speaks, although she has no idea that such rules exist. Do people have a similar intuitive feel for the basic principles of statistics? Amos reported that the answer was a qualified yes. We had a lively debate in the seminar and ultimately concluded that a qualified no was a better answer.

Amos and I enjoyed the exchange and concluded that intuitive statistics was an interesting topic and that it would be fun to explore it together. That Friday we met for lunch at Café Rimon, the favorite hangout of bohemians and professors in Jerusalem, and planned a study of the statistical intuitions of sophisticated researchers. We had concluded in the seminar that our own intuitions were deficient. In spite of years of teaching and using statistics, we had not developed an intuitive sense of the reliability of statistical results observed in small samples. Our subjective judgments were biased: we were far too willing to believe research findings based on inadequate evidence, and prone to collect too few observations in our own research. The goal of our study was to examine whether other researchers suffered from the same affliction.

We prepared a survey that included realistic scenarios of statistical issues that arise in research. Amos collected the responses of a group of expert participants in a meeting of the Society of Mathematical Psychology, including the authors of two statistical textbooks. As expected, we found that our expert colleagues, like us, greatly exaggerated the likelihood that the original result of an experiment would be successfully replicated even with a small sample. They also gave very poor advice to a fictitious graduate student about the number of observations she needed to collect. Even statisticians were not good intuitive statisticians.
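
To see why even trained researchers get this wrong, here is a minimal simulation, not from the book, of the question the survey probed: with a modest true effect and twenty observations per group, a "significant" result replicates far less often than intuition suggests. The sample size, effect size, and two-standard-error significance proxy are all illustrative assumptions.

    import random, statistics

    def significant(n=20, effect=0.5):
        # One two-group experiment; crude test: mean difference > 2 standard errors.
        control = [random.gauss(0, 1) for _ in range(n)]
        treated = [random.gauss(effect, 1) for _ in range(n)]
        se = (statistics.pvariance(control) / n + statistics.pvariance(treated) / n) ** 0.5
        return statistics.mean(treated) - statistics.mean(control) > 2 * se

    trials = 10_000
    power = sum(significant() for _ in range(trials)) / trials
    print(f"replication rate ~ {power:.0%}")  # roughly a third, far below most intuitions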

While writing the article that reported these findings Amos and I discovered that we enjoyed working together. Amos was always very funny and in his presence I became funny as well, so we could spend hours of solid work in continuous amusement. The pleasure we found in working together made us exceptionally patient; it is much easier to strive for perfection when you are never bored. Perhaps most important, we checked our critical weapons at the door. Both Amos and I were critical and argumentative, he even more than I, but during the years of our collaboration neither of us ever rejected out of hand anything the other had said. Indeed, one of the great joys I found in the collaboration was that Amos frequently saw the point of my vague ideas much more clearly than I did. Amos was the more logical thinker, with an orientation to theory and an unfailing sense of direction. I was more intuitive, and rooted in the psychology of perception, from which we borrowed many ideas. We were sufficiently similar to understand one another easily, and sufficiently different to surprise each other. We developed a routine in which we spent much of our working days together, often on long walks. For the next fourteen years our collaboration was the focus of our lives, and the work we did together during those years was the best either of us ever did.

We quickly adopted a practice that we maintained for many years. Our research was a conversation, in which we invented questions and jointly examined our intuitive answers. Each question was a small experiment, and we carried out many experiments in a single day. We were not seriously looking for the correct answer to the statistical questions we posed. Our aim was to identify and analyze the intuitive answer, the first one that came to mind, the one we were tempted to make even when we knew it to be wrong. We believed—correctly as it happened—that any intuition that the two of us shared would be shared by other people as well, and that it would be easy to demonstrate its effects on judgments.

We once discovered with great delight that we had identical silly ideas about the future professions of several toddlers we both knew: we could identify the argumentative three-year-old lawyer, the nerdy professor, the empathetic and mildly intrusive psychotherapist. Of course these predictions were absurd, but we still found them appealing. It was also clear that our intuitions were governed by the resemblance of each child to the cultural stereotype of a profession. The amusing exercise helped us develop a theory that was emerging in our minds at the time, about the role of resemblance in predictions. We went on to test and elaborate that theory in dozens of experiments, as in the following example.

As you consider the next question, please assume that Steve was selected at random from a representative sample:

An individual has been described by a neighbor as follows: “Steve is very shy and withdrawn, invariably helpful but with little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.” Is Steve more likely to be a librarian or a farmer?

The resemblance of Steve’s personality to a stereotypical librarian strikes everyone immediately, but equally relevant statistical considerations are almost always ignored. Did it occur to you that there are more than twenty male farmers for each male librarian in the United States? Because there are so many more farmers, it is almost certain that more “meek and tidy souls” will be found on tractors than at library information counters. However, we found that participants in our experiments ignored the relevant statistical facts and relied exclusively on resemblance. We proposed that they used resemblance as a simplifying heuristic (roughly, a rule of thumb) to make a difficult judgment. The reliance on the heuristic caused predictable biases (systematic errors) in their predictions.
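
The arithmetic behind this paragraph is Bayes' rule. In the sketch below, which is mine rather than the book's, the 20-to-1 base rate comes from the text; the two likelihoods, describing how well the description fits each profession, are invented for illustration.

    # Base rates from the text: roughly 20 male farmers per male librarian.
    p_librarian = 1 / 21
    p_farmer = 20 / 21

    # Assumed likelihoods of the "meek and tidy" description (illustrative only).
    p_desc_given_librarian = 0.90
    p_desc_given_farmer = 0.10

    # Bayes' rule: posterior probability that Steve is a librarian.
    posterior = (p_desc_given_librarian * p_librarian) / (
        p_desc_given_librarian * p_librarian + p_desc_given_farmer * p_farmer
    )
    print(f"P(librarian | description) = {posterior:.2f}")
    # About 0.31: even a strongly librarian-like description cannot overcome the base rate.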

On another occasion, Amos and I wondered about the rate of divorces among professors in our university. We noticed that the question triggered a search of memory for cases of divorced professors we knew or knew about, and that we judged the size of categories by the ease with which instances came to mind. We called this reliance on the ease of memory search the availability heuristic. In one of our studies we asked participants to answer a simple question about words in a typical English text:

 

Consider the letter K.

Is K more likely to appear as the first letter in a word OR as the third letter in a word?

As any Scrabble player knows, it is much easier to come up with words that begin with a particular letter than to find words that have the same letter in the third position. This statement is true for every letter of the alphabet. We therefore expected respondents to exaggerate the frequency of letters appearing in the first position—even those letters (such as K, L, N, R, V) which in fact occur more frequently in the third position. Here again, the reliance on a heuristic produces a predictable bias in judgments. For example, I recently came to doubt my long-held impression that adultery is more common among politicians than among physicians or lawyers. I had even come up with explanations for that “fact,” including the aphrodisiac effect of power and the temptations of life away from home. I eventually realized that the transgressions of politicians are much more likely to be reported than the transgressions of lawyers and doctors. My intuitive impression could be due entirely to journalists’ choices of topics and to my reliance on the availability heuristic.
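
The claim about letter positions is easy to check empirically. A throwaway sketch, not part of the original text: count first- versus third-position letters in any English sample; the short quotation below is a placeholder you would swap for a real corpus.

    from collections import Counter

    text = ("No man is an island entire of itself; every man "
            "is a piece of the continent, a part of the main.")

    words = [w.strip(".,;:!?\"'").lower() for w in text.split()]
    first = Counter(w[0] for w in words if len(w) >= 3)
    third = Counter(w[2] for w in words if len(w) >= 3)

    for letter in "klnrv":
        print(letter.upper(), "first:", first[letter], "third:", third[letter])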

Amos and I spent several years studying and documenting biases of intuitive thinking in various tasks—assigning probabilities to events, forecasting the future, assessing hypotheses, and estimating frequencies. In the fifth year of our collaboration we presented our main findings in Science magazine, a publication read by scholars in many disciplines. The article (which is reproduced in full at the end of this book) was titled “Judgment under Uncertainty: Heuristics and Biases.” It described the simplifying shortcuts of intuitive thinking, and explained some 20 biases as manifestations of these heuristics—and also as demonstrations of the role of heuristics in judgment.

Historians of science have often noted that at any given time scholars in a field of research tend to share basic assumptions about their subject. Social scientists are no exception; they rely on a view of human nature that provides the background of most discussions of specific behaviors, but is rarely questioned. Social scientists in the 1970s broadly accepted two ideas about human nature. First, people are generally rational, and their thinking is normally sound. Second, emotions such as fear, affection, and hatred explain most of the occasions on which people depart from rationality. Our article challenged both assumptions without discussing them directly. We documented systematic errors in the thinking of normal people, and we traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion.

Our article attracted much more attention than we had expected, and it remains one of the most highly cited works in social science (more than three hundred scholarly articles referred to it in 2010). Scholars in other disciplines found it useful, and the ideas of heuristics and biases have been used productively in many fields, including medical diagnosis, legal judgment, intelligence analysis, philosophy, finance, statistics, and military strategy.

For example, students of policy have noted that the availability heuristic helps explain why some issues are highly salient in the public’s mind while others are neglected. People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media. Frequently mentioned topics populate the mind even as others slip away from awareness. In turn, what the media choose to report corresponds to their view of what is currently on the public’s mind. It is no accident that authoritarian regimes exert substantial pressure on independent media. Because public interest is most easily aroused by dramatic events and by celebrities, media feeding frenzies are common. For several weeks after Michael Jackson’s death, for example, it was virtually impossible to find a television channel broadcasting on another topic. In contrast, there is little coverage of critical but unexciting issues that provide less drama, such as declining educational standards or overinvestment of medical resources in the last year of life. (As I write this, I notice that my choice of “little-covered” examples was guided by availability. The topics I chose as examples are mentioned quite often; equally important issues that are less available did not come to my mind.)

We did not fully realize it at the time, but a key reason for the broad appeal of “heuristics and biases” outside psychology was an incidental feature of our work: we almost always included in our articles the full text of the questions we had asked ourselves and our respondents. These questions served as demonstrations for the reader, allowing him to recognize how his own thinking was tripped up by cognitive biases. I hope you had such an experience as you read the question about Steve the librarian, which was intended to help you appreciate the power of resemblance as a cue to probability, and to see how easy it is to ignore relevant statistical facts.

The use of demonstrations provided scholars from diverse disciplines—notably philosophers and economists—an unusual opportunity to observe possible flaws in their own thinking. Having seen themselves fail, they became more likely to question the dogmatic assumption, prevalent at the time, that the human mind is rational and logical. The choice of method was crucial: if we had only reported results of conventional experiments, the article would have been less noteworthy and less memorable. Furthermore, skeptical readers would have distanced themselves from the results by attributing judgment errors to the familiar fecklessness of undergraduates, the typical participants in psychological studies. Of course, we did not choose demonstrations over standard experiments because we wanted to influence philosophers and economists. We preferred demonstrations because they were more fun, and were lucky in our choice of method as well as in many other ways. A recurrent theme of this book is that luck plays a large role in every story of success; it is almost always easy to identify a small change in the story that would have turned a remarkable achievement into a mediocre outcome. Our story was no exception.

The reaction to our work was not uniformly positive. In particular, our focus on biases was criticized by numerous authors as suggesting an unfairly negative view of the mind. As expected in normal science, some investigators refined our ideas and others offered plausible alternatives. By and large, though, the idea that our minds are susceptible to systematic errors is now generally accepted. Our research on judgment had far more effect on social science than we thought possible when we were working on it.

Immediately after completing our review of judgment we switched our attention to decision making under uncertainty. Our goal was to develop a psychological theory of how people make decisions about simple gambles. For example: Would you accept a bet on the toss of a coin where you win $130 if the coin shows heads and lose $100 if it shows tails? These elementary choices have long been used to examine broad questions about decision making, such as the relative weight that people assign to sure things and to uncertain outcomes. Our method did not change: we spent many days making up choice problems, and examining whether our intuitive preferences conformed to the logic of choice. Here again, as in judgment, we observed systematic biases in our decisions, intuitive preferences that consistently violated the rules of rational choice. Five years after the Science article, we published “Prospect Theory: An Analysis of Decision Under Risk,” a theory of choice that is by some counts even more influential than our work on judgment, and is one of the foundations of behavioral economics.
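
The coin-toss example carries a small calculation worth making explicit. The stakes come from the text; the loss-aversion coefficient of about 2 is the rough estimate commonly associated with prospect theory, used here as an illustrative assumption rather than a figure from this passage.

    win, loss = 130, -100

    # A risk-neutral expected-value calculation says: accept.
    expected_value = 0.5 * win + 0.5 * loss
    print(expected_value)        # +15

    # Weighting losses about twice as heavily, in the spirit of prospect theory: refuse.
    loss_aversion = 2.0
    subjective_value = 0.5 * win + 0.5 * loss_aversion * loss
    print(subjective_value)      # -35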

Until geographical separation made it too difficult to go on, Amos and I enjoyed the extraordinary good fortune of a shared mind that was superior to our individual minds and of a relationship that made our work fun as well as productive. Our collaboration on judgment and decision making was the reason for the Nobel Prize that I received in 2002, which Amos would have shared had he not died, aged fifty-nine, in 1996.

Where We Are Now

This book is not intended as an exposition of the early research that Amos and I conducted together, a task that has been ably carried out by many authors over the years. My main aim here is to present a view of how the mind works that draws on recent developments in cognitive and social psychology. One of the more important of these developments is that we now understand the marvels as well as the flaws of intuitive thought.

Amos and I did not address accurate intuitions, beyond the casual statement that judgment heuristics “are quite useful, but sometimes lead to severe and systematic errors.” We focused on biases, both because we found them interesting in their own right and because they provided evidence for the heuristics of judgment. We did not ask ourselves whether all intuitive judgments under uncertainty are produced by the heuristics we studied; it is now clear that they are not. In particular, the accurate intuitions of experts are better explained by the effects of prolonged practice than by heuristics. We can now draw a richer and more balanced picture, in which skill and heuristics are alternative sources of intuitive judgments and choices.

The psychologist Gary Klein tells the story of a team of firefighters that entered a house in which the kitchen was on fire. Soon after the men started hosing down the kitchen, the commander heard himself shout, “Let’s get out of here!” without realizing why. The floor collapsed almost immediately after the firefighters escaped. Only after the fact did the commander realize that the fire had been unusually quiet and that his ears had been unusually hot. Together, these impressions prompted what he called a “sixth sense of danger.” He had no idea what was wrong, but he knew something was wrong. It turned out that the heart of the fire had not been in the kitchen, but in the basement beneath where the men had stood.

We have all heard such stories of expert intuition: the chess master who walks past a street game and announces “White mates in three” without stopping, or the physician who makes a complex diagnosis after a single glance at a patient. Expert intuition strikes us as magical, but it is not. Indeed, each of us performs feats of intuitive expertise many times each day. Most of us are pitch-perfect in detecting anger in the first word of a telephone call, recognize as we enter a room that we were the subject of the conversation, and quickly react to subtle signs that the driver of the car in the next lane is dangerous. Our everyday intuitive abilities are no less marvelous than the striking insights of an experienced firefighter or physician—only more common.  

The psychology of accurate intuition involves no magic. Perhaps the best short statement of it is due to the great Herbert Simon, who studied chess masters and showed that after thousands of hours of practice they come to see the pieces on the board differently from the rest of us. You can feel Simon’s impatience with the mythologizing of expert intuition when he writes: “The situation has provided a cue: this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”

We are not surprised when a two-year-old looks at an animal and correctly says “doggie!” because we are used to the miracle of children learning to recognize and name things. Simon’s point is that the miracles of expert intuition have the same character. Valid intuitions develop when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it. Good intuitive judgments come to mind with the same immediacy as “doggie!”

Unfortunately, professionals’ intuitions do not all arise from true expertise. Many years ago I visited the chief investment officer of a large financial firm, who told me that he had just invested some tens of millions of dollars in the stock of Ford Motor Company. When asked how he had made that decision, he replied that he had recently attended an automobile show and had been impressed. “Boy, do they know how to make a car!” was his explanation. He made it very clear that he trusted his gut feeling and was satisfied with himself and with his decision. I found it remarkable that he had apparently not considered the one question that an economist would call relevant: Is Ford stock currently underpriced? Instead, he had listened to his intuition; he liked the cars, he liked the company, and he liked the idea of owning its stock. From what we know about the accuracy of stock picking, it is reasonable to believe that he did not know what he was doing.

The specific heuristics that Amos and I studied provide little help in understanding how the executive came to invest in Ford stock, but a broader conception of heuristics now exists, which offers a good account. An important advance is that emotion now looms much larger in our understanding of intuitive judgments and choices than it did in the past. The executive’s decision would today be described as an example of the affect heuristic, where judgments and decisions are guided directly by feelings of liking and disliking, with little deliberation or reasoning.

When confronted with a problem—choosing a chess move or deciding whether to invest in a stock—the machinery of intuitive thought does the best it can. If the individual has relevant expertise, she will recognize the situation, and the intuitive solution that comes to her mind is likely to be correct. This is what happens when a chess master looks at a complex position: the few moves that immediately occur to him are all strong. When the question is difficult and a skilled solution is not available, intuition still has a shot: an answer may come to mind quickly—but it is not an answer to the original question. The question that the executive faced (should I invest in Ford stock?) was difficult, but the answer to an easier and related question (do I like Ford cars?) came readily to his mind and determined his choice. This is the essence of intuitive heuristics: when faced with a difficult question we quite often answer an easier one instead, usually without noticing the substitution.

The spontaneous search for an intuitive solution sometimes fails—neither an expert solution nor a heuristic answer comes to mind. In such cases we often find ourselves switching to a slower, more deliberate and effortful form of thinking. This is the slow thinking of the title. Fast thinking includes both variants of intuitive thought—the expert and the heuristic—as well as the entirely automatic mental activities of perception and memory, the operations that enable you to know there is a lamp on your desk, or retrieve the name of the capital of Russia.

The distinction between fast and slow thinking has been explored by many psychologists over the last twenty-five years. For reasons that I explain more fully in the next chapter, I will describe mental life by the metaphor of two agents, called System 1 and System 2, which respectively produce fast and slow thinking. I speak of the features of intuitive and deliberate thought as if they were traits and dispositions of two characters in your mind. In the picture that emerges from recent research, the intuitive System 1 is more influential than your experience tells you, and it is the secret author of many of the choices and judgments that you make. Most of this book is about the workings of System 1 and the mutual influences between it and System 2.

What Comes Next

The book is divided into five parts. Part 1 presents the basic elements of a two-systems approach to judgment and choice. It elaborates the distinction between the automatic operations of System 1 and the controlled operations of System 2 and shows how associative memory, the core of System 1, continually constructs a coherent interpretation of what is going on in our world at any instant. I attempt to give a sense of the complexity and richness of the automatic and often unconscious processes that underlie intuitive thinking, and of how these automatic processes explain the heuristics of judgment. A goal is to introduce a language for thinking and talking about the mind.

Part 2 updates the study of judgment heuristics and explores a major puzzle: Why is it so difficult for us to think statistically? We easily think associatively, we think metaphorically, we think causally, but statistics requires thinking about many things at once, which is something that System 1 is not designed to do.

The difficulties of statistical thinking contribute to the main theme of Part 3, which describes a puzzling limitation of our mind: our excessive confidence in what we believe we know, and our apparent inability to acknowledge the full extent of our ignorance and the uncertainty of the world we live in. We are prone to overestimate how much we understand about the world and to underestimate the role of chance in events. Overconfidence is fed by the illusory certainty of hindsight. My views on this topic have been influenced by Nassim Taleb, the author of The Black Swan. I hope for water-cooler conversations that intelligently explore the lessons that can be learned from the past, while resisting the lure of hindsight and the illusion of certainty.

The focus of Part 4 is a conversation with the discipline of economics on the nature of decision making and on the realism of the assumption that economic agents are rational. This section of the book provides a current view, informed by the two-system model, of the key concepts of prospect theory, the model of choice that Amos and I published in 1979. Subsequent chapters address several ways human choices deviate from the rules of rationality. I deal with the unfortunate tendency to treat decision problems in isolation, and with framing effects, where decisions are shaped by inconsequential features of choice problems. These observations, which are readily explained by the features of System 1, present a deep challenge to the rationality assumption favored in standard economics.

Part 5 describes recent research that has introduced a distinction between two selves, the experiencing self and the remembering self, which do not have the same interests. For example, we can expose people to two painful experiences. One of these experiences is strictly worse than the other, because it is longer. But the automatic formation of memories—a feature of System 1—has its rules, which we can exploit so that the worse episode leaves a better memory. When people later choose which episode to repeat, they are, naturally, guided by their remembering self and expose themselves (their experiencing self) to unnecessary pain. The distinction between two selves is applied to the measurement of well-being, where we find again that what makes the experiencing self happy is not quite the same as what satisfies the remembering self. How two selves within a single body can pursue happiness raises some difficult questions, both for individuals and for societies that view the well-being of the population as a policy objective.
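
The logic of the two painful episodes can be made concrete with invented numbers. The peak-end scoring below reflects the rule Kahneman's research suggests governs remembered pain, the average of the worst and the final moments; the per-minute pain readings are made up for illustration.

    short_episode = [6, 7, 8]        # ends at its worst moment
    long_episode = [6, 7, 8, 5, 3]   # same pain plus extra, milder minutes

    def total_pain(episode):         # what the experiencing self endures
        return sum(episode)

    def remembered_pain(episode):    # peak-end rule: what the remembering self keeps
        return (max(episode) + episode[-1]) / 2

    for name, ep in (("short", short_episode), ("long", long_episode)):
        print(name, "total:", total_pain(ep), "remembered:", remembered_pain(ep))
    # The long episode is strictly worse in total (29 vs. 21) yet leaves a
    # better memory (5.5 vs. 8.0), so people choose to repeat it.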

A concluding chapter explores, in reverse order, the implications of three distinctions drawn in the book: between the experiencing and remembering selves, between the conception of agent in classical economics and in behavioral economics (which borrows from psychology), and between the automatic System 1 and the effortful System 2. I return to the virtues of educating gossip and to what organizations might do to improve the quality of judgments and decisions that are made on their behalf.

Two articles I wrote with Amos are reproduced as appendixes to the book. The first is the review of judgment under uncertainty that I described earlier. The second, published in 1984, summarizes prospect theory as well as our studies of framing effects. The articles present the contributions that were cited by the Nobel committee—and you may be surprised by how simple they are. Reading them will give you a sense of how much we knew a long time ago, and also of how much we have learned in recent decades.

Table of Contents

Introduction 3

Part I Two Systems

1 The Characters of the Story 19

2 Attention and Effort 31

3 The Lazy Controller 39

4 The Associative Machine 50

5 Cognitive Ease 59

6 Norms, Surprises, and Causes 71

7 A Machine for Jumping to Conclusions 79

8 How Judgments Happen 89

9 Answering an Easier Question 97

Part II Heuristics and Biases

10 The Law of Small Numbers 109

11 Anchors 119

12 The Science of Availability 129

13 Availability, Emotion, and Risk 137

14 Tom W's Specialty 146

15 Linda: Less Is More 156

16 Causes Trump Statistics 166

17 Regression to the Mean 175

18 Taming Intuitive Predictions 185

Part III Overconfidence

19 The Illusion of Understanding 199

20 The Illusion of Validity 209

21 Intuitions vs. Formulas 222

22 Expert Intuition: When Can We Trust It? 234

23 The Outside View 245

24 The Engine of Capitalism 255

Part IV Choices

25 Bernoulli's Errors 269

26 Prospect Theory 278

27 The Endowment Effect 289

28 Bad Events 300

29 The Fourfold Pattern 310

30 Rare Events 320

31 Risk Policies 334

32 Keeping Score 342

33 Reversals 353

34 Frames and Reality 363

Part V Two Selves

35 Two Selves 377

36 Life as a Story 386

37 Experienced Well-Being 391

38 Thinking about Life 398

Conclusions 408

Appendix A Judgment Under Uncertainty 419

Appendix B Choices, Values, and Frames 433

Notes 449

Acknowledgments 483

Index 485

Reading Group Guide

* A major New York Times bestseller

* Selected by The New York Times Book Review as one of the ten best books of 2011

* Winner of the National Academy of Sciences Best Book Award in 2012

In his groundbreaking tour of the mind's machinery, Daniel Kahneman, a renowned psychologist and winner of the Nobel Prize in Economics, presents us with two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical. The impact of loss aversion and overconfidence on corporate strategies, the difficulties of predicting what will make us happy in the future, the profound effect of cognitive biases on everything from playing the stock market to planning our next vacation—each of these can be understood only by knowing how the two systems shape our judgments and decisions.
Kahneman's ideas have had a profound and widely recognized impact on many fields, including economics, medicine, and politics. Engaging the reader in a lively conversation about how we think, Kahneman reveals where we can and cannot trust our intuitions and how we can tap into the benefits of slow thinking. Rich with practical and enlightening insights into how choices are made in both business and our personal lives, Thinking, Fast and Slow will transform the way you think about thinking.
These questions and discussion topics are designed to enhance your reading of Daniel Kahneman's Thinking, Fast and Slow. We hope they will enliven your experience as you explore this rich and fascinating account of how we think.

1. At the opening of Thinking, Fast and Slow, Kahneman discusses the "proverbial office water-cooler" as the ideal setting in which readers could use knowledge gained from his book. Why do you think Kahneman makes a point of mentioning gossip so early on? How do you see his claims about the connection between gossip and better decision making playing out in the book and in your own lives?

2. Everyone has his or her favorite "cognitive biases" described in the book. Discuss what you felt were the most surprising, resonant, suggestive, or memorable of the biases, fallacies, and illusions that Kahneman explores. What captured your imagination about these particular ones?

3. On the flip side, what did you think were the least persuasive parts of Kahneman's arguments? What conclusions did you doubt or disbelieve? Were there experiments whose results you questioned?

4. Even though Kahneman discusses the "optimistic bias," he also says, "If you were allowed one wish for your child, seriously consider wishing him or her optimism." Using this as a jumping-off point, discuss what is useful about our cognitive biases. What are the "upsides" to the irrational thinking we are all sometimes guilty of? Would we be happier if we were free of all our biases?

5. The Atlantic has called Kahneman the anti–Malcolm Gladwell. Do you agree? How does Thinking, Fast and Slow compare in argument and approach to other books you may have read about human rationality and behavior?

6. Kahneman writes: "I have made much more progress in recognizing the errors of others than my own." Based on your reading of Thinking, Fast and Slow, what practices, behaviors, or activities do you think we could cultivate to strengthen System 2's effortful thinking over System 1's automatic responses? How optimistic are you about the results over time? Which of the biases do you think would be the most difficult to uproot?

7. "Self-help" books are traditionally thought to empower us to take greater control over our own lives, but Thinking, Fast and Slow also calls into question the limits of our rationality. Do you think Thinking, Fast and Slow works as a piece of self-help literature? After reading it, do you feel more or less "in control"?

8. As Kahneman explains, systematic biases play a substantial role not only in our own lives, but in the functioning of groups, businesses, and even societies. Discuss examples from politics, culture, or current affairs that you feel demonstrate certain biases.

9. Kahneman's arguments have been applied widely throughout a variety of industries and disciplines, and many of the cognitive biases (including framing, availability, anchors, and the planning fallacy) have serious implications for professional practice. Can you observe biases—and the exploitation of those biases—in your professional environment? Would you take advantage of the cognitive biases Kahneman describes in your professional life?

10. Kahneman devotes all of Part 3 to examining "overconfidence." Why does this subject deserve its own section? In what ways is it a concern throughout the book?

11. In Part 4, Kahneman discusses the ways we evaluate losses and gains, and concludes that human beings tend to be loss averse. Discuss the examples of these behaviors that he explores, and consider how much your own thinking about risk and reward has changed as a result. Would you make bets now that you wouldn't have made before, following his advice to "think like a trader"? Does knowing about the "possibility effect," for instance, change your attitude toward playing the lottery or mitigate fears about certain dangerous but rare occurrences like accidents, terrorism, and disease?

12. Kahneman writes in the introduction that "[a] recurrent theme of this book is that luck plays a large role in every story of success." Discuss the role of luck in Thinking, Fast and Slow, considering especially how it relates to Kahneman's treatment of the world of business and finance. Do you think Kahneman's strong emphasis on luck in stories of success and good fortune is justified?

13. Kahneman argues that leaving out prospect theory from most introductory economics textbooks is justified since "[t]he basic concepts of economics are essential intellectual tools, which are not easy to grasp even with simplified and unrealistic assumptions about the nature of the economic agents who interact in markets. Raising questions about these assumptions even as they are introduced would be confusing, and perhaps demoralizing." When, if ever, is the appropriate time to teach students about Systems 1 and 2? Do you think students should learn about the flaws in the rational agent model early on in their educations?

14. On the last page of his book, Kahneman writes that the "remarkable absence of systematic training for the essential skill of conducting efficient meetings" is one way that decision making could be improved in an organization. Using Kahneman's research, discuss other ways that you think efficiency and effectiveness might be improved at organizations you deal with regularly.

15. Discuss Thinking, Fast and Slow as an intellectual memoir. What do you think made Kahneman's collaborations—especially with Amos Tversky—successful? What working habits or professional practices of his struck you?

16. Kahneman describes his working relationship with Gary Klein and calls it his "most satisfying and productive adversarial collaboration." Discuss the benefits and risks of "adversarial collaboration" using examples from your own life and work.

17. Discuss the political dimension of this book, considering that Kahneman argues that humans often make choices that go against their self-interest. What implications do you think these claims and, later, Kahneman's arguments about happiness and well-being have for policy and governance? Should human beings be "nudged" to make the "right" choices?

18. Consider the ethical dilemma of the colonoscopy experiments: Should doctors focus on limiting real-time pain or the memories of pain? In other words, how should we weigh the importance of our remembering and experiencing selves? Also, discuss Kahneman's example of the painful operation that you will forget about later. Do you share his belief that "I feel pity for my suffering self but not more than I would feel for a stranger in pain"?

19. Kahneman writes, "In normal circumstances . . . we draw pleasure and pain from what is happening at the moment, if we attend to it," and notes that French and American women spend about the same amount of time eating, but French women seem to attend to it more keenly and therefore enjoy it more. How does the issue of attention relate to Kahneman's arguments about Systems 1 and 2, as well as his later arguments about happiness and well-being? Discuss the implications of these arguments both for the individual and for society.

20. Philosophers have long debated what it means to lead a "good life." In what ways has reading Thinking, Fast and Slow affected your understanding of what this might mean? In your view, how important is happiness (however we understand that term) to a "good life"? How important is behaving rationally? Are they related?

21. What questions did Thinking, Fast and Slow leave you with? If you were a scientist working in this field, what areas would you want to study further?
