Dealing with irrational decisions in litigation is frustrating. We’ve all been there before: A judge refuses to modify a scheduling order; a jury awards excessive damages to a sympathetic plaintiff; or a client refuses to settle a hopeless case. People do not always act rationally. Nor do they always make optimal decisions. We are all influenced by subconscious biases and intuitions that mold our answers to even the most important questions. Recognizing situations that prey on these biases and intuitions is the first step in predicting—and minimizing the effect of—“irrational” behavior.
Thinking, Fast and Slow
While traveling for work (and while taking a break from work while traveling), I stumbled upon Daniel Kahneman’s book Thinking, Fast and Slow. Kahneman is a Nobel Prize-winning psychologist and pioneer in the field of behavioral economics. At its core, his research involves the study of decision-making. How do people make decisions? What factors influence our decisions? As explained by Kahneman, “making decisions is like speaking prose—people do it all the time, knowingly or unknowingly.” Thinking, Fast and Slow is a compilation of Kahneman’s research colored with anecdotes, studies, and practical analysis. It is an enlightening guide to how and why people make seemingly irrational decisions, and its concepts are directly applicable to issues raised daily in litigation.
The Two Systems
Kahneman separates decision-making into two systems: System 1 and System 2. “System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control.” It is the unconscious process that we use for everyday decisions, including interpreting facial expressions, calculating simple math equations, and detecting hostility in a voice. System 2 involves more complex and deliberate processes, such as focusing on the voice of a particular person in a crowded room, monitoring the appropriateness of your behavior in a social situation, and checking the validity of a legal argument. “In all these situations you must pay attention, and you will perform less well, or not at all, if you are not ready or if your attention is directed inappropriately.”
On occasion, System 1 makes mistakes because it rapidly makes decisions using mental shortcuts such as associations, familiarity, or substitutions (i.e., “heuristics”). For example, if you are asked whether you think a company is a good investment, you may automatically answer the question based on whether you like or dislike its products, not based on the stock price. System 1 automatically answers a harder question (Is the company a good investment?) with the answer to an easier question (Do I like the company’s products?).
System 1 impacts the life of a litigator on a daily basis. You may find yourself automatically associating an old case with a new case. You may also find yourself substituting your decision-making from the old case to the new case. Judges may be susceptible to this as well when dealing with familiar issues. Making these associations is not wrong, but it is important to recognize that each case is unique. This is where System 2 comes into play.
One of the most important functions of System 2 is to confirm or correct the “automatic” decisions made by System 1. We must activate System 2 to distinguish old cases from new cases and correct unwarranted substitutions. The activation of System 2, however, requires our attention and mental effort. When we are fatigued, distracted, or just lazy, we default to a System 1 response. This process may lead to mistakes or irrational behavior.
The Hungry Judge
The applicability of the two systems to a litigator’s work was apparent in one of the first anecdotes in the book. Kahneman described a research study conducted on eight parole judges in Israel. The judges spent entire days reviewing applications for parole, and the cases were presented in random order. On average, a judge spent six minutes on a case before reaching a parole decision. The “default” decision was to deny parole; only 35 percent of parole requests were granted.
The troubling part of the study is the timing of the decisions to grant parole. The judges were given three food breaks throughout the day. Immediately after each break, approximately 65 percent of parole requests were granted. Over the roughly two hours that followed, the rate of parole grants steadily dropped until almost no requests were granted in the stretch leading up to the next meal break. In sum, whether parole was granted depended heavily on the time of day the petition was reviewed, not on the merits of the case.
Kahneman explains: “The best possible account of the data provides bad news: tired and hungry judges tend to fall back on the easier default position of denying requests for parole. Both fatigue and hunger probably play a role.” In other words, when System 2 is too fatigued to check the decision made by System 1, decision-making can be irrational. The practical application of this study is obvious. We cannot ensure that a judge will review our motion right after lunch—that seemingly trivial request to modify the scheduling order could have just been a System 1 “default denial.” We can, however, control the schedule of important calls or client meetings. We can also take breaks during depositions to ensure that we (or our witnesses) are firing on all cylinders. Recognizing the role that fatigue—or “ego depletion,” as Kahneman calls it—plays in our daily lives can help us understand, and potentially cure, irrational decision-making.
The Anchoring Effect
Another concept with direct legal application is the anchoring effect. “It occurs when people consider a particular value for an unknown quantity before estimating the quantity.” The anchor provides System 1 with a reference point that influences a person’s estimate. The listing price of a house is a prime example: it sets the perception of the value of the home. Without the listing price, a seller would likely receive many offers well below his or her expectations.
Kahneman explained the powerful effect of anchoring in the context of marketing. A sales promotion was run on canned soup at a supermarket in Sioux City, Iowa. Shoppers purchased an average of seven cans of soup when the sign read “Limit 12 Per Person,” but only an average of three-and-a-half cans when the sign read “No Limit Per Person.” The anchor of a 12-can limit manipulated System 1, leading shoppers, without their awareness, to buy twice as much soup as when no limit was imposed.
Litigators experience the anchoring effect in a variety of settings, most obviously in settlement negotiations. You may also unwittingly anchor the value (or chance of success) of a case during an initial meeting with a client. And no matter what case developments occur after that meeting, your client’s satisfaction with the outcome will largely depend on your ability to meet (or exceed) that anchor.
Kahneman posited an interesting public policy issue related to anchoring: the effect of damages caps in personal injury cases. A damages cap would certainly eliminate large awards. But as illustrated by the soup example, the cap (or anchor) may dramatically increase awards in cases that might never have approached the cap. A practical way around this issue, and one adopted by many jurisdictions, is to avoid instructing the jury that a damages cap exists. In light of the anchoring effect, this approach seems warranted.
The Framing Effect
Another useful litigation concept detailed by Kahneman is framing. Every argument we make, whether in a written motion or orally at trial, presents an opportunity to frame the issue. Logically equivalent statements, however, can evoke different “automatic” reactions (or decisions) because System 1 is highly susceptible to the way questions and issues are framed.
Kahneman described a study that Amos Tversky carried out with colleagues at Harvard Medical School. Physicians were given statistics about the outcomes of two treatment options for lung cancer: surgery and radiation. Half of the participants were given the surgery statistic framed in terms of the “survival” rate: “The one-month survival rate is 90 percent.” The other half were given the same statistic framed in terms of the “mortality” rate: “There is 10 percent mortality in the first month.” The two statements are logically equivalent; however, surgery was chosen by 84 percent of physicians when the statistic was framed as the survival rate, while only 50 percent of physicians chose surgery when the statistic was framed as the mortality rate.
According to Kahneman, the reason is that System 1 is highly susceptible to emotional frames: “mortality is bad, survival is good, and 90 percent survival sounds encouraging whereas 10 percent mortality is frightening.” In a litigation context, the study demonstrates that a witness’ answer can be heavily influenced by the way a question is framed. Emotionally charged questions can elicit irrational responses. We must be cognizant of this fact, not only when we are asking questions of a witness, but also—perhaps even more importantly—when our witness is being questioned by an opposing attorney.
Vivid probabilities are another example of framing. The statement that a vaccine carries a “.01 percent risk of permanent disability” generates a different emotional response than the statement that “one of every 10,000 children vaccinated will be permanently disabled.” While the statements express the same proportion, a jury might get lost in the decimal points of the first statement, but it is hard to dismiss the visceral image of a disabled child in the second.
Other Practical Concepts
Kahneman provides a number of other concepts that explain irrational behavior and that are applicable to the legal profession. The “fourfold pattern” of preferences describes decision-making in the context of risk. It explains why we settle cases, buy insurance, buy lottery tickets, and take ill-advised gambles. The “halo effect” explains why it is important to make good first impressions on judges, jurors, and clients. And “what you see is all there is” explains why themes and the presentation of evidence may matter more than the quality of the evidence presented.
We cannot always prevent irrational decisions in litigation. But by activating our System 2, and stopping to think about how and why certain decisions are made, we can limit their effect.