The New Yorker
WHAT WAS I THINKING?
The latest reasoning about our irrational ways.
by Elizabeth Kolbert; FEBRUARY 25, 2008
People make bad decisions, but they make them in systematic ways.
A couple of months ago, I went on-line to order a book. The book had a list price of twenty-four dollars; Amazon was offering it for eighteen. I clicked to add it to my “shopping cart” and a message popped up on the screen. “Wait!” it admonished me. “Add $7.00 to your order to qualify for FREE Super Saver Shipping!” I was ordering the book for work; still, I hesitated. I thought about whether there were other books that I might need, or want. I couldn’t think of any, so I got up from my desk, went into the living room, and asked my nine-year-old twins. They wanted a Tintin book. Since they already own a large stack of Tintins, it was hard to find one that they didn’t have. They scrolled through the possibilities. After much discussion, they picked a three-in-one volume containing two adventures they had previously read. I clicked it into the shopping cart and checked out. By the time I was done, I had saved The New Yorker $3.99 in shipping charges. Meanwhile, I had cost myself $12.91.
Why do people do things like this? From the perspective of neoclassical economics, self-punishing decisions are difficult to explain. Rational calculators are supposed to consider their options, then pick the one that maximizes the benefit to them. Yet actual economic life, as opposed to the theoretical version, is full of miscalculations, from the gallon jar of mayonnaise purchased at spectacular savings to the billions of dollars Americans will spend this year to service their credit-card debt. The real mystery, it could be argued, isn’t why we make so many poor economic choices but why we persist in accepting economic theory.
In “Predictably Irrational: The Hidden Forces That Shape Our Decisions” (Harper; $25.95), Dan Ariely, a professor at M.I.T., offers a taxonomy of financial folly. His approach is empirical rather than historical or theoretical. In pursuit of his research, Ariely has served beer laced with vinegar, left plates full of dollar bills in dorm refrigerators, and asked undergraduates to fill out surveys while masturbating. He claims that his experiments, and others like them, reveal the underlying logic to our illogic. “Our irrational behaviors are neither random nor senseless; they are systematic,” he writes. “We all make the same types of mistakes over and over.” So attached are we to certain kinds of errors, he contends, that we are incapable even of recognizing them as errors. Offered FREE shipping, we take it, even when it costs us.
As an academic discipline, Ariely’s field, behavioral economics, is roughly twenty-five years old. It emerged largely in response to work done in the nineteen-seventies by the Israeli-American psychologists Amos Tversky and Daniel Kahneman. (Ariely, too, grew up in Israel.) When they examined how people deal with uncertainty, Tversky and Kahneman found that there were consistent biases to the responses, and that these biases could be traced to mental shortcuts, or what they called “heuristics.” Some of these heuristics were pretty obvious (people tend to make inferences from their own experiences, so if they’ve recently seen a traffic accident they will overestimate the danger of dying in a car crash), but others were more surprising, even downright wacky. For instance, Tversky and Kahneman asked subjects to estimate what proportion of African nations were members of the United Nations. They discovered that they could influence the subjects’ responses by spinning a wheel of fortune in front of them to generate a random number: when a big number turned up, the estimates suddenly swelled.
Though Tversky and Kahneman’s research had no direct bearing on economics, its implications for the field were disruptive. Can you really regard people as rational calculators if their decisions are influenced by random numbers? (In 2002, Kahneman was awarded a Nobel Prize for having “integrated insights from psychology into economics, thereby laying the foundation for a new field of research”; Tversky had died in 1996.)
Over the years, Tversky and Kahneman’s initial discoveries have been confirmed and extended in dozens of experiments. In one example, Ariely and a colleague asked students at M.I.T.’s Sloan School of Management to write the last two digits of their Social Security number at the top of a piece of paper. They then told the students to record, on the same paper, whether they would be willing to pay that many dollars for a fancy bottle of wine, a not-so-fancy bottle of wine, a book, or a box of chocolates. Finally, the students were told to write down the maximum figure they would be willing to spend for each item. Once they had finished, Ariely asked them whether they thought that their Social Security numbers had had any influence on their bids. The students dismissed this idea, but when Ariely tabulated the results he found that they were kidding themselves. The students whose Social Security number ended with the lowest figures (00 to 19) were the lowest bidders. For all the items combined, they were willing to offer, on average, sixty-seven dollars. The students in the second-lowest group (20 to 39) were somewhat more free-spending, offering, on average, a hundred and two dollars. The pattern continued up to the highest group (80 to 99), whose members were willing to spend an average of a hundred and ninety-eight dollars, or three times as much as those in the lowest group, for the same items.
This effect is called “anchoring,” and, as Ariely points out, it punches a pretty big hole in microeconomics. When you walk into Starbucks, the prices on the board are supposed to have been determined by the supply of, say, Double Chocolaty Frappuccinos, on the one hand, and the demand for them, on the other. But what if the numbers on the board are influencing your sense of what a Double Chocolaty Frappuccino is worth? In that case, price is not being determined by the interplay of supply and demand; price is, in a sense, determining itself.
Another challenge to standard economic thinking arises from what has become known as the “endowment effect.” To probe this effect, Ariely, who earned one of his two Ph.D.s at Duke, exploited the school’s passion for basketball. Blue Devils fans who had just won tickets to a big game through a lottery were asked the minimum amount that they would accept in exchange for them. Fans who had failed to win tickets through the same lottery were asked the maximum amount that they would be willing to offer for them.
“From a rational perspective, both the ticket holders and the non-ticket holders should have thought of the game in exactly the same way,” Ariely observes. Thus, one might have expected that there would be opportunities for some of the lucky and some of the unlucky to strike deals. But whether or not a lottery entrant had been “endowed” with a ticket turned out to powerfully affect his or her sense of its value. One of the winners Ariely contacted, identified only as Joseph, said that he wouldn’t sell his ticket for any price. “Everyone has a price,” Ariely claims to have told him. O.K., Joseph responded, how about three grand? On average, the amount that winners were willing to accept for their tickets was twenty-four hundred dollars. On average, the amount that losers were willing to offer was only a hundred and seventy-five dollars. Out of a hundred fans, Ariely reports, not a single ticket holder would sell for a price that a non-ticket holder would pay.
Whatever else it accomplishes, “Predictably Irrational” demonstrates that behavioral economists are willing to experiment on just about anybody. One of the more compelling studies described in the book involved trick-or-treaters. A few Halloweens ago, Ariely laid in a supply of Hershey’s Kisses and two kinds of Snickers: regular two-ounce bars and one-ounce miniatures. When the first children came to his door, he handed each of them three Kisses, then offered to make a deal. If they wanted to, the kids could trade one Kiss for a mini-Snickers or two Kisses for a full-sized bar. Almost all of them took the deal and, proving their skills as sugar maximizers, opted for the two-Kiss trade. At some point, Ariely shifted the terms: kids could now trade one of their three Kisses for the larger bar or get a mini-Snickers without giving up anything. In terms of sheer chocolatiness, the trade for the larger bar was still by far the better deal. But, faced with the prospect of getting a mini-Snickers for nothing, the trick-or-treaters could no longer reckon properly. Most of them refused the trade, even though it cost them candy. Ariely speculates that behind the kids’ miscalculation was anxiety. As he puts it, “There’s no visible possibility of loss when we choose a FREE! item (it’s free).” Tellingly, when Ariely performed a similar experiment on adults, they made the same mistake. “If I were to distill one main lesson from the research described in this book, it is that we are all pawns in a game whose forces we largely fail to comprehend,” he writes.
A few weeks ago, the Bureau of Economic Analysis released its figures for 2007. They showed that Americans had collectively amassed ten trillion one hundred and eighty-four billion dollars in disposable income and spent very nearly all of it: ten trillion one hundred and thirty-two billion dollars. This rate of spending was somewhat lower than the rate in 2006, when Americans spent all but thirty-nine billion dollars of their total disposable income.
According to standard economic theory, the U.S. savings rate also represents rational choice: Americans, having reviewed their options, have collectively resolved to spend virtually all the money that they have. According to behavioral economists, the low savings rate has a more immediate explanation: it proves, yet again, that people have trouble acting in their own best interests. It’s worth noting that Americans, even as they continue to spend, say that they should be putting more money away; one study of participants in 401(k) plans found that more than two-thirds believed their savings rate to be “too low.”
In the forthcoming “Nudge: Improving Decisions About Health, Wealth, and Happiness” (Yale; $25), Richard H. Thaler and Cass R. Sunstein follow behavioral economics out of the realm of experiment and into the realm of social policy. Thaler and Sunstein both teach at the University of Chicago, Thaler in the graduate school of business and Sunstein at the law school. They share with Ariely the belief that, faced with certain options, people will consistently make the wrong choice. Therefore, they argue, people should be offered options that work with, rather than against, their unreasoning tendencies. These foolish-proof choices they label “nudges.” (A “nudge,” they note with scholarly care, should not be confused with a “noodge.”)
A typical “nudge” is a scheme that Thaler and Sunstein call “Save More Tomorrow.” One of the reasons people have such a hard time putting money away, the authors say, is that they are loss-averse. They are pained by any reduction in their take-home pay-even when it’s going toward their own retirement. Under “Save More Tomorrow,” employees commit to contributing a greater proportion of their paychecks to their retirement over time, but the increases are scheduled to coincide with their annual raises, so their paychecks never shrink. (The “Save More Tomorrow” scheme was developed by Thaler and the U.C.L.A. economist Shlomo Benartzi, back in 1996, and has already been implemented by several thousand retirement plans.)
People aren’t just loss-averse; they are also effort-averse. They hate having to go to the benefits office, pick up a bunch of forms, fill them out, and bring them all the way back. As a consequence, many eligible employees fail to enroll in their companies’ retirement plans, or delay doing so for years. (This is the case, research has shown, even at companies where no employee contribution is required.) Thaler and Sunstein propose putting this sort of inertia to use by inverting the choice that’s presented. Instead of having to make the trip to the benefits office to opt in, employees should have to make that trip only if they want to opt out. The same basic argument holds whenever a so-called default option is provided. For instance, most states in the U.S. require that those who want to become organ donors register their consent; in this way, many potential donors are lost. An alternative (used, for example, in Austria) is to make consent the default option, and put the burden of registering on those who do not wish to be donors. (It has been estimated that if every state in the U.S. simply switched from an “explicit consent” to a “presumed consent” system several thousand lives would be saved each year.)
“Nudges” could also involve disclosure requirements. To discourage credit-card debt, for instance, Thaler and Sunstein recommend that cardholders receive annual statements detailing how much they have already squandered in late fees and interest. To encourage energy conservation, they propose that new cars come with stickers showing how many dollars’ worth of gasoline they are likely to burn through in five years of driving.
Many of the suggestions in “Nudge” seem like good ideas, and even, as with “Save More Tomorrow,” practical ones. The whole project, though, as Thaler and Sunstein acknowledge, raises some pretty awkward questions. If the “nudgee” can’t be depended on to recognize his own best interests, why stop at a nudge? Why not offer a “push,” or perhaps even a “shove”? And if people can’t be trusted to make the right choices for themselves, how can they possibly be trusted to make the right decisions for the rest of us?
Like neoclassical economics, much democratic theory rests on the assumption that people are rational. Here, too, empirical evidence suggests otherwise. Voters, it has been demonstrated, are influenced by factors ranging from how names are placed on a ballot to the jut of a politician’s jaw. A 2004 study of New York City primary-election results put the advantage of being listed first on the ballot for a local office at more than three per cent, enough of a boost to turn many races. (For statewide office, the advantage was around two per cent.) A 2005 study, conducted by psychologists at Princeton, showed that it was possible to predict the results of congressional contests by using photographs. Researchers presented subjects with fleeting images of candidates’ faces. Those candidates who, in the subjects’ opinion, looked more “competent” won about seventy per cent of the time.
When it comes to public-policy decisions, people exhibit curious, but once again predictable, biases. They value a service (say, upgrading fire equipment) more when it is described in isolation than when it is presented as part of a larger good (say, improving disaster preparedness). They are keen on tax “bonuses” but dislike tax “penalties,” even though the two are functionally equivalent. They are more inclined to favor a public policy when it is labelled the status quo. In assessing a policy’s benefits, they tend to ignore whole orders of magnitude. In an experiment demonstrating this last effect, sometimes called “scope insensitivity,” subjects were told that migrating birds were drowning in ponds of oil. They were then asked how much they would pay to prevent the deaths by erecting nets. To save two thousand birds, the subjects were willing to pay, on average, eighty dollars. To save twenty thousand birds, they were willing to pay only seventy-eight dollars, and to save two hundred thousand birds they were willing to pay eighty-eight dollars.
What is to be done with information like this? We can try to become more aware of the patterns governing our blunders, as “Predictably Irrational” urges. Or we can try to prod people toward more rational choices, as “Nudge” suggests. But if we really are wired to make certain kinds of mistakes, as Thaler and Sunstein and Ariely all argue, we will, it seems safe to predict, keep finding new ways to make them. (Ariely confesses that he recently bought a thirty-thousand-dollar car after reading an ad offering FREE oil changes for the next three years.)
If there is any consolation to take from behavioral economics (and this impulse itself probably counts as irrational), it is that irrationality is not always altogether a bad thing. What we most value in other people, after all, has little to do with the values of economics. (Who wants a friend or a lover who is too precise a calculator?) Some of the same experiments that demonstrate people’s weak-mindedness also reveal, to use a quaint term, their humanity. One study that Ariely relates explored people’s willingness to perform a task for different levels of compensation. Subjects were willing to help out (moving a couch, performing a tedious exercise on a computer) when they were offered a reasonable wage. When they were offered less, they were less likely to make an effort, but when they were asked to contribute their labor for nothing they started trying again. People, it turns out, want to be generous and they want to retain their dignity, even when it doesn’t really make sense.