This is an exciting day (but also a bit nerve-racking).
After a lot of hard work, “The Upside of Irrationality” is finally out, and now all I can do is stand by and see how people react to it. The Upside of Irrationality covers different topics from Predictably Irrational, but it is also much more personal (hence the nerve-racking part).
Here is a short intro to this book, and if you end up reading it, let me know what you think:
I am going to be on a book tour for a few weeks, and here is a list of places where I will be giving talks:
NEW YORK
TUESDAY, JUNE 1, 7:00 PM
B&N
2289 Broadway at 82nd Street
SAN FRANCISCO
THURSDAY, JUNE 3, 7:30 PM
Berkeley Arts & Letters
Hillside Club
2286 Cedar Street
Berkeley CA 94709
MONDAY, JUNE 7, 7:30 PM
The Booksmith
1644 Haight Street
San Francisco CA 94117
SEATTLE
TUESDAY, JUNE 8, 7:30 PM
Town Hall Seattle
1119 8th Avenue
Seattle, WA
WEDNESDAY, JUNE 9, NOON
Chamber of Commerce lunch
Rainier Square Conference Center
5th and University
BOSTON
THURSDAY, JUNE 10, 6:00 PM
Brattle Theater
40 Brattle Street
Cambridge
WASHINGTON, D.C.
SATURDAY, JUNE 12, 3:30 PM
Politics & Prose
5015 Connecticut Avenue NW
ST. LOUIS
MONDAY, JUNE 14, 7:00 PM
St. Louis County Library
1640 S. Lindbergh Blvd.
CHARLOTTE NC
TUESDAY, JUNE 15, 7:00 PM
Joseph Beth Booksellers
4345 Barclay Downs Dr.
DURHAM NC
MONDAY, JULY 19, 7:00 PM
Regulator Bookstore
720 9th Street
Durham, NC
When businesses want answers to questions in marketing, whom do they ask? Do they set up experiments to test their ideas, pitting the approach they think is most effective against alternatives? Do they survey consumers on a large scale? Do they go to experts who have questioned and requestioned their theories? Surprisingly, the answer is no. Most often, businesses rely on small “focus groups” to answer big questions: the intuitions of 10 to 12 laypeople with no relevant training, who ultimately have no idea what they’re talking about.
I wonder: how can this be a useful strategy? Why ask people who lack any proficiency in the subject when, by definition, experts are more knowledgeable about it and have experience that could actually be beneficial? And even if experts can be narrowly focused and tunnel-visioned, how can asking a focus group be better than carrying out real research?
Research in psychology and behavioral economics has shown time after time that people have bad intuitions. We are very good at explaining our behavior (even when it is shocking and irrational), and to do so we create neatly packaged stories – stories that may be amusing or provocative, but often have little to do with the real causes of our behaviors. Our actions are often guided by the primitive inner parts of our brain – parts that we can’t consciously access – and because of that we don’t always know why we behave in the ways we do; still, we compensate for this lack of information by writing our own versions. Our highly sophisticated prefrontal cortex (only recently developed, by evolutionary standards) takes the reins and paints a perfect picture to explain what we don’t know. Why did you buy that brand of fabric softener? Of course, because you love the way it makes your clothes smell like a springtime breeze when you pull them out of the warm dryer.
So, why do businesses go to our imagination when we know it’s just a cover for what’s really going on? Indeed, why do businesses go to the imaginations of a group of people to find real answers? I suspect that the story here is linked to another one of our irrationalities: as human beings, we have an insatiable need for a story. We love a vivid picture, a penetrating example, an anecdote that will stay in our memories. Nothing beats the feeling of knowledge we get from a personal story, because stories make us feel connected – they help us relate. Just one example of customer satisfaction has a stronger emotional impact than a statistic telling us that 87% of customers prefer product A over product B. A single example feels real, whereas numbers are cold and sterile. Although statistics about how a large group of people actually behave can tell us much more than the intuitions of a focus group, the allure of a story is irresistible. Our inherent bias toward the story compels us to believe in the worth of small numbers, even when we know we shouldn’t.
This “focus group bias” is not just a waste of money; it is also, most likely, a waste of resources when products are designed according to the “information” gathered from these groups. We need to find a way to base our judgments and decisions on real facts and data, even if they seem lifeless on their own. Maybe we should try to supplement the numbers with a story to quench our thirst for an anecdote, but what we can’t do is forget the facts in favor of fairy tales. In the end, the truth lies in empirical research.
—
After a short break, my podcast (Arming the Donkeys) is back.
As of today, the paperback version of Predictably Irrational is out.
I took out the parts about the stock market and added two new chapters: one about the effects of social norms, and one about the cycle of distrust in marketers and markets.
Sadly, I cannot distribute these digitally, but if you are ever in a bookstore or a library (or if you think that this is worth $10) …
Irrationally yours,
Dan
Jeff Monday has a unique talent for taking topics and explaining them in a simple, graphical way. Here is his approach to describing intrinsic motivation.
Thanks again Jeff…
A few years ago, a marketing team from a major consumer goods company came to my lab eager to test some new pricing mechanisms using principles of behavioral economics. We decided to start by testing the allure of “free,” a subject my students and I had been studying. I was excited: The company would gain insights into its customers’ decision making, and we’d get useful data for our academic work. The team agreed to create multiple websites with different offers and pricing and then observe how each worked out in terms of appeal, orders, and revenue.
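The multiple-website setup we agreed on is, at its core, a randomized pricing experiment: assign each visitor to one offer at random, then compare the variants on orders and revenue. Here is a minimal sketch of that design; the variant names, prices, and purchase probabilities are invented for illustration, not the company's actual offers.

```python
import random
from collections import defaultdict

# Hypothetical offer variants: names, prices, and response rates are made up.
VARIANTS = {
    "free_shipping": {"price": 12.00, "buy_prob": 0.30},
    "discounted":    {"price": 10.00, "buy_prob": 0.25},
    "control":       {"price": 12.00, "buy_prob": 0.20},
}

def run_experiment(n_visitors, seed=0):
    """Randomly assign visitors to variants and tally orders and revenue."""
    rng = random.Random(seed)
    results = defaultdict(lambda: {"visitors": 0, "orders": 0, "revenue": 0.0})
    names = list(VARIANTS)
    for _ in range(n_visitors):
        name = rng.choice(names)  # random assignment is the heart of the design
        cell = results[name]
        cell["visitors"] += 1
        if rng.random() < VARIANTS[name]["buy_prob"]:
            cell["orders"] += 1
            cell["revenue"] += VARIANTS[name]["price"]
    return dict(results)

if __name__ == "__main__":
    for name, cell in run_experiment(30000).items():
        rate = cell["orders"] / cell["visitors"]
        print(f"{name}: {cell['orders']} orders, "
              f"rate {rate:.2%}, revenue ${cell['revenue']:.2f}")
```

Because assignment is random, any systematic difference between the cells can be attributed to the offer itself rather than to who happened to see it, which is exactly the evidence intuition cannot supply.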
Several months later, right before we were due to go live, we had a meeting about the final details of the experiment—this time with a bigger entourage from marketing. One of the new members noted that because we were extending differing offers, some customers might buy a product that was not ideal for them, spend too much money, or get a worse deal overall than others. He was correct, of course. In any experiment, someone gets the short end of the stick. Take clinical medical trials, I said to the team. When testing chemotherapy treatments, some patients suffer more so that, down the road, others might suffer less. I hoped this put it in perspective. Fortunately, I said, price testing household products requires far less suffering than chemo trials.
But I could tell I was losing them. In a sense, I was impressed. It was a beautiful human sentiment they were conveying: We care about all customers and don’t want to treat any one of them unfairly. A debate ensued among the group: Are we willing to sacrifice some customers “just” to learn how the new pricing approaches work?
They hedged. They asked me what I thought the best approach was. I told them that I was willing to share my intuition but that intuition is a remarkably bad thing to rely on. Only an experiment gives you the evidence you need. In the end, it wasn’t enough to convince them, and they called off the project.
This is a typical case, I’ve found. I’ve often tried to help companies do experiments, and usually I fail spectacularly. I remember one company that was having trouble getting its bonuses right. I suggested they do some experiments, or at least a survey. The HR staff said no, it was a miserable time in the company. Everyone was unhappy, and management didn’t want to add to the trouble by messing with people’s bonuses merely for the sake of learning. But the employees are already unhappy, I thought, and the experiments would have provided evidence for how to make them less so in the years to come. How is that a bad idea?
Companies pay amazing amounts of money to get answers from consultants with overdeveloped confidence in their own intuition. Managers rely on focus groups—a dozen people riffing on something they know little about—to set strategies. And yet, companies won’t experiment to find evidence of the right way forward.
I think this irrational behavior stems from two sources. One is the nature of experiments themselves. As the people at the consumer goods firm pointed out, experiments require short-term losses for long-term gains. Companies (and people) are notoriously bad at making those trade-offs. Second, there’s the false sense of security that heeding experts provides. When we pay consultants, we get an answer from them and not a list of experiments to conduct. We tend to value answers over questions because answers allow us to take action, while questions mean that we need to keep thinking. Never mind that asking good questions and gathering evidence usually guides us to better answers.
Despite the fact that it goes against how business works, experimentation is making headway at some companies. Scott Cook, the founder of Intuit, tells me he’s trying to create a culture of experimentation in which failing is perfectly fine. Whatever happens, he tells his staff, you’re doing right because you’ve created evidence, which is better than anyone’s intuition. He says the organization is buzzing with experiments.
And so is that consumer goods company. A group there is studying consumer psychology and behavioral economics and is amassing evidence that’s impressive by any academic standard. Years after our false start, they’re recognizing the dangers of relying on intuition.
Question: what are God’s views on affirmative action, the death penalty and same-sex marriage? Answer: whatever you want them to be.
That’s according to a recent study by Nicholas Epley, Benjamin Converse, Alexa Delbosc, George Monteleone, and John Cacioppo, which found that we tend to ascribe our own views to God.
Past studies have shown that when we reason about other people, we form an opinion of their views based on two sources: egocentric info (i.e., what we ourselves believe) and outside clues (what the other person has said and done, and what others have said about them).
Here, the researchers wanted to find out how much we rely on egocentric info to construe other people’s views, including God’s. To that end, they had devout American participants provide their personal views on various issues (abortion, death penalty, Iraq war, etc.), as well as what they thought were the views of others (Katie Couric, George Bush, the average American, God, etc.).
When the researchers compared participants’ personal views with the participants’ estimates of others’ views, they found one significant pattern: there was a correlation between participants’ personal views and their estimates of God’s view. For example, participants who said they were for same-sex marriage tended to also say that God was for same-sex marriage. And participants who said they were against same-sex marriage tended to also say that God was against same-sex marriage.
But this wasn’t the case for the other figures – Couric, Bush, average American, and so forth. Participants who said they were for same-sex marriage were statistically neither more nor less likely to say that Couric was for same-sex marriage than those who held the opposite view. In other words, what I say Couric thinks has nothing to do with what I myself think. But what I say God thinks has lots to do with what I myself think.
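The pattern in the two paragraphs above is a simple correlational one, and it can be made concrete with a toy example. Suppose each participant rates their own position on an issue on a 1-to-7 scale, along with their estimates of God's position and of some other person's position. All the numbers below are invented for illustration; only the shape of the result (high correlation with God's estimated view, near-zero with the other's) mirrors the study.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical ratings on a 1-7 scale for ten participants.
own_view       = [1, 2, 3, 4, 5, 6, 7, 2, 6, 4]  # participants' own positions
god_estimate   = [1, 3, 3, 4, 6, 6, 7, 2, 5, 4]  # tracks own view closely
other_estimate = [4, 5, 3, 4, 4, 5, 3, 4, 5, 4]  # roughly unrelated to own view

print(f"own vs. God:   r = {pearson(own_view, god_estimate):.2f}")
print(f"own vs. other: r = {pearson(own_view, other_estimate):.2f}")
```

Run on this toy data, the first correlation comes out high and the second near zero, which is the signature the researchers found for God versus the other figures.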
But correlation doesn’t imply causation, so to shed light on the direction of causality, the researchers ran two follow-up experiments. This time, instead of just surveying participants for current views, they induced participants to change their personal views by randomly assigning them to give speeches for or against the issue (death penalty) in front of a camera. Because it was random assignment, some people ended up arguing for their personal view, while others argued against it (many past studies have shown that in this context, people tend to shift their own opinions in a direction consistent with the speech they delivered). So, what about the other views (God’s, Couric’s etc.) – would the participant revise those as well?
Yes and no. The only other view that changed was God’s. As participants’ own views changed, so did their estimates of God’s view. The participant who started out very much for the death penalty but took on a more moderate view after arguing against the death penalty on camera also ascribed a more moderate view to God. But his estimates of the others’ views remained unchanged.
Overall, these results suggest that God is a blank slate onto which we project whatever we choose. We join religious communities that argue for our viewpoints, and we interpret religious readings to support our personal positions.
Irrationally Yours,
Dan
P.S. And happy birthday to my little sister Tali.
Jeff Monday has a unique talent for taking topics and explaining them in a simple, graphical way. Here is his approach to describing relativity and immediate gratification.
Thanks Jeff…
The heat of the moment is a powerful, dangerous thing. We all know this. If we’re happy, we may be overly generous. Maybe we leave a big tip, or buy a boat. If we’re irritated, we may snap. Maybe we fire off that nasty e-mail to the boss, or punch someone. And for that fleeting second, we feel great. But the regret—and the consequences of that decision—may last years, a whole career, or even a lifetime.
At least the regret will serve us well, right? Lesson learned—maybe.
Maybe not. My friend Eduardo Andrade and I wondered if emotions could influence how people make decisions even after the heat or anxiety or exhilaration wears off. We suspected they could. As research going back to Festinger’s cognitive dissonance theory suggests, the problem with emotional decisions is that our actions loom larger than the conditions under which the decisions were made. When we confront a situation, our mind looks for a precedent among past actions without regard to whether a decision was made in emotional or unemotional circumstances. Which means we end up repeating our mistakes, even after we’ve cooled off.
I said that Eduardo and I wondered if past emotions influence future actions, but, really, we worried about it. If we were right, and recklessly poor emotional decisions guide later “rational” moments, well, then, we’re not terribly sophisticated decision makers, are we?
To test the idea, we needed to observe some emotional decisions. So we annoyed some people, by showing them a five-minute clip from the movie Life as a House, in which an arrogant boss fires an architect who proceeds to smash the firm’s models. We made other subjects happy, by showing them—what else?—a clip from the TV show Friends. (Eduardo’s previous research had established the emotional effects of these clips).
Right after that, we had them play a classic economics game known as the ultimatum game, in which a “sender” (in this case, Eduardo and I) has $20 and offers a “receiver” (the movie watcher) a portion of the money. Some offers are fair (an even split) and some are unfair (you get $5, we get $15). The receiver can either accept or reject the offer. If he rejects it, both sides get nothing.
Traditional economics predicts that people—as rational beings—will accept any offer of money rather than reject an offer and get zero. But behavioral economics shows that people often prefer to lose money in order to punish a person making an unfair offer.
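The rules and the two competing predictions can be sketched in a few lines. The $20 pot and the example offers come from the text above; the fixed 30% "fairness threshold" in the second receiver is my own toy stand-in for the behavioral pattern, not a model from the study.

```python
POT = 20  # the sender's endowment, as in the text

def payoffs(offer_to_receiver, accepted):
    """Return (sender, receiver) payoffs for one round of the ultimatum game."""
    if accepted:
        return POT - offer_to_receiver, offer_to_receiver
    return 0, 0  # rejection leaves both sides with nothing

def rational_receiver(offer):
    """Traditional economics: accept any positive amount over nothing."""
    return offer > 0

def fairness_receiver(offer, threshold=0.3):
    """Behavioral pattern: reject offers below a (hypothetical) fairness cutoff."""
    return offer >= threshold * POT

for offer in (10, 5):  # an even split and the unfair $5/$15 split
    print(f"offer ${offer}:",
          "rational ->", payoffs(offer, rational_receiver(offer)),
          "| fairness-minded ->", payoffs(offer, fairness_receiver(offer)))
```

The unfair $5 offer is the interesting case: the rational receiver walks away with $5, while the fairness-minded receiver pays $5 out of pocket for the satisfaction of leaving the sender with nothing.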
Our findings (published in Organizational Behavior and Human Decision Processes) followed suit, and, interestingly, the effect was amplified among our irritated subjects. Life as a House watchers rejected far more offers than Friends watchers, even though the content of the movie had nothing to do with the offer. Just as a fight at home may sour your mood, increasing the chances that you’ll send a snippy e-mail, being subjected to an annoying movie leads people to reject unfair offers more frequently even though the offer wasn’t the cause of their mood.
Next came the important part. We waited. And when the emotions evoked by the movie were no longer a factor, we had the participants play the game again. Our fears were confirmed. Those who had been annoyed the first time they played the game rejected far more offers this time as well. They were tapping the memory of the decisions they had made earlier, when they were responding under the influence of feeling annoyed. In other words, the tendency to reject offers remained heightened among our Life as a House group—compared with control groups—even when they were no longer irritated.
So now I’m thinking of the manager whose personal portfolio loses 10% of its value in a week (entirely plausible these days). He’s frustrated, angry, nervous—and all the while, he’s making decisions about the day-to-day operations of his group. If he’s forced to attend to those issues right after he looks at his portfolio, he’s liable to make poor decisions, colored by his inner turmoil. Worse, though, those poor decisions become part of the blueprint for his future decisions—part of what his brain considers “the way to act.”
That makes those strategies for making decisions in the heat of the moment even more important: Take a deep breath. Count backward from 10 (or 10,000). Wait until you’ve cooled off. Sleep on it.
If you don’t, you may regret it. Many times over.