Decisions and Desire
When we make decisions, we may not always be in charge. We can be too impulsive or too deliberate. Sometimes our emotions get the better of us; at other times we're paralyzed by uncertainty. And sometimes we pull a brilliant decision out of thin air and wonder how we did it. Though we may not understand how decision making happens, neuroscientists peering into our brains are beginning to build an understanding. What they are finding may not be what you want to hear, but it's worth your while to listen.
The closer scientists look, the clearer it becomes how much we're like animals. We have dog-like brains with a human cortex stuck on top. This cortex is an evolutionarily recent invention that plans, deliberates, and decides. But not a second goes by that our ancient dog brains aren't conferring with our modern cortexes, influencing their choices without our even knowing it.
Using scanning devices that measure the brain's activity, scientists can glimpse how the different parts of our brain, ancient and modern, collaborate and compete when we make decisions. Science is not going to produce a full road map anytime soon for good decision making or for manipulating people's decisions. But the more we understand about how we make decisions, the better we can manage them.
Into the Deep
Consider what's going on in the brain when people play the ultimatum game. For example, one player has $10 to split with a second player—let's say you're the recipient. She can offer you any amount, from zero to $10, and she gets to keep the change—but only if you accept her offer. You are free to reject any offer, but if you do, neither of you gets anything. According to game theory, you should accept whatever she offers, however measly, because getting some money is better than getting none.
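The game's payoff rule is simple enough to sketch in a few lines of code (an illustrative toy, not any study's actual protocol):

```python
def ultimatum_payoffs(offer, accepted, pot=10):
    """Payoffs (proposer, responder) for one round of the
    ultimatum game: reject and both get nothing; accept and
    the proposer keeps whatever she didn't offer."""
    if not accepted:
        return (0, 0)
    return (pot - offer, offer)

# Game theory says take even a measly offer:
print(ultimatum_payoffs(1, accepted=True))   # (9, 1): a free dollar
print(ultimatum_payoffs(1, accepted=False))  # (0, 0): spite costs you that dollar
```

Rejecting a $1 offer trades a sure dollar for nothing, which is exactly why game theory calls the rejection irrational.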
Of course, it doesn’t work like that. In these experiments, when the offer dwindles to a few dollars, people on the receiving end consistently turn it down, forfeiting a free couple of bucks for—well, for what, exactly? Ask these participants and they’ll tell you, in so many words, that they rejected the lowball offer because they were ticked off at the stingy partner (who, remember, loses her share, too). Not exactly a triumph of reason. This sounds like a dog brain at work, and it is.
Alan Sanfey, a cognitive neuroscientist at the University of Arizona, and colleagues used MRI scans to look into people’s brains while they played this game. As offers became increasingly unfair, the anterior insula, a part of the animal brain involved in negative emotions including anger and disgust, became more and more active, as if registering growing outrage. Meanwhile, part of the higher brain, an area of the prefrontal cortex involved in goal orientation (in this case, making money) was busy, too, assessing the situation. By tracking the activity of these two regions, Sanfey mapped what appeared to be a struggle between emotion and reason as each sought to influence the players’ decisions. Punish the bastard? Or take the money, even though the deal stinks? When the disgusted anterior insula was more active than the rational goal-oriented prefrontal cortex—in a sense, when it was shouting louder—the players rejected the offer. When the prefrontal cortex dominated, the players took the money.
Experiments like these illuminate the aggressive participation of our emotion-driven animal brains in all kinds of decision making. And they’re beginning to expose the complex dance of primitive brain circuits involved in feelings of reward and aversion as we make choices. In the ultimatum game, it certainly looks as if our dog brains sometimes hijack our higher cognitive functions to drive bad or, at least, illogical decisions. But, as we shall see, our animal brains play an important part in rational decision making as well.
Emotion and Reason
Most of us are taught from early on that sound decisions come from a cool head. The last thing one would want is the intrusion of emotions into the methodical process of decision making. The high-reason view assumes that formal logic will, by itself, get us to the best available solution for any problem and that, to obtain the best results, emotions must be kept out. Yet patients with damage to the part of the prefrontal cortex that processes emotions often struggle with making even routine decisions.
A patient named Elliot was among the first to raise this strange possibility in the mind of neuroscientist Antonio Damasio, some 20 years ago. Elliot had been an exemplary husband, father, and businessman. But he began to suffer from severe headaches and lose track of work responsibilities. Soon, his doctors discovered an orange-sized brain tumor pushing into his frontal lobes, and they carefully removed it, along with some damaged brain tissue. It was during his recovery that family and friends discovered (as Damasio put it) that “Elliot was no longer Elliot.” Though his language and intelligence were fully intact, at work he became distractible and couldn’t manage his schedule. Faced with an organizational task, he’d deliberate for an entire afternoon about how to approach the problem. Should he organize the papers he was working on by date? The size of the document? Relevance to the case? In effect, he was doing the organizational task too well, considering every possible option—but at the expense of achieving the larger goal. He could no longer effectively reach decisions, particularly personal and social ones, and despite being repeatedly shown this flaw, he could not correct it.
Though brain scans revealed isolated damage to the central (or ventromedial) portion of Elliot’s frontal lobes, tests showed that his IQ, memory, learning, language, and other capacities were fine. But when Elliot was tested for emotional responses, the true nature of his deficit emerged. After viewing emotionally charged images—pictures of injured people and burning houses—Elliot revealed that things that had once evoked strong emotions no longer stirred him. He felt nothing.
Damasio and his colleagues have since studied over 50 patients with brain damage like Elliot’s who share this combination of emotional and decision-making defects. And researchers have found that patients with injuries to parts of the limbic system, an ancient group of brain structures important in generating emotions, also struggle with making decisions. There’s something critical to decision making in the conversation between emotion and reason in the brain, but what?
Call it gut. Or hunch. Or, more precisely, “prehunch,” to use Damasio’s term. In a famous series of experiments designed by Damasio’s colleague Antoine Bechara at the University of Iowa, patients with Elliot’s emotion-dampening type of brain damage were found to be unusually slow to detect a losing proposition in a card game.
In the game, players picked cards from red and blue decks, winning and losing play money with each pick. The players were hooked up to lie-detector-like devices that measure skin conductance response, or SCR, which climbs as your stress increases and your palms sweat. Most players get a feeling that there’s something amiss with the red decks after they turn over about 50 cards, and after 30 more cards, they can explain exactly what’s wrong. But just ten cards into the game, their palms begin sweating when they reach for the red decks. Part of their brains knows the red decks are a bad bet, and they begin to avoid them—even though they won’t consciously recognize the problem for another 40 cards and won’t be able to explain it until 30 cards after that. Long before players have a hunch about the red decks, a subconscious prehunch warns them away.
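The arithmetic behind the rigged decks can be sketched like this (the payoff numbers are hypothetical stand-ins, not Bechara's actual schedule):

```python
def expected_value(payoffs):
    """Expected winnings per card, given (amount, probability) pairs."""
    return sum(amount * p for amount, p in payoffs)

# Hypothetical schedules: red pays big but loses bigger;
# blue pays modestly and comes out ahead.
red  = [(100, 0.5), (-150, 0.5)]
blue = [(50, 0.5), (-25, 0.5)]

print(expected_value(red))   # -25.0 per card: the losing proposition
print(expected_value(blue))  # 12.5 per card: the decks to favor
```

Players' sweating palms at card ten suggest that some part of the brain is tracking exactly this kind of running tally long before the conscious mind catches on.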
Though the brain-damaged patients eventually figured out that the red decks were rigged against them, they continued to pick red cards. What were they missing? The injured parts of their brains in the prefrontal cortex seemed unable to process the emotional signals that guide decision making. Without this emotion interpreter pushing them in the right direction (toward the winning decks), these patients were unable to act on what they knew. They couldn’t decide, apparently, what was in their own best interest. You could say they lacked good judgment.
Risk and Reward
You don’t have to be a neuroscientist to see how the emotional brain can badly distort judgment. Just ask any parent. From the toddler climbing the shelves to get candy to the teenager sneaking off for unprotected sex, kids have a dangerous shortage of common sense. Their bad behavior often looks consciously defiant (and sometimes it is), but the real problem may be that their brains haven’t yet developed the circuitry that judiciously balances risks and rewards to yield level-headed decisions. This is where the neuroscientists can offer special insight.
The brain’s frontal lobes, so critical to decision making, don’t fully mature until after puberty. Until then, the neuronal wiring that connects the prefrontal cortex to the rest of the brain is still under construction. Meanwhile, the parts of the brain that incite impulsive behavior seem particularly primed in teenagers. For instance, Gregory Berns and colleagues at Emory University found that certain still-developing circuits in adolescents’ brains become hyperactive when the kids experience pleasurable novel stimuli. An adolescent’s brain is wired to favor immediate and surprising rewards, even when the teen knows full well that pursuing them may be a bad idea.
In a sense, teenagers have yet to complete the wiring that manifests as willpower. The prefrontal cortex, it appears, is the seat of willpower—the ability to take the long-term perspective in evaluating risks and rewards. As such, this area of the brain is in close contact with the structures and circuits of the emotional animal brain that seek gratification and alert us to danger.
Much of the traffic between the primitive and modern parts of our brains is devoted to this ongoing calculation of risks and rewards. Though animals’ reward and aversion circuitry is a lot like ours, unlike most animals, we can contemplate what might flow from a decision to chase immediate gratification. And we can get immediate pleasure from the prospect of some future gratification.
Thrill of the Hunt
Whether it’s reacting to a sexual conquest, a risky business deal, or an addictive drug, the brain often distinguishes clearly between the thrill of the hunt and the pleasure of the feast.
The brain’s desire for rewards is a principal source of bad judgment, in teenagers and adults alike. But it would be wrong to assign blame for ill-advised reward seeking to a single part of the brain. A network of brain regions motivates us to search for things we like and lets us know when we’ve found them. The brain regions that respond to cocaine or morphine are the same ones that react to the prospect of getting money and to actually receiving it. It’s perhaps no surprise that chocolate, sex, music, attractive faces, and sports cars also arouse this reward system. Curiously, revenge does too, as we shall see.
The reward circuits depend on a soup of chemicals to communicate, chief among them the neurotransmitter dopamine. Dopamine is often referred to as the brain’s “pleasure chemical,” but that’s a misnomer. It’s more of a pleasure facilitator or regulator. It helps to regulate the brain’s appetite for rewards and its sense of how well rewards meet expectations.
Well-regulated appetites are crucial to survival. Without these drives, our ancestors wouldn’t have hunted for food or pursued sexual partners, and you wouldn’t be here to read this article. By the same token, unchecked reward seeking isn’t very adaptive either, as patients with disrupted dopamine systems demonstrate. Consider what happened to Bruce, a computer programmer who had no history of psychiatric problems. Bruce had never been a gambler, but at the age of 41, he abruptly began gambling compulsively, frittering away thousands of dollars over the Internet in a matter of weeks. He began to shop compulsively, too, buying things that he neither needed nor wanted. And to his wife’s growing alarm, he began to demand sex several times a day.
Bruce’s story would be little more than a footnote in the medical literature but for one twist: He had Parkinson’s disease, and just before his compulsions began, his neurologist had added a new drug to his regimen—pramipexole—which relieves the tremors of the disease by mimicking dopamine. When Bruce described his worrisome new passions to his neurologist, the doctor, suspecting the pramipexole might be involved, advised him to reduce his dose. Bruce stopped taking the drug altogether, and two days later, his desires—to gamble, to shop, to have sex many times a day—simply vanished. It was, he said, “like a light switch being turned off.”
Cases like Bruce’s reveal the extraordinary power of our dopamine-fueled appetite for rewards—as distinct from the rewards themselves—to ride roughshod over reason. But what about the rest of us who go about our reward-seeking business in apparently more balanced ways? We clearly do a better job of weighing trade-offs than Bruce did, but much of the same circuitry is at work—and, as such, sometimes our pursuits aren’t as rational as we think they are.
Show Me the Money
Economists have assumed that people work because they place value on the things money can buy (or, in economic terms, they gauge “utility”). But neuroscience studies show how chasing money is its own reward. In one set of experiments, Stanford neuroscientist Brian Knutson used MRI to watch subjects’ brains as they reacted to the prospect of receiving money. Among the brain regions that lit up in this experiment was the nucleus accumbens, signaling in its primitive way, “You want this.” (Rats with electrodes implanted near the accumbens will press a lever to stimulate the area until they drop from exhaustion.) The higher the potential monetary reward, the more active the accumbens became. But activity ceased by the time the subjects actually received the money—suggesting that it was the anticipation, and not the reward itself, that aroused them.
As Knutson puts it, the nucleus accumbens seems to act as a gas pedal that accelerates our drive for rewards, while the relevant part of the prefrontal cortex is the steering wheel that directs reward seeking toward specific goals. When it comes to making money, having the accumbens on the gas pedal is often desirable—it motivates high performance at work among other things. But when you step on the gas, you want to be pointed in the right direction.
It’s no surprise that the prospect of money or food or sex stimulates our reward circuits. But revenge? Consider Clara Harris. Her name may not ring a bell, but her case probably will. Harris is the Houston dentist who, upon encountering her husband and his receptionist-turned-mistress in a hotel parking lot in 2002, ran him down with her Mercedes. What was she thinking? According to an Associated Press report at the time of her murder conviction in 2003, Harris testified, “I didn’t know who was driving…everything seemed like a dream.” As she put it, “I wasn’t thinking anything.”
No one can know exactly what was going on in Harris’s mind when she hit the accelerator. But her own testimony and the jury’s conclusion that she acted with “sudden passion” suggest a woman in a vengeful rage whose emotional brain overwhelmed any rational deliberation. We do know that a desire to retaliate, to punish others’ bad behavior, however mild, even at personal cost, can skew decision making. Recall the ultimatum game, in which a player could accept or reject another player’s offer of money. Alan Sanfey’s brain scans of people feeling vengeful in these games show how (at least in part) a sense of moral disgust manifests in the brain. But anyone who has settled a score knows that a desire for vengeance is more than an angry response to a bad feeling. Revenge, as they say, is sweet—even contemplating it is.
When University of Zurich researchers Dominique J.F. de Quervain, Ernst Fehr, and colleagues scanned subjects with a PET device during an ultimatum-like game, they found certain reward circuits in the brain’s striatum activated when players anticipated, and then actually punished, ill-behaved partners. What’s more, the greater the activation of the striatum, the greater the subjects’ willingness to incur costs for the opportunity to deliver punishment. At the same time, the researchers saw activation in the medial prefrontal cortex, the deliberative part of the higher brain that’s thought to weigh risks and rewards. Once again, neuroscientists seem to have caught on camera an engagement between the emotional and reasoning parts of the brain.
These same brain regions—the reward-seeking striatum and the deliberative prefrontal cortex, both of which are activated by the pleasing possibility of revenge—also light up when people anticipate giving rewards to partners who cooperate. Though the players’ behaviors are opposite—bestowing a reward versus exacting punishment— their brains react in the same way in eager anticipation of a satisfying social experience.
Fear and Loathing
Like the brain’s reward circuits, its systems for sensing and making decisions about risks are powerful and prone to error. Often this fact confronts us directly. Many people, for instance, have a paralyzing fear of flying that’s unrelated to its true risks. All the time, people make the irrational decision to travel by car rather than fly, believing on a gut level that it’s safer, even though they know it’s not.
This behavior is partly the work of the amygdala, a structure near the base of the brain. Colin Camerer, a behavioral and experimental economist at the California Institute of Technology, calls the amygdala an “internal hypochondriac,” which provides quick and dirty emotional signals in response to potential threats. It’s also been called the “fear site,” responsible for both producing fear responses and learning from experience to be afraid of certain stimuli. The amygdala responds instantaneously to all manner of perceived potential threats and pays particular attention to social cues. This leads to good and, often, very bad decisions.
Face Your Fear
Look at how the amygdala influences first impressions: Brain-scanning experiments show that it activates when people see spiders, snakes, frightening expressions, faces that look untrustworthy—and faces of another race. It’s easy to see how a “that’s a threat” response to a snake could guide good decisions, particularly a million years ago out on the savanna. But a gut reaction that says “watch out” when you see a face of a different race?
MRI studies have shown that the amygdala becomes more active when whites see black faces than when they see white faces; similarly, in blacks, the amygdala reacts more to white faces than black ones. Taken alone, this finding says nothing about people’s conscious attitudes. But research by Harvard social ethicist Mahzarin Banaji and colleagues shows that even people who consciously believe they have no racial bias often do have negative unconscious feelings toward “outgroups”—people not like themselves. Investigators have found, too, that the greater a person’s unconscious bias on these tests, the more active the amygdala.
On the one hand, we should be happy that our amygdalas warn us of potential dangers before our conscious brains grasp that something’s amiss. But a brain circuit that was indispensable to our ancestors, warning them away from legitimate threats like snakes, certainly contributes to an array of bad and irrational decisions today. In the case of our readiness to fear outgroups, think of the countless missed opportunities and just plain bad decisions made by good people who consciously hold no racial biases but who nonetheless have gone with an inchoate gut sense to withhold a job offer, deny a promotion, or refuse a loan because their amygdalas, for no good reason, said, “Watch out.”
Wheel of Misfortune
The amygdala’s role in warning us about perils real and imagined seems to extend even to the threat of losing money. In the lab of Hans Breiter, a neuroscientist at Massachusetts General Hospital, researchers monitored brain activity while volunteers watched images of roulette-like wheels, each with a spinning arrow that would come to a stop on a particular dollar amount, either a gain, a loss, or zero. It was obvious at a glance that some wheels were likely to produce dollar wins while others were clearly losers. When the losing wheels spun, subjects’ amygdalas activated even before the arrows stopped, signaling their discomfort about the losses they saw coming.
Beyond the amygdala, the brain has another risk-aversion region that steers us from disagreeable stimuli. Recall in the ultimatum game that it was the anterior insula that reacted with disgust to the other player’s rotten offer; this region also activates when people think they’re about to experience pain or see something shocking. Like our reward-seeking circuitry, loss-avoidance circuits involving the amygdala and anterior insula serve us well—when they’re not driving us to overreact and make bad decisions.
Consider investment decisions. Investors who should be focused on maximizing utility routinely take risks when they shouldn’t and don’t take risks when they should. (Among the biases that skew utility seeking is that people weigh equivalent losses and gains differently; that is, they feel better about avoiding a $100 loss than securing a $100 gain.) To see what’s going on in their heads when people make bad investment choices, Stanford researchers Camelia Kuhnen and Brian Knutson had volunteers play an investment game while their brains were scanned with MRI.
In the game, volunteers chose among two different stocks and a bond, adjusting their picks with each round of the game based on the investments’ performance in the previous round. While the bond returned a constant amount, one stock was more likely to make money over a series of trades (the “good” stock) and the other to lose money (the “bad” stock). Kuhnen and Knutson found that, even when players had developed a sense of which was the good stock, they’d still often head for the riskless bond after they’d made a losing stock choice—what the researchers called a risk-aversion mistake. In other words, even though they should have known to pick the good stock on each round, when they got stung with a loss they’d often irrationally retreat.
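A toy version of the choice shows why fleeing to the bond is a mistake (the probabilities and payoffs here are invented for illustration; Kuhnen and Knutson's actual parameters differ):

```python
def expected_return(asset):
    """Expected payoff per round for each asset in a toy version
    of the investment game (illustrative numbers only)."""
    payoffs = {
        "good_stock": (10, 0.75, -10, 0.25),  # likely winner
        "bad_stock":  (10, 0.25, -10, 0.75),  # likely loser
        "bond":       (1, 1.0, 0, 0.0),       # riskless constant return
    }
    win, p_win, loss, p_loss = payoffs[asset]
    return win * p_win + loss * p_loss

print(expected_return("good_stock"))  # 5.0 per round
print(expected_return("bond"))        # 1.0 per round
# Once you've identified the good stock, retreating to the bond after
# one stinging loss trades 5.0 in expectation for a sure 1.0: the
# risk-aversion mistake.
```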
The MRI scans revealed this risk aversion unfolding. Prior to choosing the safety of the bond, the players’ anterior insulas would activate, signaling their (perhaps not-yet-conscious) anxiety. In fact, the more active this primitive risk-anticipating brain region, the more risk averse the players were—often to their own detriment.
Know Your Brain
Controversial though some of his ideas may be, Freud wasn’t so far off when he posited the struggle between the animalistic id and the rational superego. But he may have been too generous in his assessment of the superego’s ability to channel our emotions. Neuroscientists are showing that the emotional and deliberative circuits in the brain are in constant interaction (some would say struggle), and the former, for better or for worse, often holds sway. What’s more, with each new study it becomes clearer just how quickly, subtly, and powerfully our unconscious impulses work. Flash a picture of an angry or a happy face on a screen for a few hundredths of a second, and your amygdala instantly reacts—but you, your conscious self, have no idea what you saw.
Breiter believes the more we learn about the brain science of motivation, the more readily it can be applied in business. “People’s decision-making and management styles probably arise from common motivational impulses in the brain,” he points out. “If a manager is hardwired to be more risk seeking, or risk avoiding, or more driven to pursue a goal than to achieve it, that’s going to affect how he manages and makes decisions.” With our increasingly clear understanding of how basic motivations affect conscious decisions, Breiter says, it should be possible to tailor incentives accordingly. A manager who shows a preference for the hunt might, for instance, be well served by incentives that increase his motivation to reach goals rather than simply chase them.
Neuroscience research also teaches us that our emotional brains needn’t always operate beneath our radar. Richard Peterson, a psychiatrist who applies behavioral economics theory in his investment consulting business, advises clients to cultivate emotional self-awareness, notice their moods as they happen, and reflect on how their moods may influence their decisions. In particular, he advises people to pay close attention to feelings of excitement (a heightened expression of reward seeking) and fear (an intense expression of loss aversion) and ask, when such a feeling arises, “What causes this? Where did these feelings come from? What is the context in which I’m having these feelings?” By consciously monitoring moods and the related decisions, Peterson says, people can become more savvy users of their gut feelings.
This advice may sound familiar; it lies at the heart of books like Malcolm Gladwell’s Blink and Gary Klein’s The Power of Intuition, which promise to help readers harness their gut feelings. But for executives taught to methodically frame problems, consider alternatives, collect data, weigh the options, and then decide, cultivating emotional self-awareness may seem like a dispensable exercise—or at least not a critical tool in decision making. The picture emerging from the neuroscience labs is that you ignore your gut at your own peril. Whether you’re negotiating an acquisition, hiring an employee, jockeying for a promotion, granting a loan, trusting a partner—taking any gamble—be aware that your dog brain is busy in increasingly predictable, measurable ways with its own assessment of the situation and often its own agenda. You’d better be paying attention.