Thinking About How We Think
The Role of Emotion in Decision-Making
It’s a long-standing myth that emotion and logic operate independently, but it takes effort to be sure they’re working together in our best interest. How can we leverage the gift of human emotion to make well-considered decisions?
In Western cultures, it’s rarely a compliment to be told we’ve made “an emotional decision,” or even to be thought of as “an emotional person.” We’re conditioned to think that making good decisions means stripping all emotion from the equation in favor of purely “rational” thought, despite the fact that we know this isn’t how the brain works.
The truth is, all people are emotional in the sense that we’re born with a rich capacity for emotion, unless something rare or traumatic intervenes to prevent that. Barring such an intervention, however, we can’t think rationally without an emotional component—and we shouldn’t hope to. It’s true that an overwhelming emotion can overpower our capacity for deliberation and self-control if we allow it, but it’s not true that emotion is the enemy, or that we make good decisions by reason alone.
So what do people mean when they talk about “emotional decision-making”?
We often use this phrase to describe knee-jerk reactions driven by a strong emotion that’s perceived as negative, such as anger or outrage, or an emotion that activates (or is activated by) a subtle bias we haven’t paused long enough to identify. All it takes is a cursory look at human history to find widespread tragic examples of this. It’s how entire populations have been led to participate in the inhumane treatment of others, and it’s been a recurring pattern in the rise and fall of many nations.
Charismatic personalities seem to have no trouble manipulating their followers simply by activating the normal biases we’re all subject to. And while we might expect our emotions to be the prime doorway to biased thinking, that’s not always the case. Sometimes a seemingly rational argument can be just as effective a tool of manipulation. The common misconception that brain functions are so specialized that our inner world can be divided between so-called rational thinking and emotion has created a false dichotomy. We’re not limited to a choice between emotions and rational thinking. In fact, as we’ve already noted, such a choice doesn’t exist.
Since both aspects of our inner world work in tandem, and both are subject to bias, it’s crucial to understand the relationship between them. Together, these two powerfully intertwined aspects of the human mind make up our capacity for reasoning. If we don’t understand this and aren’t alert to the influences that can trip us up, our capacity for critical thinking can be hijacked, and our conclusions derailed, by people with dubious intentions who do understand it.
This is true no matter who we are. We often hear people say that men aren’t as emotional as women and are therefore more logical in their thinking, but this long-standing myth is easily disproven. Like many common false beliefs, it may sound true to our personal experience, but it requires careful consideration and critical thinking to understand why it isn’t so. Some of the first studies of emotion and gender relied on self-report. That method doesn’t take into account that boys are told from a young age that “boys don’t cry” and should “toughen up,” while it’s more accepted for girls to freely express emotion. Men who are raised to restrain their emotions are unlikely to recognize—much less report—what they’re feeling.
On the other hand, when emotional differences are studied using fMRI (functional magnetic resonance imaging), we see that men and women have the same initial emotional reactions to events, but they may regulate them somewhat differently. In general, women seem to regulate negative emotions by using positive ones to reappraise them, while men tend to regulate negative emotions by monitoring and restricting them—which makes sense when we remember they’ve likely each been trained from childhood to respond in these ways.
Fortunately, the full range of human feeling isn’t restricted by gender; it’s available to all of us with practice, and it’s a rich legacy. Researcher Brené Brown names 87 emotions in Atlas of the Heart, but she doesn’t claim her list is complete. What she does say is this: “When we don’t understand how our emotions shape our thoughts and decisions, we become disembodied from our own experiences and disconnected from each other. . . . Having access to the right words can open up entire universes.”
“Without understanding how our feelings, thoughts, and behaviors work together, it’s almost impossible to find our way back to ourselves and each other.”
This is good news when it comes to all aspects of the human experience, including decision-making. We can all use emotional intelligence, which is so important to creativity and innovation, to help us find solutions to problems that require critical thinking. All of us are capable of harnessing both the emotional awareness and the self-control required to carefully consider the biases and emotional states that can influence our decisions.
So how do we do that?

Rethinking Our Emotional Triggers
One important challenge in thinking critically about our decisions involves how we respond to triggering words, phrases or ideas. These are terms or concepts that act as emotional tripwires as soon as we hear them, whether they’re political buzzwords, loaded phrases, or ideas that seem to conflict with our deeply held beliefs. When we encounter these triggers, our brain’s threat-response system can have us reacting in indignation before we’ve had time to wonder whether the emotion is justified or whether a different one (or even a mix of emotions) might be called for instead. This automatic process is natural but becomes a problem when it prevents us from engaging in productive dialogue or considering complex perspectives.
Ancient Wisdom, Modern Understanding
The ancient wisdom literature of the Bible offers a surprisingly nuanced view of emotion and decision-making. The biblical concept of wisdom embodies both emotional and intellectual facets. For instance, when Solomon asked for wisdom, he received what Scripture calls “a wise and understanding heart” (1 Kings 3:12, New King James Version throughout unless otherwise noted). Also, “the heart of the wise teaches his mouth, and adds learning to his lips” (Proverbs 16:23).
These examples, and others like them, suggest an intimate connection between emotion and wisdom. Neuroscientific findings on how emotion and reason—inseparable aspects of the mind—work together in decision-making bear out this holistic view.
Daniel Kahneman’s work on how to reduce noise and bias in judgment echoes ancient warnings about the importance of thoughtful consideration before action: “The heart of the righteous studies how to answer, but the mouth of the wicked pours forth evil” (Proverbs 15:28).
As we navigate increasingly complex decisions in our modern world, it can be helpful to note where current research aligns with ancient wisdom. Church leaders encouraged Jesus’ followers to deepen their understanding of the intricate dance between emotion and reason, “that your love may abound even more and more in knowledge and every kind of insight so that you can decide what is best” (Philippians 1:9–10, NET Bible).
This mandate is as applicable today as ever.
The good news is that we can train ourselves to be more conscious of emotional triggers so they don’t lead us to the wrong assumptions; but first we have to be willing to question our initial responses. To understand how important it is to stop and question, it’s helpful to understand a little bit about how we come to conclusions.
Daniel Kahneman (1934–2024) was an Israeli-American psychologist well known for the work he did with Amos Tversky on decision-making. Though Tversky died in 1996, their research won Kahneman the 2002 Nobel Prize in Economics. They uncovered two modes of human thinking, which Kahneman (borrowing terms from psychologists Keith Stanovich and Richard West) called simply “System 1” and “System 2.” In his book Thinking, Fast and Slow, Kahneman summarized their roles: “System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it. . . . The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.”
Kahneman described System 1 as the effortless origin of the deliberate choices and beliefs arrived at by System 2. It’s shaped to some degree by processes that operate beneath our awareness. It includes skills we’re born with, but we add to that list through our experiences. As situations become familiar, we no longer need to engage System 2 to determine how to navigate them; we could say they simply get added to System 1’s library to be used for quick judgments in the future.
It might be tempting to think of System 1 as our emotional side and System 2 as our logical side, but that wouldn’t be accurate. Our emotions aren’t confined to one system or the other. Rather, they can activate and be activated by either system, as can our biases, intentions and actions. Likewise, there’s not a specific location in the brain that houses either system. They’re more like programs that help us think. One is automatic and the other more effortful.
As the background program, System 1 is constantly offering suggestions in the form of impressions, intuitions, impulses, intentions and feelings. If System 2 endorses these, says Kahneman, “impressions and intuitions turn into beliefs, and impulses turn into voluntary actions.” This works okay most of the time, and System 2 can run with little effort. But all systems are subject to bias, so System 1 can make mistakes.
“Unfortunately . . . our snap impressions can be wrong. They can be based on unfair and inaccurate stereotypes or manipulated by con artists. And once established, they can be tough to reconsider and change.”
As Kahneman explains, System 1 “sometimes answers easier questions than the one it was asked, and it has little understanding of logic and statistics.” A further limitation, he says, is that it can’t be turned off. This makes us vulnerable to unexpected emotional arousal when we come across ideas or situations we associate with emotionally charged experiences.
In other words, we can be emotionally triggered.
To manage these triggers, we have to identify the emotions we feel and the potential biases that might support them. Do we feel anger? Fear? Regret? Do we feel an important value has been challenged? Is it a complex combination of feelings that we’re uneasy about because we can’t identify them?
Instead of classifying emotions as simply “positive” or “negative,” it can help to develop a more nuanced emotional vocabulary so we can more precisely pinpoint what we’re feeling. This skill, known as emotional granularity, helps us make better decisions about how to respond. For example, instead of simplifying a feeling with the label “angry,” we might probe a little deeper to find we’re actually frustrated by a lack of progress, or disappointed in an outcome we weren’t expecting. We might be deeply concerned about what we perceive as the implications of what we’re hearing, or defensive about our position and reluctant to move away from what feels familiar.
Until we understand the source of our reaction (is the apparent anger in fact frustration?), we won’t be able to engage thoughtfully with the underlying message that triggered it. Whether we end up agreeing or disagreeing with that message, our reasoning won’t be reliable if we’re unable to first identify the emotion it provokes and then think critically about whether it’s leading us to a justifiable conclusion. Should we be feeling compassion instead of anger, for instance? Or curiosity instead of contempt? It’s only after we possess this valuable information about what we’re feeling, and why, that the real work is possible: the often difficult task of sincerely questioning whether our response is valid.
Unknowing What We Think We Know
Since our first response isn’t always accurate, and we can’t always endorse what our first response is suggesting, we have to be willing to rethink our default positions and examine our biases. Regardless of our experience and training, we all have them; our brains are wired with them, and they serve an important purpose. But we aren’t always aware when they’ve been activated, and this can trip us up unless we make the effort to call on System 2, our more deliberate thinking. The time to be alert and cautious is when we first notice we’re having a strong emotional reaction to certain words or ideas, especially when we’re certain we’re right.
Rethinking our first response is so important to organizational psychologist Adam Grant that he wrote a book about it in 2021. “When people reflect on what it takes to be mentally fit, the first idea that comes to mind is usually intelligence,” he writes in Think Again: The Power of Knowing What You Don’t Know. “The smarter you are, the more complex the problems you can solve—and the faster you can solve them. Intelligence is traditionally viewed as the ability to think and learn. Yet in a turbulent world, there’s another set of cognitive skills that might matter more: the ability to rethink and unlearn.”
“Good judgment depends on having the skill—and the will—to open our minds. I’m pretty confident that in life, rethinking is an increasingly important habit. Of course, I might be wrong. If I am, I’ll be quick to think again.”
Through 11 chapters and multiple examples, Grant demonstrates how crucial this skill is—and how difficult it is for us to accept that we need it: “We don’t just hesitate to rethink our answers. We hesitate at the very idea of rethinking.”
To illustrate this human tendency, he points to the common belief that revising your answer on a multiple-choice test is a bad idea. This belief is so pervasive that a respected test-prep company cautions students against doing so, warning that it’s likely to result in changing from a right answer to a wrong one. As a result, says Grant, about 75 percent of students believe their first instinct is most likely the right one. Yet, “when a trio of psychologists conducted a comprehensive review of thirty-three studies,” he writes, “they found that in every one, the majority of answer revisions were from wrong to right.” This may be because students only change their answer if they’re confident the second answer is correct. But Grant notes that “recent studies point to a different explanation: it’s not so much changing your answer that improves your score as considering whether you should change it.”
Oddly enough, students remained just as reluctant to revise their answers even after learning about this first-instinct fallacy.

The Mind’s Blind Spots
Our brains are wired with numerous biases, or automatic shortcuts, that can affect our decision-making. A few key biases that can get in the way of sound thinking include
- Confirmation Bias: our tendency to seek out information that confirms our existing beliefs while dismissing evidence that contradicts them;
- In-Group Bias: the tendency to favor people we perceive as being part of our group;
- Availability Bias: giving more weight to information that easily comes to mind;
- First-Instinct Fallacy: the idea that our initial response is usually the best or most accurate;
- Status-Quo Bias: the tendency to oppose actions that might change one’s current situation;
- Dunning-Kruger Effect: the tendency to overestimate one’s knowledge or ability in a particular area.
It takes courage to question our own thinking, but we can ask ourselves what we might learn if we override our discomfort and do it anyway. Sure, we might learn something that requires us to consider a new perspective, or even forces us to face the fact that some of what we’ve always thought is wrong. On the other hand, we may actually find evidence that bolsters our existing beliefs, giving them a stronger foundation than before. If we do find we’ve based some of our opinions on incomplete or inaccurate information, we can only benefit from changing them so they genuinely align with our values. Holding on to opinions that don’t align, simply because the thought of change activates our triggers, shuts down our potential for growth.
Good decision-making requires the humility to be intellectually honest about both the factual information and the emotional information we use to reach our conclusions. It requires us to think about what we think and feel and why we think and feel it, and to be flexible enough to change when the facts call for it.
“As I’ve studied the process of rethinking, I’ve found that it often unfolds in a cycle. It starts with intellectual humility—knowing what we don’t know.”
Joe Pierre, a psychiatrist and professor at the University of California, San Francisco, cautions that “naïve realism—that is, overconfidence in our own subjective intuitions, experience, and worldviews—is a major cognitive pitfall that puts all of us at risk of holding on too tightly to false beliefs while stubbornly insisting that we’re right.” But these beliefs don’t come only from our own subjective experience. “Much of what we believe,” he says, “is based on what we hear, read, or otherwise learn from others.”
Intellectual humility requires us to acknowledge that what we see or think we know is not all there is; there may be much more beyond our perception. “Perceiving and thinking occur within a mental world filled with biases, emotions, desires, beliefs, and other attendant concerns of the moment,” write University of Virginia psychologist Dennis Proffitt and Business Insider journalist Drake Baer. “It’s not so much that you’ll believe it when you see it, but what you believe shapes what you see.” Some of these influences on our perception nudge us toward intellectual dishonesty: looking for the answer we want to be right instead of looking for the right answer.
To Proffitt and Baer, “the key to defusing these biases is to get people to think less automatically.” In other words, question System 1 and engage System 2.
The goal isn’t to eliminate emotional responses in an effort to be purely rational. As we’ve seen, emotions are an essential part of human decision-making. Instead, we want to develop the ability to recognize whether biases or fallacies of any kind are influencing our emotions or our thinking. The goal is to develop a better understanding of how both forms of reasoning work together to influence us, and to work with them rather than against them. Integrating emotional intelligence and analytical thinking leads to decisions that better serve our best interests and truest values, but “it takes confident humility to admit that we’re a work in progress,” writes Grant. “It shows that we care more about improving ourselves than proving ourselves.”
Working toward this goal doesn’t benefit only us; it benefits everyone around us too. When we have the courage to examine our own biases, and the humility and honest curiosity to learn where others are actually coming from (rather than where we assume they’re coming from), we build stronger relationships. This, in turn, allows us to navigate triggering situations with greater ease and make better decisions together, even when circumstances are emotionally charged.