Moral Courage in the Trump Era

The psychology of moral courage offers insights into who stands up for what's right and who doesn't, a relevant topic amid worries about Trump's presidency.

Here, President Donald Trump speaks with reporters before departing on Marine One from the South Lawn of the White House, Friday, July 25, 2025. Credit: AP Photo/Alex Brandon

Senator Lisa Murkowski, the Alaska Republican, has acknowledged on national television that “we are all afraid.” At the same time, signs declaring “courage is contagious” are familiar at anti-MAGA protests. Questions of courage—specifically moral courage—are more salient these days than they have been for decades. Many Americans are facing questions they never had to ask themselves: What would I do in a crisis? Would I speak up? Would I help someone in need?

Most people say yes. But they’re wrong. Research has shown that not only are few of us a Nelson Mandela or the Tiananmen Square tank man, but most of us have a hard time even mildly protesting when a dinner party guest tells an offensive joke.

“Everyone thinks, ‘I would resist—I’m a strong person, I’m moral and have values, and nothing comes between me and my values,’” said Philip Mazzocco, an associate professor of psychology at The Ohio State University at Mansfield. “Nope, we’ve run studies for a hundred years. Everything comes between you and your values.”

Most humans are social beings who want to conform, be liked, and feel they belong. Some of that is learned, some instinctual; being excluded from the pack had evolutionary consequences—often death.

But throughout history, there have been those who stand up, risking everything. Decades of research have tried to understand why some people act heroically, whether once or over long periods, as wartime resisters did. Not surprisingly, genetics, the environment we grew up in—whether there was an emphasis on, say, obedience to authority or on political involvement—and societal culture all play a role in who speaks up.

Other factors include group norms, peer pressure, and the relationship to the victim or perpetrator. The bystander effect, for example, is a well-known phenomenon: in general, the more people who witness an emergency, the less likely any one of them is to step up and intervene.

What is especially interesting is what research has shown about an individual’s inner traits and moral courage. Individuals who tend to intervene or speak up in a situation they perceive as wrong often have a strong sense of dispositional self-efficacy—that is, they believe they can cope with a wide range of challenging situations.

Conversely, moral disengagement acts as a significant barrier to intervention. Moral disengagement allows people to rationalize inaction; Albert Bandura, an influential Canadian-American psychologist who, among other things, studied aggression, categorized eight cognitive mechanisms that enable people to uncouple their moral values from their actions.

They include displacement of blame (I was just following orders), dehumanizing others, blaming victims, and comparing one’s harmful actions to even worse ones to make them seem better.

Another factor is anger: people who feel angry when witnessing a moral or ethical violation seem more likely to act, said Anna Baumert, a professor of psychology focused on justice and morality at the University of Wuppertal, Germany.

Not just angry, but angrier than others.

“Those people for whom the anger went up more steeply, they were the ones that more likely intervened,” she said.

She and her colleagues found this to be true both in a study with a sample of average people recruited as participants and in another study, conducted in German-speaking countries, of people who had received recognition for their moral courage.

“What really stood out was that the award winners were more anger-prone, so that means that they tend to react with stronger anger than people who haven’t intervened,” she added.

Anger, which can focus the mind, can swing both ways. It can drive someone to find their moral courage and defend a person being bullied, but it can also create bullies.

That’s why, Baumert said, anger—and other emotions that can swamp people during stressful situations—must be accompanied by tools to improve things, not make them worse.

These educational tools can help people understand what occurs physiologically and emotionally when faced with a situation that goes against their conscience and how quickly almost all of us can rationalize not acting.

Such training teaches people how to intervene appropriately and lets them practice what they have learned through role-playing.

A fair amount of research on nurses, who frequently face ethical dilemmas, demonstrates that such education can heighten moral sensitivity, improve the quality of care, and reduce errors.

The trouble is, since most people believe they will demonstrate moral courage when tested, they don’t see any need to prepare for such events.

Mazzocco, the Ohio State professor, recently co-authored a study based on the famous 1960s Milgram experiments at Yale University. Those experiments involved volunteers (called teachers) who were told they were assisting in an experiment about punishment and learning, when it was actually about authority and obedience.

An experimenter told the teacher to shock a subject (an actor strapped into something like an electric chair) every time the subject gave a wrong answer, raising the voltage with each incorrect answer. The shocks ranged from level 1 (a mild 15 volts) to level 30 (a highly dangerous 450 volts).

The shocks were fake (unknown to the teacher), but the actors screamed as the power increased. Every subject went up to at least 300 volts, and 65 percent administered the full 450 volts, albeit with anguish. The findings have been widely replicated.

For ethical reasons, these types of studies were stopped in 1973, but experiments that don’t expose participants to the same level of anxiety have shown the same results.

Mazzocco’s study asked some 400 adults—half who were told of the Milgram findings and half who weren’t—how they predicted they would behave in this scenario. Whether or not they knew of the original experiment, most of those in Mazzocco’s study believed they would quit the Milgram experiment at around shock level seven.

They also believed the average person would go above level seven, to around level 12.

“It’s somewhat sobering and demoralizing,” Mazzocco said. “I’ve been teaching the Milgram study for approximately 25 years now to thousands and thousands of students for a reason—because I thought I was impacting them and influencing them. But when it comes to self-perceptions, we have these very powerful drives to feel like we have integrity, to feel like we’re moral.”

Why does this matter? Because if we believe we have the courage, we are hardly inclined to understand why we probably don’t—and what we can do about it.

“You may find yourself on the streets, and a van with an unmarked vehicle with individuals with no badges or warrant shows up, and they grab someone. And now the question is put to you, am I going to be a nonresponsive bystander?” Mazzocco said. “If I try to intervene, and someone shouts at me, what am I going to do? People underestimate the power of these social forces in these moments.”


Alina Tugend is an award-winning journalist and the author of the book Better by Mistake: The Unexpected Benefits of Being Wrong.