Good decision making is arguably the most critical skill for the modern knowledge worker or business leader. (Less so perhaps for classic assembly line factory work or moving packages around a warehouse.)

So it’s perhaps not surprising that there’s a wave of new books that cover how to make better decisions.

I wrote in January about Think Again by Adam Grant, which discusses how to rethink your own views and details the sometimes dramatic cost—the Space Shuttle tragedies, for example—of making decisions based on false beliefs and assumptions.

Now Julia Galef has released The Scout Mindset, a book about intellectual honesty and how it contributes to better decisions (featuring a promotional blurb from Grant atop its cover). Dubbed “the tech elite’s favorite pop intellectual” by New York Magazine, Galef hosts the “Rationally Speaking” podcast and gave a widely viewed 2016 TED talk laying out the thesis for what would become her book.

(You can read all of our book briefings here. Sign up to receive these briefings by email for free here.)

She argues that we should strive to approach the world as a military scout might, with “the motivation to see things as they are, not as you wish they were.” (p. ix) The scout strives to create an accurate map, aiming to remove any false preconceptions or biases.

Galef contrasts this with the mindset of a soldier, who seeks out evidence that supports their existing beliefs and views being wrong as a defeat—a mindset rooted in a desire to preserve self-esteem, avoid unpleasant emotions, motivate ourselves, and fit in socially. “You can see it in the way we rationalize away red flags in an exciting new relationship, and always think we’re doing more than our fair share of the work,” she writes. “When a coworker screws up, it’s because they’re incompetent, but when we screw up, it’s because we were under a lot of pressure.” (p. 6)

We’re all some combination of the scout and the soldier, but can shift toward having a scout mindset more often.

Galef says the scout mindset “keeps you from fooling yourself on tough questions that people tend to rationalize about.” Some workplace examples include: “Do I really have to fire that employee? How much do I need to prepare for that presentation tomorrow? Is it best for my company to raise a lot of new funding now or am I just tempted by the instant validation that raising funds would give me?” (p. 13)

Galef suggests some tools for making better decisions:

  • When you make a decision, ask yourself what kind of bias could be affecting your judgment. Some simple thought experiments help, such as the status quo bias test, where you ask whether you would actively choose your current situation if it weren’t already the status quo. The outsider test involves evaluating the situation as if it weren’t your own. Galef recounts how the late Intel chief executive Andy Grove would ask himself what decision a new CEO arriving from the outside would reach, as a way to strip away some of the emotion swirling around tough choices. (p. 70)
  • Ask yourself how certain you really are about a decision. One test is how much money you would bet that an objective third party would come to the same conclusion. A more elaborate thought experiment Galef proposes (p. 86) allows you to identify what percent confidence you have in a decision.
  • Find a person or media organization that holds different views from you, but who you respect enough that they could potentially change your mind. Research shows that exposing yourself to opposite viewpoints can just entrench the ones you started with. But if you have enough in common with someone you disagree with, one of you might eventually change your mind.
  • Hold your identity lightly. Identifying yourself strongly as a member of a movement, group, or ideology can get in the way of good decision making.
  • Be self-assured but not unrealistically confident. Research shows that people respond to social confidence in leaders; absolute certainty and overpromising aren’t necessary to win them over.
  • Lean into confusion. “Resist the urge to dismiss details that don’t fit your theories, and instead, allow yourself to be confused and intrigued by them,” Galef writes. (p. 152)
  • Look for opportunities to update your views. When the US intelligence community analyzed what allowed some people to more accurately forecast global events, they found that such “superforecasters” changed their minds a lot. They adapted to new evidence as they encountered it, and each update to their view was relatively low stakes.
  • When you make a wrong decision, analyze the reason why and learn from that. What was the blind spot or bias that led you astray? Superforecasters—who analyzed their errors—improved their average accuracy by 25% each year, while other forecasters didn’t improve at all.
  • When you realize you were wrong and someone you disagreed with was right, reach out to them and let them know. Abraham Lincoln doubted General Ulysses S. Grant’s planned tactic for seizing Vicksburg during the Civil War. When Grant succeeded, Lincoln wrote him to say, “I thought it was a mistake. I now wish to make the personal acknowledgment that you were right and I was wrong.” (p. 51)

To be sure…

  • The scout metaphor has its limitations. Fortunately, you don’t have to buy into it to find Galef’s views helpful and thought-provoking.
  • Sections of the book are devoted to debunking the idea that self-delusion can be helpful for entrepreneurs and activists, whose force of conviction and clarity of message could be diminished by openness to contrary facts and ideas. For some readers, that debate might not feel relevant.

Memorable anecdotes and facts:

  • A sociologist who embedded in a Swedish company during the 1970s concluded that when the workers would hold meetings to decide on a project, they spent very little time comparing options. “Instead they quickly anchored on one option and spent most of the meeting raising points in favor of it,” Galef writes. (p. 21) The sociologist concluded that this approach was meant to build enthusiasm, at the cost of a balanced view.
  • A large majority of people surveyed acknowledged withholding important information from their doctor—such as whether they were regularly taking their medication—out of fear of being judged. “Most people want their doctor to think highly of them,” a researcher concluded.
  • Charles Darwin suffered from personal and professional anxiety, but took comfort in repeating to himself “I have worked as hard and well as I could, and no man can do more than this.” (p. 95)
  • Elon Musk told friends early on that he believed there was only a 10% chance that a SpaceX craft would ever make it into orbit. But he felt it was worth doing anyway. “If something’s important enough you should try,” he later told an interviewer about the similarly low odds he had assigned to Tesla’s success. “Even if the probable outcome is failure.” (p. 112)
  • Ben Franklin believed that people were more likely to reject his arguments when he expressed them firmly, using words like “certainly” and “undoubtedly.” So he deliberately prefaced statements with expressions including “If I’m not mistaken….” (p. 123)

Choice quotes:

  • “The first principle is that you must not fool yourself—and you are the easiest person to fool.” —Richard Feynman (p. ix)
  • “A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money. But they won’t protect you from bias on ideologically charged questions.” (p. 48)
  • “Accepting the possibility of failure in advance is liberating. It makes you bold, not timid. It’s what gives you the courage to take the risks required to achieve something big.” (p. 118)
  • “If you at least start to think in terms of ‘updating’ rather than ‘admitting you were wrong,’ you may find that it takes a lot of the friction out of the process. An update is routine. Low-key. It’s the opposite of an overwrought confession of sin.” (p. 147)
  • “It took me years of writing on the internet to learn what is nearly an iron law of commentary. The better your message makes you feel about yourself, the less likely it is that you are convincing anyone else.” —Megan McArdle (p. 206)

The bottom line is that The Scout Mindset is useful for forcing you to think about the ways that ego and self-deception creep into our thinking. Recognizing that is a helpful start to improving decision making.

You can order The Scout Mindset at Amazon. (We may make a commission when you buy a book.) All page numbers referenced above are for the hardcover edition.

The handbook for this new era of business doesn’t exist. We’re all drafting our own as we go along—and now we’d like to start doing so together. You can sign up here to receive this briefing by email. You can read all of our book briefings here.