The Psychology of Decision Making: Behind the Scenes

While training as cardiology fellows, we learn how to interpret data to best care for our patients. We process a range of objective data, from EKGs and hemodynamic measurements to imaging studies. We also study the impressive body of high-quality evidence supporting various treatments.

When we translate this information into decisions, we don't always recognize the process occurring behind the scenes, both in our own minds and for our patients. There is an entire field of psychology investigating how we make decisions under uncertainty, and it has important implications for how we practice. Here, we will review a couple of interesting takeaways from the psychology of judgment and decision making, and how they might apply to our work in cardiology.

Psychologists, most prominently Daniel Kahneman in his book "Thinking, Fast and Slow," have described two conceptual modes of thinking that explain our decisions, termed System 1 and System 2. These two systems are best explained through examples:

  • System 1. You see a photo of an angry man. You don't need to stop to measure the angles of his eyebrows. In a fast, unconscious, and automatic fashion, you know that the man is angry.
  • System 2. Someone asks you to multiply 81 x 19. You can work it out, but you need to dedicate time and effort. You build on logical principles and slowly calculate the answer.
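
To make the second example concrete: one way a reader might work it out deliberately is 81 × 20 − 81 = 1,620 − 81 = 1,539 (this decomposition is only one of several possible strategies). The particular steps matter less than the fact that reaching the answer takes slow, effortful calculation rather than an immediate impression.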

In this conception of decision making, these two systems are both necessary and work together to deal with the thousands of small and large decisions we make each day. We are typically operating automatically in System 1. When a challenging problem arises, our System 2 kicks in to solve it.

In cardiology, System 1 is crucial for the fast processing that lets us make predictions and decisions quickly. We need to be able to recognize an anterior STEMI on EKG and act quickly without stopping to think through the biological mechanisms of ST elevation. We need to be able to see damping of the pressure waveform in the cath lab and react quickly based on prior experience. Developing these "reflexes" is an important part of fellowship; in fact, situations that initially require effortful deliberation become more automatic as we become more proficient.

While the rapid processing of System 1 thinking is necessary and helpful for handling the volume of decisions we need to make, it relies on shortcuts and can sometimes lead us astray in predictable ways, called cognitive biases. One example is representativeness bias: we are prone to judging the likelihood of something by how closely it resembles, or 'represents,' our mental model of it. For example, we may underestimate the likelihood of acute coronary syndrome (ACS) in a middle-aged woman with epigastric discomfort, because she doesn't fit the textbook mental model of an elderly man with squeezing chest discomfort.

Another cognitive bias is termed availability bias: a recent or emotionally salient experience that is readily accessible plays an outsized role in a decision, because that information is the most 'available.' A recent experience of missing an aortic dissection that subsequently led to an unfavorable outcome would weigh heavily in a physician's mind. In the days and weeks that follow, aortic dissection may be considered more often than the clinical picture warrants, potentially leading to unnecessary CT scans or to delays in diagnosing other conditions such as ACS.

An understanding of these and many other biases attunes us to circumstances in which our automatic processing may misguide us, and in which it is helpful to activate System 2 and slow down to think through the problem.

The psychology of decision making can also help us understand patient preferences. Kahneman and his colleague Amos Tversky observed that the models traditionally used by economists to predict choice, in which people were assumed to make rational choices that maximized their expected utility, were poor at explaining actual human behavior. They developed "Prospect Theory," which empirically describes how people actually make decisions under risk. One important observation was that the way a problem is framed has a large effect: describing a 90% chance of survival versus a 10% risk of death (mathematically the same!) leads to different decisions. In one study, both patients and physicians viewed lung cancer surgery more favorably when the problem was framed in terms of the probability of living rather than that of dying.
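
For readers curious about the machinery behind this framing effect, the sketch below plugs illustrative numbers into the value and probability-weighting functions from Tversky and Kahneman's 1992 formulation of prospect theory. The parameter values are their published estimates, but the outcome magnitude of 100 and the simple gain/loss coding of the "survival" versus "mortality" frames are assumptions made purely for illustration; this is a sketch of the idea, not a clinical decision model.

    # A minimal sketch of prospect theory's value and probability-weighting
    # functions, using Tversky and Kahneman's 1992 parameter estimates
    # (alpha = 0.88, lambda = 2.25, gamma = 0.61). The framing example below
    # is purely illustrative, not a clinical model.

    def value(x, alpha=0.88, lam=2.25):
        # Gains are valued concavely; losses are scaled up by lambda,
        # so a loss "looms larger" than an equal-sized gain.
        return x ** alpha if x >= 0 else -lam * (-x) ** alpha

    def weight(p, gamma=0.61):
        # Probability weighting: small probabilities are overweighted,
        # moderate-to-large ones underweighted. (For simplicity, the
        # gain-side parameter is used for both frames here.)
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    # The same operation, two frames (an outcome size of 100 is arbitrary):
    #   survival frame:  "90% chance of living" -> a gain of +100 with p = 0.9
    #   mortality frame: "10% chance of dying"  -> a loss of -100 with p = 0.1
    survival_frame = weight(0.9) * value(+100)
    mortality_frame = weight(0.1) * value(-100)

    print(f"survival-framed subjective value:  {survival_frame:+.1f}")
    print(f"mortality-framed subjective value: {mortality_frame:+.1f}")
    # The mortality frame comes out clearly negative while the survival frame
    # is positive, even though 90% survival and 10% mortality are the same
    # statistic.

The specific numbers are not the point; the asymmetry is. Because losses are weighted more heavily than equivalent gains, the same statistic evaluated in a mortality frame feels worse than in a survival frame, which is consistent with the lung cancer surgery study described above.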

By understanding these patterns in our mental processes, we can improve our medical decision making and our ability to support patients in making decisions. The psychology of judgment and decision making provides helpful language and insight into these processes, and discussion of these concepts can be incorporated into training programs.

Paul Marano, MD
This article was authored by Paul Marano, MD, a fellow at Cedars-Sinai Medical Center. Twitter: @pjmarano1.

This content was developed independently from the content developed for ACC.org. This content was not reviewed by the American College of Cardiology (ACC) for medical accuracy and the content is provided on an "as is" basis. Inclusion on ACC.org does not constitute a guarantee or endorsement by the ACC and ACC makes no warranty that the content is accurate, complete or error-free. The content is not a substitute for personalized medical advice and is not intended to be used as the sole basis for making individualized medical or health-related decisions. Statements or opinions expressed in this content reflect the views of the authors and do not reflect the official policy of ACC.