Monday 22 January 2018

The paternalism of presenting the glass half full or half empty

Pedro Rey



In previous posts, both Cristina Roure and Jordi Varela have discussed how cognitive biases, and in particular our difficulty in grasping what probabilistic calculations really mean, can affect important decisions about our health. Today I want to show you an example, originally devised by the psychologists Daniel Kahneman and Amos Tversky, of how much the way health information is presented matters when a clinical decision has to be made under uncertainty, that is, when the available options do not guarantee certain outcomes. To get the most out of the example, I suggest that after reading the next paragraph you stop for a moment to think and decide before moving on to the paragraph that follows.

“Imagine that you are a health manager who must decide between two possible measures in the face of the outbreak of an epidemic that is expected to kill 600 people. The information you have about the consequences of the two measures is as follows: if you take measure A, you know with certainty that 200 people will be saved. If you take measure B, there is a 1/3 chance that all 600 people will be saved (and therefore a 2/3 chance that no one will be saved). Which of the two measures would you choose?” Please take a moment to think about it and write your answer on a piece of paper before you continue reading.

Now imagine that, faced with the same epidemic, you must choose between these two other measures. If you choose measure C, 400 people will die. If you choose measure D, there is a 1/3 chance that no one will die (and a 2/3 chance that 600 people will die). Which of the two measures would you choose now?

Having read the two paragraphs, you have probably already noticed the contradiction: measures A and C are identical in their expected consequences, as are measures B and D. Yet it is likely that, like 72% of the subjects in the many experiments asked to decide between A and B, you chose A, while, like 78% of the subjects asked to choose between C and D, you chose D in the second question. How can this preference reversal occur?
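For readers who want to verify the equivalence with a quick calculation, here is a minimal sketch in Python (the names are mine, chosen purely for illustration) that computes the expected number of lives saved under each of the four measures:

```python
# Expected lives saved (out of 600) under each measure from the example.
# A and C, and B and D, are the same gamble framed differently.

TOTAL = 600

def expected_saved(outcomes):
    """Expected lives saved, given a list of (probability, lives_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

measures = {
    "A": [(1.0, 200)],                # 200 saved with certainty
    "B": [(1/3, 600), (2/3, 0)],      # 1/3 chance everyone is saved, else no one
    "C": [(1.0, TOTAL - 400)],        # 400 die with certainty -> 200 saved
    "D": [(1/3, 600), (2/3, 0)],      # 1/3 chance no one dies -> 600 saved
}

for name, outcomes in measures.items():
    print(f"Measure {name}: expected lives saved = {expected_saved(outcomes):.0f}")
```

All four measures have an expected outcome of 200 lives saved out of 600; the only thing that changes between the two questions is whether the same gamble is described in terms of lives saved or lives lost.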

Kahneman and Tversky, drawing on evidence from simple experiments like the one I have just shown you, developed the so-called "prospect theory", which offers an explanation. Summarizing briefly, the theory says that human beings suffer more from losses than they enjoy equivalent gains, which leads us to behave as risk-averse when outcomes are framed as gains and as risk-seekers when outcomes are framed as losses. When we must choose between A and B, many of us value the certainty of saving 200 lives with measure A more than taking the risk inherent in measure B, which with probability 1/3 would save everyone. However, when it comes to accepting deaths, that is, when choosing between C and D, we feel better taking measure D (equivalent to B), which with probability 1/3 may result in no deaths at all, than taking measure C (equivalent to A), which guarantees that we will have to take responsibility for the death of 400 people.
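To make the mechanism concrete, here is a small sketch of the prospect-theory value function. It uses the parameters Tversky and Kahneman estimated in their 1992 paper (curvature 0.88 and a loss-aversion coefficient of 2.25), and for simplicity it omits their probability-weighting function; this is an illustration under those assumptions, not the authors' own calculation:

```python
# Sketch of the prospect-theory value function (Tversky & Kahneman, 1992
# parameters; probability weighting omitted for simplicity).

ALPHA, LAMBDA = 0.88, 2.25

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0) from a reference point."""
    return x**ALPHA if x >= 0 else -LAMBDA * (-x)**ALPHA

def prospect_value(outcomes):
    """Value of a gamble: probability-weighted subjective values of its outcomes."""
    return sum(p * value(x) for p, x in outcomes)

# Gain frame (reference point: everyone dies): A versus B.
print(prospect_value([(1.0, 200)]))             # A: certain gain of 200 lives (~105.9)
print(prospect_value([(1/3, 600), (2/3, 0)]))   # B: risky gain (~92.8)

# Loss frame (reference point: no one dies): C versus D.
print(prospect_value([(1.0, -400)]))            # C: certain loss of 400 lives (~-438.3)
print(prospect_value([(1/3, 0), (2/3, -600)]))  # D: risky loss (~-417.5)
```

Under the gain frame the certain option A scores higher than the gamble B, while under the loss frame the gamble D scores higher than the certain option C, reproducing the reversal described above: risk aversion over gains, risk seeking over losses.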

The problem with this example is not so much that it shows human beings to be contradictory, which barely surprises us anymore, but that it opens the door to our being manipulated when making decisions simply through how the data is presented to us. This capacity for manipulation matters especially in clinical practice. In a setting that tries to promote shared decision-making between doctor and patient about which treatment to follow, for example, the doctor can still exercise full control over the patient by presenting the chances of recovery or the possible side effects in a positive or a negative frame. It is therefore important to recognize that if we really want to favor freedom of choice in situations that by definition involve risks, and thus probabilities, we must do one of two things: move towards better education of those who receive the information, so that they can interpret it correctly while being aware of their own cognitive biases; or make an enormous exercise of honesty, exposing this type of mind trick and dedicating enough time to helping others understand, objectively and without the bias of our own self-interest, the expected consequences of their decisions, in some cases literally of life or death. Give me freedom of choice or paternalism based on the specialists’ expertise, but don’t disguise one as the other.
