Gerd Gigerenzer, director of the Harding Center for Risk Literacy at the Max Planck Institute for Human Development in Berlin, has published "Risk Savvy: How to Make Good Decisions" (Penguin, 2014). The book addresses the difficulty of making decisions in uncertain environments and the need to communicate risks in an understandable way. According to the author's own experience, 80% of doctors do not understand the meaning of a positive result in a diagnostic test. Along the same lines, in an Australian study of 50 doctors, only 13 claimed to understand the concept of "positive predictive value" (the probability of having a disease when a specific test is positive), and only one of them was ultimately able to explain it properly. In a post on this blog, "Too much mammography or the mirage of screenings", Cristina Roure noted that a woman's risk of breast cancer after a suspicious lesion on a screening mammogram is 10%, whereas most doctors believed it to be 90%. After this introduction, it is easy to see why we ought to consider Gigerenzer's book essential reading for medical practice in a world of probabilities.
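To see how a positive mammogram can mean only a 10% risk, it helps to reason in natural frequencies, as Gigerenzer recommends. The following sketch uses hypothetical but plausible screening numbers (roughly 1% prevalence, 90% sensitivity, 9% false-alarm rate, all assumed for illustration) that happen to reproduce a figure close to the 10% mentioned above:

```python
# Illustrative natural-frequency calculation of positive predictive value (PPV).
# All three rates below are assumptions chosen for the example, not official data.
prevalence = 0.01            # 1% of screened women actually have breast cancer
sensitivity = 0.90           # the test detects 90% of the true cases
false_positive_rate = 0.09   # 9% of healthy women get a false alarm

population = 1000                                             # imagine 1,000 women
sick = population * prevalence                                # 10 women with cancer
true_positives = sick * sensitivity                           # 9 correctly flagged
false_positives = (population - sick) * false_positive_rate   # ~89 false alarms

# Of all women with a positive result, what fraction is actually sick?
ppv = true_positives / (true_positives + false_positives)
print(f"PPV: {ppv:.0%}")  # about 9%, nowhere near the 90% many doctors guess
```

Framed this way, the answer is almost self-evident: of roughly 98 women with a positive mammogram, only 9 have cancer.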
The probability of rain
Speaking of probability, I have chosen a very illustrative example of professional deficiencies in risk communication. Gigerenzer, as you can see him explain in the video, says that if the weather forecaster announces a 30% probability of rain tomorrow, some people think it will rain 30% of the time, that is, 7 hours of rain and 17 hours of good weather; others believe it will rain over 30% of the territory; and still others believe that out of every 10 meteorologists, 3 think it will rain and 7 think it will not. All three interpretations are wrong: what the experts actually mean is that out of every 10 days with atmospheric conditions like those forecast for tomorrow, it will rain on 3 of them and not on the other 7. As you can see, predictive accuracy is of no use if one does not know how to communicate its meaning simply.
With this example, the author wants to show the inability of mathematical models to grasp the future. Gigerenzer cites the writer Nassim Taleb, who asks us to put ourselves into the mindset of a newly hatched turkey: "On the first day you see a man approaching and you fear he will kill you, but he is friendly and actually feeds you. On the second day the man comes again, and you would like to know, from your experience, the probability that he will feed you once more. If you knew Laplace's formula, you would conclude that the probability is 2/3, then 3/4 the next day, and so on, because with each passing day your experience tells you that you can rely more and more on the man coming with the food. Then the hundredth day since you came into the world finally arrives, and you are all but certain that your protector will bring you food as he did on each of the previous ninety-nine days. What a shame that you have not paid enough attention to realize what day one hundred is: Thanksgiving, when the man, against all your expectations, cuts your throat."
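The "Laplace's formula" the turkey relies on is the rule of succession: after n consecutive feedings, the estimated probability of being fed again is (n + 1) / (n + 2). A minimal sketch (the function name is mine, chosen for illustration) reproduces the numbers in the story:

```python
from fractions import Fraction

def laplace_rule(feedings: int) -> Fraction:
    """Laplace's rule of succession: after n consecutive successes
    (and no failures), the probability of another success is (n+1)/(n+2)."""
    return Fraction(feedings + 1, feedings + 2)

print(laplace_rule(1))   # 2/3  after the first feeding
print(laplace_rule(2))   # 3/4  after the second
print(laplace_rule(99))  # 100/101 on the morning of day one hundred
```

The estimate keeps climbing toward certainty right up to Thanksgiving, which is exactly Taleb's point: the model extrapolates the past but says nothing about an event it has never seen.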
Medical schools teach statistical certainty, but not the understanding of uncertainty, let alone how to communicate the real risks of clinical activities. And so the turkey's story repeats itself: we think things are as they have always been, or as someone makes them appear. Trust me and go read Gigerenzer, for the sake of the people who trust you.