(For this blog I am drawing on Michael Lewis's The Undoing Project, his book on Daniel Kahneman, pp. 181-211.)

Last week, I dealt with cognitive biases and hence the need to follow a methodical approach in decision-making. This week, I want to discuss another aspect of Daniel Kahneman and Amos Tversky's research: the fact that people generally make the same types of mistakes when assessing the odds of a particular outcome, and that they make them across very different situations.

For example, when asked to guess what a little boy would do for a living when he grew up, if he matched a mental picture of a scientist, most people guessed he'd be a scientist – and neglected the prior odds of any kid becoming a scientist. Or when guessing a person's current career based on a personality sketch, people would leap from a similarity judgment ("that guy sounds like a computer scientist!") to a confident prediction ("that guy must be a computer scientist!") and ignore both the base rate (only 7 percent of all graduate students were computer scientists) and the dubious reliability of the character sketch.
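
To make the arithmetic concrete, here is a minimal sketch of Bayes' rule using the study's 7 percent base rate; the accuracy figures for the character sketch are my own hypothetical assumptions:

```python
# A sketch with hypothetical accuracy numbers: even a fairly diagnostic
# personality sketch cannot overcome a low base rate.
base_rate = 0.07     # from the study: 7% of graduate students were computer scientists
hit_rate = 0.80      # assumed: the sketch "matches" 80% of real computer scientists
false_alarm = 0.30   # assumed: it also "matches" 30% of everyone else

# Bayes' rule: P(computer scientist | sketch matches)
p_match = hit_rate * base_rate + false_alarm * (1 - base_rate)
p_cs = (hit_rate * base_rate) / p_match
print(f"P(computer scientist | match) = {p_cs:.2f}")  # ~0.17
```

Even under these generous assumptions about the sketch's accuracy, the low base rate keeps the true probability far below the near-certainty people reported.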

It may come as a surprise to mekarvim, but in our field we are always dealing with the rare situation. What we are doing is not well studied. We have no statistically valid samples that are large enough and controlled for different variables. We don't compare our information against control groups. We make assessments based on the students who come to our programs, as opposed to those who don't. We profile the interests of the average student based on those we meet, rather than assembling focus groups that are more representative of the broader population.

We are not alone in our errors. Kahneman and Tversky discovered, for instance, that even students of statistics do not naturally internalize the importance of sample size: they were as likely to draw big conclusions from a small sample as from a large one.
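
A quick simulation illustrates the point (a minimal sketch; the sample sizes and the 70/30 cutoff are arbitrary choices of mine):

```python
import random

# Sketch: how often does a fair coin look "unfair" (a 70/30 split or worse)?
def extreme_share(sample_size, trials=10_000):
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= 0.7 or share <= 0.3:
            extreme += 1
    return extreme / trials

print(extreme_share(10))   # ~0.34: lopsided splits are common in small samples
print(extreme_share(100))  # ~0.0001: and vanishingly rare in large ones
```

The same fair coin produces a lopsided result in roughly a third of small samples, so any "big conclusion" drawn from one is likely noise.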

Even experts are likely to make decisions that ignore simple algorithms built from their own expert knowledge. Although expert doctors agreed on the criteria for judging from an x-ray whether an ulcer was cancerous, they disagreed with one another in practice. More surprisingly, when presented with duplicates of the same ulcer (without knowing it), doctors regularly contradicted their own earlier diagnoses. A simple algorithm outperformed not merely the group of doctors; it outperformed even the single best doctor.
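
To give a sense of what "simple algorithm" means here: the models in studies like this were typically just fixed weighted sums of the very criteria the experts themselves had agreed on. The weights, ratings, and threshold below are entirely hypothetical:

```python
# Hypothetical sketch of such an algorithm: a fixed weighted sum of the
# agreed diagnostic signs. Unlike a tired or distracted expert, the rule
# gives the same answer to the same x-ray every time.
def malignancy_score(ratings, weights):
    return sum(w * r for w, r in zip(weights, ratings))

weights = [0.4, 0.3, 0.2, 0.1]   # assumed relative importance of four agreed signs
reading = [0.8, 0.2, 0.5, 0.9]   # one doctor's 0-to-1 ratings of those signs

score = malignancy_score(reading, weights)
print("refer for biopsy" if score > 0.5 else "monitor")  # fixed, consistent rule
```

The algorithm's advantage is not cleverness but consistency: it applies the experts' own criteria without fatigue, mood, or distraction.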

In this and other cases, it was discovered that we tend to reach conclusions by using rules of thumb rather than by assessing actual probabilities. Very often these are rare situations that we are not well informed about, so we resort to stereotypes or mental pictures that predispose us to a certain decision. We also might draw on an incident that we can recall with special ease, which then carries disproportionate weight in our judgment. This leads us not merely to make random mistakes, but to be systematically wrong.

We have a kind of stereotype of “randomness” that differs from true randomness. We are blinded by assumptions that don’t allow us to consider the likelihood of the true scenario. For example, Londoners in the Second World War thought that German bombs were targeted, because some parts of the city were hit repeatedly while others were not hit at all. In fact, the distribution was exactly what you would expect from random bombardment. 
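
If it seems implausible that pure chance produces such clusters, a small simulation shows it directly (the grid and bomb counts below are assumptions chosen for illustration):

```python
import random
from collections import Counter

# Sketch: drop "bombs" uniformly at random over a grid of city cells,
# then count how uneven the hits look.
random.seed(1)
cells, bombs = 576, 537
hits = Counter(random.randrange(cells) for _ in range(bombs))

never_hit = cells - len(hits)
hit_often = sum(1 for n in hits.values() if n >= 3)
print(f"{never_hit} cells never hit, {hit_often} cells hit three or more times")
```

Run it a few times: some cells are struck repeatedly while large stretches are untouched, purely by chance. Clusters are what randomness actually looks like.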

Similarly, people find it a remarkable coincidence when two students in the same classroom share a birthday, when in fact there is a better than even chance that, in any group of twenty-three people, two of its members will have been born on the same day.
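
This is easy to verify with the standard calculation (ignoring leap years, as is conventional):

```python
# Compute the chance that n people all have *different* birthdays,
# then subtract from one to get the chance of at least one shared birthday.
def shared_birthday_prob(n):
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

print(f"{shared_birthday_prob(23):.3f}")  # ~0.507: better than even at 23 people
```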

There are many heuristics (mental tools) that lead us astray, as I mentioned in my last article. The more complicated the real-life problem, the more we trigger our memories to make up a narrative that effectively replaces probability judgments. Should we cancel a Shabbaton if there is not enough of a sign-up by Sunday? Or do we say, "We have always run this Shabbaton; people will surely sign up later in the week"? What about on Tuesday? The idea of failure may be so unlikely in our minds that we fail to imagine the events, or the changes in our population, that could cause it to occur. The defect, then, is in our imagination. It causes us to make changes later rather than earlier, when persistent lack of success leaves us with no other choice. But that's no way to run an organization.

______________________

Rabbi Avraham Edelstein is the Education Director of Neve Yerushalayim College for Women and a senior advisor to Olami. Many of Rabbi Edelstein’s foundational publications addressing the world of Kiruv appear on OlamiResources.com: Series on Kiruv and Chinuch, Commentary on Chumash and Yom Tovim, The Laws of Outreach, as well as contributing articles.  
