Saturday, May 10, 2014

Fragility, Not Probability

In my last post I mentioned that I am reading the book Antifragile: Things That Gain From Disorder by Nassim Taleb. I was reading the book today and came across this section that stood out to me:

Fragility, Not Probability

We check people for weapons before they board the plane. Do we believe they are terrorists: True or False? False, as they are not likely to be terrorists (a tiny probability). But we check them nonetheless because we are fragile to terrorism. There is an asymmetry. We are interested in the payoff, and the consequence, or payoff, of the True (that they turn out to be terrorists) is too large and the costs of checking are too low. Do you think the nuclear reactor is likely to explode in the next year? False. Yet you want to behave as if it were True and spend millions on additional safety, because we are fragile to nuclear events. A third example: Do you think that this random medicine will harm you? False. Do you ingest these pills? No, no, no.

If you sat with a pencil and jotted down all the decisions you've taken in the past week, or, if you could, over your lifetime, you would realize that almost all of them have had asymmetric payoff, with one side carrying a larger consequence than the other. You decide principally based on fragility, not probability. Or to rephrase, You decide principally on fragility, not so much on True/False.

Let us discuss the idea of the insufficiency of True/False in decision making in the real world, particularly when probabilities are involved. True or False interpretations correspond to high or low probabilities. Scientists have something called "confidence level"; a result obtained with a 95 percent confidence level means that there is no more than a 5 percent probability of the result being wrong. The idea is of course inapplicable as it ignores the size of the effects, which of course makes things worse with extreme events. If I tell you that some result is true with 95 percent confidence level, you would be quite satisfied. But what if I told you that the plane was safe with 95 percent confidence level? Even 99 percent confidence level would not do, as a 1 percent probability of a crash would be quite alarming. So to repeat, the probability (hence True/False) does not work in the real world; it is the payoff that matters.

You may have taken probably a billion decisions in your life. How many times have you compared probabilities? Of course, you may do so in casinos, but not elsewhere.
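Taleb's point here is essentially arithmetic: the quantity that drives a decision is not the probability of the bad outcome by itself, but the probability multiplied by its consequence. A tiny sketch makes the asymmetry concrete (all the numbers below are made-up for illustration, not from the book):

```python
def expected_loss(p_failure, cost_of_failure):
    """Expected cost of a decision = probability of the bad outcome
    times the size of its consequence (payoff)."""
    return p_failure * cost_of_failure

# A lab result being wrong at a 5% error rate costs little -- you redo the experiment.
lab = expected_loss(0.05, 1_000)

# A plane crash at that same 5% rate is catastrophic.
flight = expected_loss(0.05, 1_000_000_000)

print(f"lab result: expected loss = {lab:,.0f}")
print(f"flight:     expected loss = {flight:,.0f}")
# Same probability, wildly different decisions: we shrug off the lab result
# but would never board that plane. Fragility, not probability.
```

Both cases have the identical 5 percent probability of being "wrong," yet the decisions come out opposite, which is exactly why a bare confidence level tells you so little.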

Antifragile: Things That Gain From Disorder by Nassim Taleb has been quite insightful so far.

-Joe
