# Covid, Forecasting, and the Problem with the Precautionary Principle

I came across this post: “My Final Case Against Superforecasting”.

Although the post is a year old and a lot has transpired regarding Covid since, it’s still interesting and relevant. The author argues that superforecasters overlook the ‘keys’ that are in the dark, only focusing on easy predictions (the keys under the streetlight):

All of this would appear to heavily incline superforecasting towards the streetlight effect, where the old drunk looks for his keys under the streetlight, not because that’s where he lost them, but because that’s where the light is the best. Now to be fair, it’s not a perfect analogy. With respect to superforecasting there are actually lots of useful keys under the streetlight, and the superforecasters are very good at finding them. But based on everything I have already said, it would appear that all of the really important keys are out there in the dark, and as long as superforecasters are finding keys under the streetlight what inducement do they have to venture out into the shadows looking for keys? No one is arguing that the superforecasters aren’t good, but this is one of those cases where the good is the enemy of the best. Or more precisely it makes the uncommon the enemy of the rare.

Second, the author argues that Brier scores and other metrics of accuracy ignore ‘expected value’ and the consequences of large but infrequent losses. For example, a poker player may win 99% of the time, but the 1% of the time he or she loses negates all of the earlier winnings, and then some. Just knowing the odds of something happening tells you nothing about the downside of being wrong. Probabilities are quantitative, but outcomes are qualitative.
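The poker example can be made concrete with a quick expected-value calculation. The dollar amounts below are illustrative assumptions (the post gives no figures), chosen only to show how a 99% win rate can still be a losing proposition:

```python
# A hypothetical poker player: wins a small pot 99% of the time,
# but the rare 1% loss is large enough to wipe out the gains.
# Illustrative numbers, not from the post: win +$1, lose -$150.

p_win, p_lose = 0.99, 0.01
win_amount, loss_amount = 1.0, -150.0

# Expected value per hand = sum of (probability * payoff)
expected_value = p_win * win_amount + p_lose * loss_amount
print(f"Win rate: {p_win:.0%}, expected value per hand: ${expected_value:+.2f}")
```

Being “right” 99% of the time, the player still loses about 51 cents per hand on average, which is the author's point: accuracy metrics alone say nothing about the size of the downside.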

Covid was a hidden key. So was the 2008 financial crisis, or the possibility of nuclear war. Forecasters have trouble with these, owing to the tiny probabilities and infrequency of such events.

To add my own commentary: regarding Covid, everyone got it wrong, myself included. Some got it more wrong than others, but overall everyone was wrong, generally being either too optimistic or too pessimistic.

Bill Gates may have predicted a pandemic, but predicting the same thing over and over and only being right by chance (the so-called broken clock) is more luck than skill. Moreover, past outbreaks such as SARS, Ebola, bird flu, and swine flu had only a minimal or negligible economic and societal impact, so it was not that unreasonable to assume Covid would play out the same way. Many individuals who predicted that Covid would be bad still failed to predict the shutdowns that followed. Nor did they foresee that the IFR, initially estimated at 1% based on data from China, would be revised to as low as 0.1-0.2% (and considerably lower for young and middle-aged people), nor did they predict the v-shaped recovery in the US economy and stock market.

Others, like myself, were too optimistic early on but were right about the v-shaped recovery in the stock market and economy. Despite a lot of deaths, I surmised that the economic toll would be minimal because it was disproportionately the elderly and people with comorbidities (and low-productivity people in general) who were getting badly sick or dying. Moreover, the death toll of Covid was comparable to that of WW1 or WW2, yet despite hundreds of thousands of healthy, able-bodied Americans in their prime productive years perishing in those wars, the US economy was unscathed and the market did not fall much. Additionally, I realized that, by virtue of population growth, all of the Covid deaths, even under the most pessimistic of scenarios [1], would be replenished within just four months, so such loss of life, although tragic, is replaceable, which is why I was optimistic about the US economy. Even the Spanish Flu, which was far worse than Covid, preceded the Roaring Twenties.

From my April 2020 post, The great write-off (which coincided with the bottom of the stock market):

As many as 1 million Americans dying in a single year of Covid-19 may seem catastrophic for the economy, but consider that the US population historically grows at 1%/year, so the lost GDP is replaced in just 4 months. Such economic loss is also offset to some degree by the fact that the elderly are past their peak productive and reproductive years and that their wealth will go to their heirs, who will spend it on homes and such. Deaths will also reduce Social Security and other spending such as Medicare. By comparison, WW2 cost 420k US lives (or about 1.02 million adjusted for today’s population) and considerably more injuries and permanent disabilities, but the US economy boomed afterwards. The void was quickly filled by new population growth and economic activity, and things soon returned to normal.
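The four-month figure is easy to verify as a back-of-the-envelope calculation. The population and growth figures below are rough assumptions (~330 million people, ~1%/year growth) consistent with the post's claim:

```python
# Back-of-the-envelope check: with the US population growing at roughly
# 1%/year, how quickly is 1 million deaths numerically replaced?
# Figures are approximate assumptions, not official statistics.

us_population = 330_000_000      # approximate 2020 US population
annual_growth_rate = 0.01        # ~1%/year historical growth
deaths = 1_000_000               # pessimistic Covid scenario from the post

annual_growth = us_population * annual_growth_rate   # ~3.3 million people/year
months_to_replace = deaths / annual_growth * 12
print(f"Replaced in about {months_to_replace:.1f} months")
```

This comes out to roughly 3.6 months, i.e. “just 4 months” as stated above.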

The author suggests that cheap, preventative measures, such as 3M stockpiling masks, would have helped, and blames the shortsightedness of policy makers:

As Taleb points out, stockpiling reserves of necessities blunts the impact of most crises. Not only that, but even preparation for rare events ends up being pretty cheap when compared to what we’re willing to spend once the crisis hits. As I pointed out in a previous post, we seem to be willing to spend trillions of dollars once the crisis hits, but we won’t spend a few million to prepare for crises in advance.

Of course, as I pointed out at the beginning, having reserves is not something the modern world is great at, because reserves are not efficient. Which is why the modern world is generally on the other side of Taleb’s statement, in debt and trying to ensure/increase the accuracy of their predictions. Does this last part not exactly describe the goal of superforecasting? I’m not saying it can’t be used in service of identifying what things to hold in reserve or what rare events to prepare for; I’m saying that it will be used far more often in the opposite way, in a quest for additional efficiencies and as a consequence greater fragility.

To say ‘spending a few million’ could have prevented the spread of Covid is so preposterous and such wishful thinking that it almost does not merit a response. On what basis does he assume this? I am skeptical such prevention would have made much of a difference, owing to the extreme virulence of Covid, which ran rampant even in countries and cities that had strict lockdowns and high rates of mask compliance.

The major problem with the precautionary principle (think of the old adages ‘an ounce of prevention is worth a pound of cure’ or ‘better safe than sorry’) is that there are seemingly endless things that can go wrong, so although a single preventative measure is cheap, many measures can quickly become very expensive. This is why insurance companies make so much money. Given all the things that can conceivably go wrong (the ‘known unknowns’), it’s impossible and impractical to be prepared for everything. Stockpiling masks and running virus simulations may help in a pandemic, but how does one prepare for something that cannot even be predicted and is entirely unforeseen (the ‘unknown unknowns’) until after the fact? Also, how do we even know which measures will work or how effective they will be? How do we know whether our ounce of prevention will yield a pound of cure or just a few ounces?
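The “many cheap measures add up” point can be sketched numerically. All figures here are illustrative assumptions (cost per measure, number of conceivable risks, per-risk probability), not from the post:

```python
# Sketch of the precautionary-principle cost problem: each individual
# hedge is cheap, but the total cost scales linearly with the number of
# conceivable risks, while only a handful of them ever materialize.
# All figures are illustrative assumptions.

cost_per_measure = 5_000_000      # $5M to prepare for one rare risk
conceivable_risks = 1_000         # "seemingly endless" known unknowns
p_each_occurring = 0.001          # each risk is individually unlikely

total_preparation_cost = cost_per_measure * conceivable_risks
expected_risks_realized = conceivable_risks * p_each_occurring

print(f"Cost to prepare for everything: ${total_preparation_cost:,}")
print(f"Expected risks that actually occur: {expected_risks_realized:.0f}")
```

Under these toy numbers, covering every known unknown costs $5 billion while only about one of the thousand risks is expected to occur, which is the sense in which ‘an ounce of prevention’ per risk still adds up to a very expensive policy overall.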

From the sidelines on Twitter, it’s easy for Taleb to claim the moral high ground by saying that forecasters failed to predict Covid or underestimated its severity, but just saying that one ought to have robust (or, as he sometimes calls them, ‘antifragile’) measures/systems (whatever those may be) in place to prevent a crisis is not that helpful if one cannot ascertain ahead of time what those preventative measures are or how effective they would be.

Yes, in a perfect world and with perfect hindsight one would be able to stave off a potential trillion-dollar crisis with just a billion dollars or so, but that is as unrealistic as assuming there will never be a crisis at all. The lesson of Covid is that, rather than trying to predict the future, prevent every crisis, or live in fear of the unknown, it is more practical and realistic, as with the 2008 financial crisis, to have systems in place, such as aggressive fiscal and monetary policy (and learning the necessary lessons to keep history from repeating), to limit the potential damage of a crisis when it arises, such as by preventing a $1 trillion crisis from becoming a $10 trillion one.

[1] Even if we assume the counterfactual of no preventative measures and letting nature run its course.