TLDR: Nassim Nicholas Taleb on the Pandemic
A summary of the July 27, 2020 episode of EconTalk.
This is a summary of the most recent EconTalk episode, ‘Nassim Nicholas Taleb on the Pandemic’. Well worth the listen. The relevant papers discussed in the episode:
‘Systemic Risk of Pandemic via Novel Pathogens – Coronavirus: A Note’ — Joseph Norman, Yaneer Bar-Yam, Nassim Nicholas Taleb.
‘Ethics of Precaution: Individual and Systemic Risk’ — Nassim Nicholas Taleb and Joseph Norman.
‘On single point forecasts for fat tailed variables’ — Nassim Nicholas Taleb.
If you have the time, go read those instead.
I also wrote a companion to this piece — an explainer on fat tails, in case you don’t know what those are.
On Societal Risk
There’s a big difference between risks that merely produce variation in outcomes and risks of ruin, particularly at the systemic level. We should be worrying about multiplicative risks — such as pandemics, where each case can cause more cases. Car accidents, by contrast, are not a societal risk of ruin, because one car accident doesn’t lead to other car accidents.
If you found out that 1 billion people died in a single year, and didn’t know how, your guess wouldn’t be car accidents. It would be something fat-tailed like nuclear war or pandemics.
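The distinction above can be sketched in a few lines of code. This is a toy contrast between additive risk (independent events that accumulate linearly) and multiplicative risk (each case seeding more cases); the specific numbers are my own illustrative assumptions, not figures from the episode.

```python
# Additive vs. multiplicative risk, with made-up illustrative numbers.

def additive_deaths(per_day: int, days: int) -> int:
    """Independent events (e.g. car accidents) accumulate linearly."""
    return per_day * days

def multiplicative_cases(initial: int, growth_factor: float, days: int) -> float:
    """Each case causes more cases: exponential (multiplicative) growth."""
    return initial * growth_factor ** days

print(additive_deaths(100, 30))             # 3000 — linear accumulation
print(multiplicative_cases(100, 1.3, 30))   # hundreds of thousands — explosive growth
```

The point: thin-tailed, independent risks stay bounded at the societal scale, while multiplicative risks can blow past any historical baseline.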
It’s worthwhile figuring out what the systemic risks that we should be avoiding are — it liberates us and allows us to take lots of risks elsewhere.
On Personal Risk
If you don’t behave conservatively, you’ll increase collective risk dramatically because risk due to pandemics doesn’t scale linearly. You wear a mask more for the systemic effect, not to mitigate personal risk.
Prudence on the individual level may seem like ‘overreacting’, and it would be ‘rational’ not to overreact. However, it’s important to note that rationality doesn’t scale; what’s rational for the collective may seem irrational for you personally. People doing the right thing will look irrational.
How to Deal With Pandemics
Any infectious disease that has killed more than 1,000 people can be considered a pandemic. Below that count, you don’t have to worry; above it, you’re dealing with a fat-tailed event. Treat all pandemics the same way — the moment one kills 1,000 people, take measures.
The most effective way to prevent pandemics is systemic quarantine. Follow a protocol and don’t take chances — it was foolish to quarantine only people coming from China, since the virus could have come from anywhere (and it did). Reduce connectivity. Close borders. You don’t need to get cases to zero; just make sure they don’t overwhelm your system.
Identify superspreader environments — subways, elevators, big gatherings, things like that. Do this for every pandemic, no matter how severe it seems, until we figure out the specific properties of the one we’re dealing with.
Absence of Evidence ≠ Evidence of Absence
For example, if you have no evidence of cases, it doesn’t mean you have no cases. Or if you have no evidence that masks work, it doesn’t mean that masks don’t work.
Err on the side of prudence when dealing with risks of ruin.
“If you don’t know if masks work, wear them.”
The central idea of the Incerto: when there is uncertainty in a system, decision-making becomes easier, not harder.
“If I tell you that I’m not certain about the quality of this water, would you drink it?”
“If I tell you that we have uncertainty about the pilot’s skills — he could be excellent, but we’re not sure — would you get on the plane?”
Single Point Forecasts
“Science is about understanding properties, not forecasting single outcomes: [Figure 1 shows] the extent of the problem of forecasting under fat tails. Most of the information is away from the center of the distribution.” 
Using Gaussian tools in a fat-tailed domain is doing it wrong.
In addition, if you have absence of evidence, it’s still scientific for you to describe a process — you can still understand the properties of something even if you don’t get the details exactly right.
“What an old trader told me: Take all the risks you want, but make sure you're in tomorrow.”
In other words, avoid risk of ruin.
Under uncertainty, behave according to a protocol. Only hindsight is 20/20.
The average of past pandemics is not a good predictor of the impact of the next one because, again, fat tails.
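A quick simulation makes this concrete. Below I draw from a Pareto distribution with tail index α = 1.1 — an assumed value chosen for illustration (Taleb’s paper argues pandemic deaths are roughly this fat-tailed; for α ≤ 1 the mean is infinite, and near 1 the sample mean is dominated by the single largest observation).

```python
# Sketch: why the historical average is a poor guide under fat tails.
# The tail index alpha = 1.1 is an assumption for illustration.
import random

random.seed(42)
alpha = 1.1

def pareto_draw() -> float:
    # Inverse-CDF sampling: U**(-1/alpha) has a Pareto tail with index alpha.
    return random.random() ** (-1.0 / alpha)

samples = [pareto_draw() for _ in range(100_000)]
mean = sum(samples) / len(samples)
largest = max(samples)
print(f"sample mean: {mean:.1f}")
print(f"share of the total from the single largest event: {largest / sum(samples):.0%}")
```

Re-run with different seeds and the sample mean jumps around wildly — whichever run happens to contain the biggest event dominates the average, which is exactly why the mean of past pandemics tells you little about the next one.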
Initially, the WHO, CDC, and others said not to wear masks. The WHO made two mistakes. First, they didn’t account for scaling: if one person wearing a mask reduces the probability of infection to p, both people wearing masks reduces it to p². For example, if p = 0.50, both people masking lowers the probability to 0.25. Second, reducing the viral load by half doesn’t reduce the probability of infection by half — it may reduce it by 99%, because the probability of infection is nonlinear in the dose: it follows an S-curve.
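Both effects are easy to demonstrate numerically. In this sketch the multiplicative masking rule comes straight from the argument above, while the logistic S-curve and its parameters (`midpoint`, `steepness`) are my own toy assumptions, not a real dose-response model.

```python
# Two effects the WHO allegedly missed: mask effects compound multiplicatively,
# and a nonlinear (S-curve) dose-response means halving the viral load can cut
# infection risk by far more than half. Logistic parameters are illustrative.
import math

def combined_transmission(p_single_mask: float) -> float:
    """If one mask scales transmission probability by p, two masks give p**2."""
    return p_single_mask ** 2

def infection_probability(viral_load: float,
                          midpoint: float = 1.0,
                          steepness: float = 8.0) -> float:
    """Toy logistic dose-response: probability of infection vs. viral load."""
    return 1.0 / (1.0 + math.exp(-steepness * (viral_load - midpoint)))

print(combined_transmission(0.5))   # 0.25 — masks on both sides compound
full = infection_probability(1.5)   # risk at full viral load
half = infection_probability(0.75)  # risk at half the viral load
print(full, half)                   # halving the dose cuts risk far more than half
```

With these toy parameters, halving the viral load drops the infection probability from near-certain to roughly a tenth — much more than a 50% reduction, which is the nonlinearity Taleb is pointing at.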
In addition, they lied because they were worried about a mask shortage.
People’s instincts were much better than the advice from the WHO, the CDC, and others.
“All of these people are completely incompetent when it comes to basic things that your grandmother gets.”
Have the WHO removed — it’s a bureaucratic organization that has been harmful to mankind by telling people not to wear masks.
If you liked this post, feel free to share it with your friends! If you have any feedback or if I got anything wrong, please let me know!