
Two Easy Pieces


When COVID-19 emerged in early 2020, scientists, particularly those with backgrounds in virology and epidemiology, began applying their knowledge to this never-before-seen situation. True, we were warned by Bill Gates. Also true, there had been pandemics before, but the two most notable in modern times, the 1918 Spanish Flu and AIDS, were different from COVID-19. The Spanish Flu eventually killed tens of millions worldwide, in part because public education, communications, and epidemiological models were not what they are today. Major breakthroughs in epidemiological modelling [1] in the decades following the Spanish Flu provided a basis for understanding subsequent influenza pandemics and the AIDS pandemic.


More than any infectious disease to date, COVID-19 stands alone in the massive mobilization of the mathematical modeling community. Models of all shapes and sizes were poised in early 2020, and within weeks many baseline epidemiological parameters had been estimated for the disease and its causal virus, SARS-CoV-2. Perhaps most singular of all, thanks to communication networks (social media, traditional media, and the lightning-fast dissemination of research preprints), information flowed freely, fostering progress. But information also flowed in huge quantities, which, together with inaccurate or poor communication between the scientific community and the public at large, generated confusion and mistrust [2].


COVID-19 made clear that computational technology can go too far in justifying data-driven, sometimes mind-bogglingly complicated models. John von Neumann said, "With four parameters I can fit an elephant, and with five I can make him wiggle his trunk". The quip warns that a model's fit to data can masquerade as truth when the model is, in the end, little more than a compilation of details.


"What a useful thing a pocket-map is!" I remarked.

"That's another thing we've learned from your Nation," said Mein Herr, "map-making. But we've carried it much further than you. What do you consider the largest map that would be really useful?"

"About six inches to the mile," I said.

"Only six inches!" exclaimed Mein Herr. "We very soon got to six yards to the mile. Then we tried a hundred yards to the mile. And then came the grandest idea of all! We actually made a map of the country, on the scale of a mile to the mile!"

"Have you used it much?" I enquired.

Lewis Carroll, Sylvie and Bruno Concluded


Here’s my 2 cents worth. Two central variables derived from the most basic epidemiological models go a long way towards understanding COVID-19. They are the effective reproduction number Reff and the infectious fraction of the non-immune population, I. True, Reff can be decomposed into further parameters, foremost among them the basic reproduction number R0, the immune fraction of the population, and contact network heterogeneity. True too, the baseline epidemiological model has other important parameters (e.g., incubation and infectious periods, fatality ratio), but they wind up influencing one or both of Reff and I.
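The first decomposition mentioned above can be sketched in a few lines. This is a minimal illustration assuming homogeneous mixing (Reff = R0 × susceptible fraction); it deliberately ignores contact-network heterogeneity, which the text notes also matters, and the example values of R0 are illustrative only.

```python
def effective_reproduction_number(r0: float, susceptible_fraction: float) -> float:
    """Reff under the textbook homogeneous-mixing assumption.

    r0: basic reproduction number (transmission in a fully susceptible population).
    susceptible_fraction: the non-immune share of the population, between 0 and 1.
    """
    return r0 * susceptible_fraction


# Illustrative values, not estimates from this post:
print(effective_reproduction_number(3.0, 1.0))  # no immunity -> Reff = R0 = 3.0
print(effective_reproduction_number(3.0, 0.5))  # half immune -> Reff = 1.5
```

As immunity accumulates, the susceptible fraction shrinks and Reff falls with it; epidemic growth stalls once Reff drops below 1.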


In the early stages of an epidemic, when the susceptible fraction is large, growth is governed by Reff and the population burden is simply the product of I and Reff. The implications of products of this form extend well beyond epidemiological processes. I cringe when I hear “this is so important because it’s growing by X%”. No! Or rather, yes, it can be. 100% interest on $1 is a pittance compared to the same rate on $1M; 0.001% on the latter is still far more lucrative than 100% on the former. Two pieces are needed: growth rate and capital [3]. The same logic goes for COVID-19.
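The interest analogy is just the arithmetic of a product, and it has the same form as new infections arising from a rate acting on the current infectious pool I. A trivial sketch (the dollar figures are the ones from the paragraph above):

```python
def absolute_gain(rate: float, principal: float) -> float:
    """Absolute gain = growth rate x capital; same product form as Reff x I."""
    return rate * principal


small_pot = absolute_gain(1.0, 1)            # 100% interest on $1
large_pot = absolute_gain(0.00001, 1_000_000)  # 0.001% interest on $1M

# The tiny rate on the large pot still beats the huge rate on the small pot.
print(small_pot, large_pot)
```

Neither number alone (rate or pool) tells you the burden; only their product does, which is why "it's growing by X%" is meaningless without the size of the base.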


So, this is what you need to know [4]:



Data-driven statistical models now routinely estimate R0 and Reff. RCeff is Reff as reduced by control measures such as social distancing and lockdowns. Low numbers of infectious cases “buy time” in exponential growth. Higher numbers mean that health services are stressed and risk collapse, making lockdowns necessary. Capping RCeff at 1.0 makes most sense if active case numbers are low (otherwise the curve is flattened but new case numbers remain high). But capping at low case numbers is problematic, because it stymies the growth of natural immunity. Capping is also a difficult sell for governments, whose constituents see that freedoms remain limited despite the virus apparently being under control.
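The effect of capping at RCeff = 1.0 can be seen in a toy generation-by-generation projection: each infectious generation exactly replaces itself, so the curve is flat, and whether that is tolerable depends entirely on the level it is flat at. This is a sketch with illustrative numbers, not a calibrated model.

```python
def project_cases(initial_cases: float, r_eff: float, generations: int) -> list:
    """Project active cases over discrete infection generations.

    Each generation multiplies the previous case count by r_eff.
    """
    cases = [initial_cases]
    for _ in range(generations):
        cases.append(cases[-1] * r_eff)
    return cases


# Capped at RCeff = 1.0: flat curves, but at very different burdens.
print(project_cases(100.0, 1.0, 5))     # flat at 100 cases: time is bought
print(project_cases(10_000.0, 1.0, 5))  # flat at 10,000: sustained high load
```

With RCeff held at 1.0 the trajectory never grows, but it never shrinks either, which is exactly the dilemma described above: flat-and-low buys time, flat-and-high keeps health systems under strain.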


Thus, importantly, there is a tradeoff between minimizing morbidity and mortality and limiting the disease’s negative externalities for individuals and society [5]. A recent study of optimal COVID-19 control is consistent with the above schema [6], and in particular with the transient endpoint at RCeff=1 [7]. But RCeff=1 is unsustainable, due either to the long wait to achieve herd immunity when case numbers are low and the hard sell of the socio-economic costs, or to high case numbers with their over-burdened health systems, morbidity and mortality, and socio-economic impacts.


Vaccination in conjunction with natural immunity now promises an exit from RCeff=1 in many countries, towards local virus endemicity or even extinction...at least until the likely emergence of escape variants. The question then is how prepared we will be to prevent new outbreaks and to obviate the pandemic status of new COVID-19 variants.


Data-driven, computationally complex models are essential for determining the what, when, and how much of decision making. However, we shouldn’t be blinkered into thinking that their “truth” makes them invariably superior to coarse-grained and toy models. Given the importance of how science is communicated, and the centrality of collective behavior both in preventing the spread of SARS-CoV-2 and in vaccination campaigns, I believe that simple models will prove pivotal in vanquishing the COVID-19 pandemic.

[1] Brauer F., Castillo-Chavez C., Feng Z. (2019) Introduction: A Prelude to Mathematical Epidemiology. In: Mathematical Models in Epidemiology. Texts in Applied Mathematics, vol 69. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-9828-9_1

[2] https://mehochberg.wixsite.com/blog/post/covid-19-in-the-information-commons

[3] Famously the basis of Thomas Piketty’s 2014 book Capital in the Twenty-First Century.

[4] This schema shows the progression from initial attempts to mitigate outbreaks, either to optimization if successful, or to suppression if unsuccessful. Once through a cycle, subsequent strategies will depend on active case numbers (and the correlated impact on health systems). Low case numbers are more likely to be met with baseline physical distancing and self-isolation, whereas high numbers are met with more restrictive curfews and lockdowns. What counts as “low” and “high”, and how “mitigation” is demarcated from “suppression”, is somewhat arbitrary. The former depends on how decision-makers set thresholds of action, whereas the latter reflects the packages of measures considered sufficient to either slow or reverse growth.

[5] Prioritizing the former, as so many countries have, sacrifices the latter and, as it turns out, also generates negative externalities on social welfare, psychology, and the economy.

[6] Li G., Shivam S., Hochberg M.E., Wardi Y., Weitz J.S. (2020) Disease-Dependent Interaction Policies to Support Health and Economic Outcomes During the COVID-19 Epidemic. Available at SSRN http://dx.doi.org/10.2139/ssrn.3709833

[7] Sofonea M.T., Boennec C., Michalakis Y., Alizon S. (2021) Two waves and a high tide: the COVID-19 epidemic in France. Anaesth Crit Care Pain Med. 40:100881. https://doi.org/10.1016/j.accpm.2021.100881

