Here is something for the math-oriented to pass the time:
Chance of staying well until herd immunity
Is it possible to determine the _relative_ chances of avoiding infection until herd immunity is reached? i.e., the odds of success if herd immunity is at 40% instead of the worst case (say 80%). In other words, a dimensionless Risk Factor.
We don't know the actual "time" in days to reach Herd Immunity. But maybe we can drop "time" out by making the other factors relative to each other.
The time you have spent actively avoiding infection is either an investment or a sunk cost. Might the way you view it determine how you write the equation?
To remain well, you have to win a (biased) coin flip every day. But we don't know the chance of failure per day, so you can't use the simple equation "chance of success = 1 minus the (cumulative) chance of losing."
But it is also like radioactive decay, which is the same math as compound interest: the exponent built from the Half Life carries a negative sign, while the one built from the compound-interest Doubling Time carries a positive sign.
Both can be re-written as a "percentage per day" -- except we don't know the actual value of that number.
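A minimal sketch of that equivalence, assuming a constant (and unknown) daily infection probability p:

```latex
% Surviving T daily coin flips with constant daily risk p:
P(\text{well after } T \text{ days}) = (1 - p)^T = e^{T \ln(1 - p)} \approx e^{-pT} \quad (p \ll 1)
% Same exponential form as radioactive decay, N(t) = N_0 e^{-\lambda t},
% with decay constant \lambda = -\ln(1 - p) \approx p, the "percentage per day."
```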
If we set a worst case for the "fraction needed to achieve herd immunity" -- say 80% -- then is it possible to figure out something like: "the chance of staying well until herd immunity is reached, IF herd immunity is actually reached at 40%, is 5 times better than if it is reached at 80%"?
We won't know how many days that 5x-better adds up to, but it's a dimensionless Risk Factor.
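Here is one toy calculation -- the uniform-infection assumption is mine, not an established result. If infections strike the population uniformly at random, your chance of being among the infected equals the final infected fraction H:

```latex
P(\text{well}) = 1 - H
% Dimensionless Risk Factor between H = 0.4 and H = 0.8:
\frac{P(\text{well} \mid H = 0.4)}{P(\text{well} \mid H = 0.8)} = \frac{0.6}{0.2} = 3
```

Under an exponential-hazard model, P(well) = exp(-kH), the same ratio is exp(0.4k) and depends on the unknown rate k -- so whether a single model-free number (3x, 5x, ...) exists depends on which model you believe.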
You can graph this and get a decay curve -- the chance of staying well long enough shrinks as the actual % needed for Herd Immunity rises. Can we get a curve whose shape tells us something useful? (A sketch of two candidate curves follows.)
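A short plotting sketch of the two toy models above -- both the models and the hazard scale k are assumptions, chosen only to show the shapes:

```python
# Decay curves for two toy models of "chance of staying well until herd
# immunity," as a function of the herd-immunity fraction H.
import numpy as np
import matplotlib.pyplot as plt

H = np.linspace(0.0, 0.8, 100)   # actual fraction needed for herd immunity
k = 3.0                          # hypothetical hazard scale; unknown in reality

plt.plot(H, 1 - H, label="uniform model: 1 - H")
plt.plot(H, np.exp(-k * H), label="exponential model: exp(-k*H), k = 3")
plt.xlabel("actual % needed for herd immunity (as a fraction H)")
plt.ylabel("chance of staying well until herd immunity")
plt.legend()
plt.show()
```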
A more sophisticated equation might use not only the worst-case % for herd immunity, but also the fact that "today" we have not achieved it, plus the reasonably-estimated fraction who have already been infected (whatever percent that is). So you would have a lower bound as well as an upper bound to constrain the curve.
Herd Immunity is a function of both the % of the population and the time to reach that % (but we don't know the number of days).
Can "time" be dropped out of the equation or normalized to equal 1, so everything else is relative?
Another factor which drops out is your Personal Avoidance Factor. You know that staying home is better than being a dentist. But it won't change the _relative_ difference for _you_ between Herd Immunity At 40% vs 80%.
The unknown _absolute_ difference will be quite large for stay-home vs. dentist at every value of "Actual % For Herd Immunity." Those would be different curves stacked atop each other (the shape is the same).
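One hedged check, using the exponential toy model from above (the Personal Avoidance Factor a is a hypothetical parameter, smaller for stay-home than for a dentist):

```latex
P(\text{well}) = e^{-a k H}
% 40% vs. 80% ratio: e^{-0.4 a k} / e^{-0.8 a k} = e^{0.4 a k}
```

In this particular model the factor a scales the exponent, so the curves are rescaled copies of each other rather than vertical shifts, and a does not fully drop out of the 40%-vs-80% ratio; in the uniform model (1 - H) it never appears at all. So whether the Personal Avoidance Factor drops out seems to depend on the model.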
There are a lot of Factors, but I wonder if it can be structured to answer this particular question in relative terms.
EDIT: and of course the "Actual % For Herd Immunity" is a function of R0. Something that epidemiologist blogger Trevor Bedford said seems to indicate it is a linear correlation, if he was quoted right -- i.e., reducing R0 by a given percentage reduces the Herd Immunity number by the same percentage. Not sure about this, though.
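For reference, the textbook relation (the standard SIR threshold, not from the quote) is:

```latex
H = 1 - \frac{1}{R_0}
```

A quick check of the linearity: R0 = 2.5 gives H = 0.6; cutting R0 by 20% to 2.0 gives H = 0.5, a drop of about 17% in H. So the correlation is roughly, but not exactly, linear over that range.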