##### For some reason, a great many abstract mathematical concepts align with, and provide significant insight into, the way that the world works. Familiarity with these concepts and their related language is essential to meaningful, evidence-based public policy discussions.

A PostPandemic Curriculum is not meant to turn students into polymaths with expertise in a wide range of disciplines. Rather, the goal is to give every student the fundamental concepts and language they’ll need to understand, and meaningfully engage in, complex policy discussions as citizens.

In all discussions of public policy, a variety of disparate core concepts interact and reinforce each other (eg. *deductive logical errors* arising out of the use of *statistical averages* due to the common *cognitive distortion of framing bias*). This overlap is why the core “preparation-for-citizenship” concepts should be woven into every element of a PostPandemic Curriculum, rather than segregated by traditional subject disciplines.

**Suggested Core Concepts Related to the Maths:**

**Exponential growth / half-life / doubling-time / S-Curves**: Most complex systems – such as organisms, economies, disease spread, & networks – exhibit behaviours that have strong exponential components.

**Scale / Orders of Magnitude / Fermi questions:** Two important things to know about the numbers in most public policy issues are: “How many zeros?” and “For how many years or people?”

**Population distribution statistics / standard deviation / median**: “Average” (aka “mean”) is a simplistic concept that is overused. Worse, it leads people to see members of a highly diverse group as all sharing the same characteristic.

**Logic (Inductive/Deductive/Boolean) / Inference / Causation vs Correlation:** Humans are, by our very nature, cause-seekers and pattern-finders. We want certainty and can’t stand loose ends. Our compulsion to make simple sense of the world is so strong that we often fabricate phobias and conspiracy theories out of the thinnest of evidence.

**Computer simulations / algorithms / modelling**: For a few hundred years, mathematicians used mostly precise formulas to make predictions about the world (eg. quadratic equations to accurately put artillery shells onto enemy positions). This worked for a great many problems, but mostly simple physical and chemical ones.

**Risk analysis:** We wear seat-belts even though it is highly unlikely that on any trip we will get into a crash. We take the small effort to buckle up because we know that even with the low probability of a crash, if one occurs, it can have calamitous consequences.

**Recursion / Iteration / Feedback loops:** Feed-back loops underlie other core concepts like computer simulations, exponential growth, and the scientific method.

**Game theory:** Game theory is the math of strategic thinking. Classical mathematics (eg. geometry, logic, arithmetic) deals with unconscious objects (eg. shapes, facts, numbers); game theory describes what happens when conscious actors (both human and non-human) adopt various strategies in competitive and collaborative situations.

**Signal vs Noise:** As much as we would prefer otherwise, the world isn’t neat, and the data that we gather is contaminated by some amount of randomness (measurement error, unaccounted-for factors, etc.). This random “noise” can make it hard to see the underlying pattern – the “signal” – and predict the future.

**Complexity / Chaos theory / Emergent properties:** *in progress*

**Some common mathematics-related concepts that I do not think make the list of core concepts (though they will still be needed in a mathematics curriculum geared to preparing specialists):**

• Trigonometry

• Calculus

• Vectors

• Quadratic equations

**Suggested Core Concepts Related to the Maths:**

**Exponential growth / half-life / doubling-time / S-Curves**:

Most complex systems – such as organisms, economies, disease spread, & networks – exhibit behaviours that have strong exponential components. The familiar “logistic” or “S” curve is common in biological, economic, and other systems that are subject to limited resources. The initial explosive growth phase (large exponent, small doubling time) gradually decreases to a flat no-growth steady state (exponent of 0, infinite doubling time) as the growth depletes more and more of the limiting resource.

In the case of a pandemic, the limiting resource is people without immunity who are exposed to someone who is infectious. Vaccination, and recovery or death after infection, reduce the number of susceptible people. Distancing, quarantines, and other protective measures reduce the frequency of exposures. Together these reduce the exponent and increase the doubling time, and S-curve behaviour becomes evident. Many simpler important phenomena also exhibit exponential behaviour (eg. the half-life of radioactive decay and the doubling time of compound interest).

Unfortunately, our brains have evolved an innate intuition only for linear (straight-line) relationships. And thanks to having to cope with gravity, we have an instinctive grasp of the way parabolic/quadratic equations work (when you toss a ball to someone else, or jump from one place to another, your muscles “know” the quadratic equations for that action). Linear and quadratic equations rule our day-to-day lives, but relying on our intuition about them to predict large-scale or long-term events can be very misleading. Our technologies have, for better or for worse, given us the power to alter the world – a world that largely follows exponential rules. If we are to make sense of the world we are affecting, and avoid calamitous mistakes, we must train ourselves to understand exponential growth.
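The difference between unchecked exponential growth and resource-limited S-curve growth can be sketched in a few lines of code (the numbers here – one initial case, a doubling per step, a population ceiling of 10,000 – are purely illustrative):

```python
# A minimal sketch contrasting exponential growth with logistic
# ("S-curve") growth. All numbers are invented for illustration.

def exponential(initial, factor, steps):
    """Unchecked exponential growth: multiplies by `factor` every step."""
    counts = [initial]
    for _ in range(steps):
        counts.append(counts[-1] * factor)
    return counts

def logistic(initial, rate, capacity, steps):
    """Logistic growth: the growth slows as the limiting resource
    (here, people still susceptible) is used up."""
    counts = [initial]
    for _ in range(steps):
        n = counts[-1]
        counts.append(n + rate * n * (1 - n / capacity))
    return counts

exp_curve = exponential(1, 2, 20)        # doubles every step, no limit
s_curve = logistic(1, 1.0, 10_000, 20)   # same early behaviour, then flattens

print(exp_curve[-1])        # 1048576 - far beyond any real population
print(round(s_curve[-1]))   # 10000 - levelled off at the ceiling
```

Early on the two curves are nearly identical (both roughly double each step); the divergence only appears once the limiting resource starts to run out, which is exactly why early-phase data alone cannot tell you where the flat part of the S will land.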
We have little intuitive grasp of these longer-term patterns, and need to deliberately learn to think in exponentials.

**Scale / Orders of Magnitude / Fermi questions:**

Two important things to know about the numbers in most public policy issues are: “How many zeros?” and “For how many years or people?” One way people distort public policy discussions is to manipulate the “per” number (the denominator). When looking at the various ways that measurements can be presented, the important question is: which measurements most meaningfully contribute to finding an effective policy?

For example, which is a more meaningful statement about the relationship between eating a certain food and dying from a specific type of cancer (both can be true if the base rate of the cancer is 1 in 100,000):

• it will double your chances of dying, or

• it will increase your odds of dying by 1 chance in 100,000.

The action you take about that food will depend largely on which statement you pay attention to, or which statement first makes it into your news-feed – ×2 or +0.00001.
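The two framings can be computed directly from the same hypothetical numbers:

```python
# The hypothetical food/cancer numbers above, framed both ways.
base_rate = 1 / 100_000   # baseline chance of dying from this cancer
with_food = 2 / 100_000   # chance for those who eat the food

relative_risk = with_food / base_rate       # "doubles your chances"
absolute_increase = with_food - base_rate   # "+1 chance in 100,000"

print(relative_risk)      # 2.0
print(absolute_increase)  # 1e-05
```

Both numbers describe exactly the same situation; only the framing differs, and the framing is what drives the emotional reaction.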

An “order of magnitude” (OM) is a factor of 10. For example, the distance from your wrist to the tip of your middle finger is roughly one order of magnitude smaller than your height. Incredibly, there are just 40 orders of magnitude between the size of a sub-sub-atomic quark and the distance across the observable universe. OMs are most useful when trying to get a sense of how similar or different things are. For example, a human is **roughly** 4.5 OMs (ΔOM≅4.5) heavier than an ant, and a blue whale **roughly** 3.5 OMs (ΔOM≅3.5) heavier than a human. The ΔOM between an ant and a blue whale is **roughly** 8 (=4.5+3.5).
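Because an OM is a power of 10, ΔOM is just a difference of base-10 logarithms. A small sketch (the masses and heights are assumed round figures, not precise measurements):

```python
import math

def delta_om(a, b):
    """Difference in orders of magnitude (powers of ten) between two quantities."""
    return abs(math.log10(a) - math.log10(b))

# Illustrative round figures (assumed): masses in kilograms, heights in metres.
whale_vs_human = delta_om(1.4e5, 70)    # blue whale mass vs. human mass
everest_vs_human = delta_om(8848, 1.7)  # Mt. Everest height vs. human height

print(round(whale_vs_human, 1))    # 3.3
print(round(everest_vs_human, 1))  # 3.7
```

Note that ΔOM is additive: the ant-to-whale gap is just the ant-to-human gap plus the human-to-whale gap, which is what makes OMs so convenient for rough “Fermi question” reasoning.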

OMs are useful when there is uncertainty or variability in what you are measuring, but you still need to get some idea of the amount. And they are very useful when comparing things of dramatically different sizes, so that you can more easily visualize these differences and resist common cognitive distortions such as availability bias. For example, your height is 10 times more similar to the height of Mt. Everest (ΔOM≅4) than the risk of your being killed by a shark is to that of your dying in a car crash (ΔOM≅5), and yet crazily more people seem to be afraid of sharks than of getting into a car (blame this imbalance on the difference between the available shark images – in *Jaws* and on *Shark Week* – and the available images in ad after ad after ad of serenely safe cars driving along empty roads).

**Population distribution statistics / standard deviation / median**:

“Average” (aka “mean”) is a simplistic concept that is overused. Worse, it leads people to see members of a highly diverse group as all sharing the same characteristic.

Humans have a tendency, almost a compulsion, to divide things and people into neat arbitrary categories. We then use words that reflect and reinforce this neatness, words that obscure the diversity of the people or things within the arbitrary category. For example, when we speak of a statistic related to the “average” man or woman, or Swede or Thai, that average number takes over. Let’s use height as an example. The accurate (but not very informative) statement *“the average man/Swede is taller than the average woman/Thai”* inevitably morphs, by dropping the tedious repetition of “the average”, into the misleadingly ambiguous statement *“men/Swedes are taller than women/Thais”*, which too easily lodges in our brains as the patently false idea that *“all men/Swedes are taller than all women/Thais”*. (In fact roughly a third of women are taller than a third of men.)

Stereotypes take hold, and bias follows.

There are three other, equally important basic measurements of a population distribution that are too often ignored:

• mode (the location of the largest concentration),

• median (the point at which 50% of the population has a greater measurement, and 50% a lesser one), and

• standard deviation (how spread out the population is).

Concentrating only on the average leads us to serious misunderstandings, particularly when we are comparing two populations. Two populations with identical averages can be very different, with one being concentrated in a narrow range of values, and the other having a very broad distribution. Or a population can be bi-modal – with two peaks separated by a valley where the average is (the attached male height distribution is slightly bimodal – with peaks at 71″ and 68″, and a valley at the 70″ average).

A clear example of how “average” is confusing comes from US Bureau of Census data on household income. In 2010 **the “average” (mean) household income was $67,392, but the income of the “average” (median) family was only $49,276.** The reason can be seen in the attached graph: the earnings of the top 4% of families (the last two bars) add lots of dollars to the total income, driving up the average.
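A toy example (with invented incomes, not the census figures above) shows how a single high earner separates the mean from the median:

```python
import statistics

# Hypothetical household incomes (invented for illustration): nine modest
# earners plus one very high earner. The outlier drags the mean far above
# the median, just as the top few percent of families do in real data.
incomes = [30_000, 35_000, 40_000, 45_000, 50_000,
           50_000, 55_000, 60_000, 65_000, 500_000]

mean_income = statistics.mean(incomes)      # 93,000: inflated by the outlier
median_income = statistics.median(incomes)  # 50,000: the "typical" household

print(mean_income, median_income)
```

Nine of the ten households earn well below the mean; the median is the statistic that actually describes the typical family here.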

An understanding of population distributions is also important in evaluating risk and identifying effective action. CoVid-19 provides a clear example of this. Looking at a single regional statistic (eg. deaths/confirmed cases, or new cases/day) leaves the false impression that everyone in the region is at the same risk, when in reality in many jurisdictions the cases are concentrated in seniors residences (eg. Quebec), or in a few workplaces (eg. Iowa), or in only a limited geographic area (eg. NY City vs Buffalo). This error can easily lead to actions that over-protect some people (those who have limited contact with others) and leave other people (such as those who live or work in tightly congregated settings) at great risk. Details matter, and “averages” obscure the details.

**Logic (Inductive/Deductive/Boolean) / Inference / Causation vs Correlation:**

Humans are, by our very nature, cause-seekers and pattern-finders. We want certainty and can’t stand loose ends. Our compulsion to make simple sense of the world is so strong that we often fabricate phobias and conspiracy theories out of the thinnest of evidence. But seeing something that isn’t there, or ignoring something that is real, is a good way to make harmful decisions. Our brains evolved to make sense of a slow-changing, very local world (a few hundred people living on a few hundred square kilometers). We aren’t naturally equipped to handle the vast concentration of complex, ever-changing information that assaults us daily. Which observations are important? What is connected to what? What causes what? How do we make sense of anything when our intuition is so fallible? One way to make better sense of this quagmire is by pushing information through the formal processes of logic that were first described 2300 years ago by Aristotle. At the very least, these rules help us to catch errors of thinking that we might otherwise let take root.
And sometimes these rules provide us with tools that allow us to improve the world – to see for the first time, on the basis of subtle evidence, something that saves millions of lives (eg. John Snow’s identification of cholera as a water-borne disease, overturning the long-held erroneous belief that cholera was caused by foul “miasmic” air). The rules of inference and formal logic help us to carefully separate actual causation from meaningless correlation, and allow us to identify actions that will be effective in achieving a goal.
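A short simulation (with invented variables, following the classic confounder story in which hot weather drives both ice-cream sales and drowning incidents) shows how a strong correlation can appear between two things that do not cause each other:

```python
import random

random.seed(42)

# A confounder sketch with invented numbers: temperature (the hidden
# cause) drives both ice-cream sales and drowning incidents. Neither
# causes the other, yet they end up strongly correlated.
temps = [random.uniform(10, 35) for _ in range(1000)]
ice_cream = [t * 2.0 + random.gauss(0, 3) for t in temps]   # caused by heat
drownings = [t * 0.5 + random.gauss(0, 1) for t in temps]   # also caused by heat

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = correlation(ice_cream, drownings)
print(round(r, 2))  # strongly positive, despite no causal link between them
```

Banning ice cream would do nothing about drownings; only by reasoning carefully about the hidden common cause do we find actions that actually work.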

**Computer simulations / algorithms / modelling:**

For a few hundred years, mathematicians used mostly precise formulas to make predictions about the world (eg. quadratic equations to accurately put artillery shells onto enemy positions). This worked for a great many problems, but mostly simple physical and chemical ones. Other problems, particularly those involving systems with a large number of interacting components, were too complex for this approach. It is only in the past few decades, using computer simulations driven by algorithms that model the interactions between a vast number of components, that we have been able to gain insight into the workings of complex systems (eg. heat transfer between the atmosphere and the oceans, disease spread in a society). But these simulations are just approximations, dependent both on the programmer’s understanding of the feedback mechanisms and on the quality of the data. Too often simulations are taken as predictions of the future (which they are not), rather than as powerful ways to gain insight into the consequences of certain actions (such as the burning of fossil fuels on climate patterns, or social distancing on the spread of CoVid-19).

**Risk analysis:**

We wear seat-belts even though it is highly unlikely that on any trip we will get into a crash. We take the small effort to buckle up because we know that even with the low probability of a crash, if one occurs, it can have calamitous consequences. We pay for fire insurance for our homes for the same reason. The repeated cost of a small inconvenience is less than the serious harm caused by a rare event. Identifying these rare events, estimating the damage they can cause, and calculating the ongoing costs to avoid or mitigate them is at the heart of risk analysis. The possibility and costs of car crashes and house fires are easy to envision. Most rare events aren’t so easy to predict or cost.
Recent notable failures of risk analysis include (*the attached chart refers to the risk associated with project management, but the principles are the same for most activities*):

• the **Volkswagen emissions** cheating scandal (the certain benefit of increased current sales due to cheating on the tests vs. the cost in fines, lost sales, and reputation in the event of getting caught). *Likelihood: Moderate + Consequences: Major > Risk: Extreme*

• the defective **Boeing 737 Max** stabilization software design (a small increase in revenue from an optional software upgrade vs. the deaths of hundreds of people and the loss of billions of dollars due to a malfunction of the standard software). *Likelihood: Unlikely + Consequences: Catastrophic > Risk: Extreme*

• the foreseeable **CoVid-19 pandemic** (reducing taxes by saving money on yearly public health budgets vs. taking action to reduce the deaths and economic damage of a predictable once-in-100-years virulent bacterial or viral infection). *Likelihood: Rare + Consequences: Catastrophic > Risk: High*
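The seat-belt logic above can be reduced to a simple expected-cost comparison (all figures are invented for illustration, not actuarial data):

```python
# A sketch of the seat-belt / insurance trade-off with invented numbers:
# compare the expected cost of mitigating a rare event with absorbing it.

def expected_cost(probability, consequence):
    """Expected (average) cost of an event: probability times consequence."""
    return probability * consequence

# Assumed illustrative figures for one year of driving:
crash_probability = 1 / 500   # chance of a serious crash this year
unbelted_harm = 2_000_000     # cost (in dollars) of injuries if unbelted
belted_harm = 200_000         # much-reduced cost if belted
buckling_cost = 10            # yearly "inconvenience cost" of buckling up

risk_unbelted = expected_cost(crash_probability, unbelted_harm)              # 4000.0
risk_belted = buckling_cost + expected_cost(crash_probability, belted_harm)  # 410.0

print(risk_unbelted > risk_belted)  # True: the small recurring cost wins
```

The same arithmetic, with different likelihood and consequence estimates, underlies each of the failures listed above: in every case a small certain cost was avoided at the price of a rare but enormous one.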

**Recursion / Iteration / Feedback loops:** Feed-back loops underlie other core concepts like computer simulations, exponential growth, and the scientific method. They are also at the heart of realistic computer graphics based on fractals.

Feed-back (aka recursion) is why a speaker squeals when a microphone is held too close to it. The mic picks up a sound, the amplifier makes it louder and sends it to the speaker. Then the mic picks up the sound coming out of the speaker, the amp makes it louder and sends it back to the speaker: *mic > amp > speaker > mic > amp > speaker…* Each time (aka iteration) through the loop, the output (the sound) produced by the last loop becomes the input to the next loop. And because at each trip around the loop the amp makes the sound louder, if the mic is close enough to the speaker the sound goes wild. Either moving the mic away from the output source (the speaker) or dialing down the amp reduces the amount of input (sound) fed into the next loop. The squealing stops when the input/output ratio is less than 1 (negative feedback), and starts again if the ratio becomes greater than 1 (positive feedback).

**Game theory:**

Game theory is the math of strategic thinking. Classical mathematics (eg. geometry, logic, arithmetic) deals with unconscious objects (eg. shapes, facts, numbers); game theory describes what happens when conscious actors (both human and non-human) adopt various strategies in competitive and collaborative situations.

Some of the most important insights provided by game theory deal with how the definition of “winning” (eg. individual vs group benefit), or a subtle change in the rules of a “game”, can turn a winning strategy into a losing one, or a losing strategy into a winning one. It also provides insights into how you should act when competing against, or working with, others with a different goal than yours, or when working with allies to achieve the same goal. Techniques developed in game theory can help us make sense of “games” as trivial as tic-tac-toe, rock-paper-scissors, and poker, or as critically important as the tragedy of the commons, taxation policies, pandemic response, crime prevention, and nuclear war.
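The textbook prisoner’s dilemma shows how the definition of “winning” matters: each player’s individually best move leads both players to a collectively worse outcome. A minimal sketch (standard textbook payoffs, in years of prison, so lower is better):

```python
# A payoff-matrix sketch of the classic prisoner's dilemma.
# payoffs[(my_move, their_move)] = (my_years, their_years); lower is better.
payoffs = {
    ("cooperate", "cooperate"): (1, 1),
    ("cooperate", "defect"):    (10, 0),
    ("defect",    "cooperate"): (0, 10),
    ("defect",    "defect"):    (5, 5),
}

def best_response(their_move):
    """The move that minimizes my own years, given the other player's move."""
    return min(["cooperate", "defect"],
               key=lambda mine: payoffs[(mine, their_move)][0])

# Defecting is the best response to EITHER move (a "dominant strategy")...
print(best_response("cooperate"), best_response("defect"))  # defect defect
# ...yet mutual defection (5, 5) is worse for both players than
# mutual cooperation (1, 1).
```

Change the definition of “winning” from individual years to total years for the pair, and cooperation becomes the obvious strategy – exactly the kind of rule-change insight the paragraph above describes.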

Game theory has also provided a window into how cognitive distortions contaminate strategy-setting, with intuitions and emotions diverting people from the approaches that would maximize the odds of getting what they want.

**Signal vs Noise:**

As much as we would prefer otherwise, the world isn’t neat, and all the data that we gather is contaminated by some amount of randomness (measurement error, unaccounted-for factors, etc.). This irregular “noise” makes it hard to see the underlying pattern – the “signal” – and to predict the future. Sometimes separating the signal from the noise is pretty easy (eg. measuring acceleration due to gravity, predicting when the sun will rise, the odds of drawing 21 in black-jack). Sometimes it is incredibly difficult (eg. listening to a dinner companion in a crowded restaurant, finding a planet orbiting a distant star, forecasting the price of gold, almost anything to do with polling or sociology). Noise is what requires judgement when choosing the type of curve to use when displaying data graphically, what causes annoying static on a distant radio station, and what leads to a grainy photo taken in low light (in these cases the signals are: the equation that accurately predicts the future, the song, and the picture).
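One standard way to pull a signal out of noise is repetition and averaging – random errors tend to cancel out while the signal does not. A sketch with an invented noisy “instrument” (the true value 9.81 and the noise level are illustrative assumptions):

```python
import random

random.seed(0)

# The "signal" is a true value of 9.81; every measurement adds random
# noise. One reading can be far off, but the average of many readings
# sits very close to the signal, because the noise averages toward zero.
true_value = 9.81

def measure():
    """One reading from a noisy instrument (noise is invented)."""
    return true_value + random.gauss(0, 0.5)

one_reading = measure()
many_readings = [measure() for _ in range(10_000)]
averaged = sum(many_readings) / len(many_readings)

print(round(averaged, 2))  # very close to 9.81
```

This is why polls quote margins of error that shrink with sample size, and why a long-exposure photo is less grainy than a snapshot: more samples, same signal, cancelled noise.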