The Math of Life and Death
When you think of mathematics, what often springs to mind is geometry class, algebra with variables, or maybe the frustration of a long exam.
Rarely do we equate math with matters of life and death. But that’s where Kit Yates takes us in his book “The Math of Life and Death”.
He argues that the hidden mathematics behind everyday decisions, from health care to legal cases to financial planning, can truly determine how we live and how we die.
Yates demonstrates through real-world stories that what may appear as cold numbers often have human consequences.
He notes that the starkness of the title is deliberate, stating, “Yes, sometimes maths can be a matter of life and death.”
When you see a newspaper headline about a “20 % increase in risk,” or you hear the phrase “your chance of X has doubled,” you shouldn’t just shrug and carry on.
These numbers matter, and misunderstanding them can cost lives.
Seven Mathematical Principles That Shape Our Lives
In the book, Yates introduces seven core ideas drawn from probability, statistics, exponential growth and decay, and optimisation.
Rather than using heavy equations, he uses story after story: a wrongful conviction, a disease outbreak, a financial collapse.
Here are a few favourites and what they teach us.
1. Exponential Growth (and Decay)
Consider factors such as the spread of a virus, interest on investments, or population trends. Something small can grow fast.
And when you misjudge that growth or assume it’s linear when it’s exponential, you get nasty surprises.
Yates uses epidemiology, among other examples, to illustrate how ignorance of this dynamic can become dangerous.
In human behaviour terms, this means we often underestimate how quickly small decisions accumulate, or how quickly an advantage (or threat) can escalate.
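The gap between linear intuition and exponential reality is easy to see in a few lines of code. This is a minimal sketch with invented numbers: an outbreak that adds a fixed 100 cases per day versus one whose cases double every three days.

```python
def linear_cases(day, start=100, per_day=100):
    """Cases if growth is linear: a fixed number of new cases each day."""
    return start + per_day * day

def exponential_cases(day, start=100, doubling_time=3):
    """Cases if growth is exponential: doubling every `doubling_time` days."""
    return start * 2 ** (day / doubling_time)

for day in (0, 10, 20, 30):
    print(day, linear_cases(day), round(exponential_cases(day)))
```

By day 30 the linear outbreak has 3,100 cases; the exponential one has over 100,000. The two are barely distinguishable in the first week, which is exactly why the mistake is so common.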
2. Probability and Risk
We live in a world of uncertainty. Yates prompts us to ask: What do those percentages actually mean for you?
For example, there is a difference between relative risk (“Your risk increased by 20 %”) and absolute risk (“Your risk went from 1 in 10,000 to 1.2 in 10,000”). Reporting only the first, without the second, can be seriously misleading.
And when you hear “there’s a chance you’ll die from X,” what does that really mean?
Are you considering a large population or a specific scenario? Yates shows that poor framing of probability not only misinforms, but it can also harm.
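The relative-versus-absolute distinction above can be made concrete with the same hypothetical numbers: a baseline risk of 1 in 10,000 that rises by “20 %” in relative terms.

```python
baseline = 1 / 10_000          # absolute risk before: 0.01%
relative_increase = 0.20       # the "20% increase" from the headline
new_risk = baseline * (1 + relative_increase)

print(f"Absolute risk before: {baseline:.4%}")
print(f"Absolute risk after:  {new_risk:.4%}")
print(f"Extra cases per million people: {(new_risk - baseline) * 1_000_000:.0f}")
```

The headline’s “20 % increase” translates into roughly 20 extra cases per million people, a very different emotional picture than the phrase alone suggests.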
3. Optimisation and Trade-Offs
In life, we constantly make choices: how many hours to sleep, how much to invest, how to allocate resources.
Yates argues that we often assume there’s a “best” choice when in reality we’re trading off one thing against another. The math of optimisation isn’t just nice to know; ignoring those trade-offs can lead us into traps.
In behavioural terms, this means recognising that “more” is not always “better,” and sometimes doing less or diverting attention is smarter.
4. Number Systems and Framing
Numbers don’t speak for themselves. They are framed by us, interpreted by us, and sometimes manipulated by others.
Yates emphasises that the authority we give to numbers can be misplaced: just because there’s a figure doesn’t make it meaningful.
In our daily life, it means we should ask questions: Who collected this number? What assumptions were made? What alternative might there be?
5. Sensitivity to Initial Conditions — The Butterfly Effect
One of the most fascinating insights Yates explores is how minor differences at the outset can lead to significant divergences later on.
This principle, popularly known as the “butterfly effect,” originates from chaos theory.
In mathematics, systems that are highly sensitive to their initial conditions (such as weather patterns, ecosystems, or even economies) can evolve in dramatically different ways, even if they start almost identically.
Think of it like this: two people make nearly identical health or financial decisions, yet years later, one thrives and the other struggles.
Often, it’s not because of willpower or intelligence, but rather due to tiny variations in initial conditions, such as timing, context, opportunity, or luck.
Yates uses examples from biology and medicine to show how a single mutation or early diagnostic error can have enormous consequences.
The takeaway?
Life’s apparent randomness often hides deterministic systems that are just too sensitive and complex for us to predict. Recognising this can make us humbler about our control and more mindful of our small, early choices, because they compound.
6. Conditional Probability — The Hidden Trap in Everyday Thinking
Another key concept Yates stresses is conditional probability: the idea that the likelihood of something depends on whether another event has already happened.
Most people instinctively misjudge this, leading to dangerous misunderstandings.
For example, suppose a medical test for a rare disease is “99 % accurate.” Sounds reassuring, right? But if only 1 in 10,000 people actually have the disease, most positives will still be false.
This is because the test’s accuracy (sensitivity and specificity) interacts with the base rate — the underlying frequency of the disease in the population.
Yates uses this example to illustrate how even professionals, such as doctors, lawyers, and jurors, often fall into what’s called the base rate fallacy or prosecutor’s fallacy.
We forget to account for what’s already known or assumed, and treat conditional probabilities as absolutes.
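The “99 % accurate” test example can be checked with Bayes’ theorem. One assumption in this sketch: “99 % accurate” is taken to mean that both sensitivity and specificity are 0.99, since the figure in the text doesn’t split them apart.

```python
prevalence = 1 / 10_000   # base rate: P(disease)
sensitivity = 0.99        # P(positive | disease)
specificity = 0.99        # P(negative | no disease)

# Total probability of testing positive: true positives + false positives
p_positive = (sensitivity * prevalence
              + (1 - specificity) * (1 - prevalence))

# P(disease | positive) -- the number most people wildly overestimate
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"Chance a positive result means disease: {p_disease_given_positive:.1%}")
```

Under these assumptions, a positive result means you actually have the disease only about 1 % of the time. The base rate, not the test’s headline accuracy, dominates the answer.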
In human behaviour, this error appears when we overreact to single data points without context: a stock drops and we panic; a medical test flags “abnormal” and we assume disaster.
The fix?
Always ask: Given what? Under what conditions? Numbers without context can be dangerous half-truths.
7. Sampling and Representation — When Averages Lie
The last central theme Yates develops is the mathematics of sampling, or how we draw conclusions from limited data.
He reminds us that the averages we see in studies, polls, and media headlines often mask enormous variation.
Imagine two towns with the same “average income.” In one, everyone earns roughly the same. In the other, one billionaire skews the mean, while most residents struggle to make ends meet. The average hides the truth.
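The two-towns example can be sketched in miniature. The incomes here are invented purely for illustration: both towns have the same mean, but the median tells a very different story.

```python
from statistics import mean, median

town_a = [50_000] * 10                 # everyone earns roughly the same
town_b = [20_000] * 9 + [320_000]      # one high earner skews the mean

print(mean(town_a), median(town_a))    # same mean and median
print(mean(town_b), median(town_b))    # same mean, much lower median
```

Both towns report an “average income” of 50,000, yet in the second town the typical resident earns 20,000. Whenever a single summary number is quoted, it is worth asking which summary was chosen and why.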
Yates warns that sampling bias (who or what gets measured) can distort everything from political polling to medical trials.
If a drug is tested on 40-year-old men, we shouldn’t assume it works equally well for women or older adults. If a survey excludes non-respondents, it’s already biased toward those willing to speak.
Psychologically, this mirrors our own biases.
Humans create mental “samples” from limited experience: we remember the dramatic example, not the representative one. We judge risk based on what’s most available to memory (the availability heuristic) rather than what’s statistically valid.
The moral?
Question your sample. Whether it’s personal memory or public data, it rarely tells the whole story.
The Human Side: Bias, Fear and Intuition
What I found especially resonant is how Yates links the math to our human foibles.
Our intuitions misfire when the numbers stretch beyond everyday experience. A “doubling” seems dramatic, yet our brains often treat it as an incremental change.
A low probability may seem negligible, yet if the outcome is unfavourable, we should still care.
Yates calls for us to reclaim the power to question the numbers and the power that numbers have over us.
From a behavioural perspective, this aligns with how humans are biased toward simplicity, narrative and certainty even when the world offers none.
Because we dislike uncertainty, we latch on to strong statements (“Your risk is high,” “We can guarantee”) rather than the weak, honest ones (“We don’t know,” “The probability is small but non-zero”).
Recognising that math underpins these statements gives us more agency.
Why It Matters For You
So, why should the reader care?
At first glance, you might think: I’m not a mathematician. I don’t handle epidemics or nuclear reactors. But the truth is, the same math that underpins global crises also runs in the background of your life.
When a doctor gives you a test result, do you understand the risk? Yates’ chapter on medical testing helps you to avoid the “false positive trap.”
When you read a headline about “this increases your chance of cancer,” are you reading absolute or relative figures?
When you invest, save, or borrow, are you aware of growth rates and the compounding effect, whether for gains or losses?
When you make choices for your family or reflect on life and death questions, do you appreciate that what looks intuitive may be mathematically fragile?
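The compounding question above is easy to underestimate. A small sketch, with figures invented for illustration: the same 7 % annual return, taken as simple interest versus left to compound.

```python
principal = 10_000
rate = 0.07
years = 30

simple = principal * (1 + rate * years)      # interest never reinvested
compound = principal * (1 + rate) ** years   # interest reinvested yearly

print(f"Simple interest after {years} years:   {simple:,.0f}")
print(f"Compound interest after {years} years: {compound:,.0f}")
```

Simple interest yields 31,000; compounding yields roughly 76,000. The same arithmetic works against you on debt, which is the half of the story headlines rarely mention.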
Ultimately, math won’t replace your values, ethics, or intuition.
But it can sharpen them. It can help you distinguish between the story somebody tells you and what the numbers actually say. In a world flooded with data, algorithms, and “expert” claims, Yates argues that we need to become sceptical: not cynical, but curious, questioning, and empowered.
One Story that Stuck with Me
Yates recounts a legal case involving what is sometimes called the “prosecutor’s fallacy” – misuse of probability in court that can lead to wrongful convictions.
He uses that to illustrate the cost of failing to interpret numbers correctly; not an academic point, but a very real one.
In life-behaviour terms, consider how easily we fall for the intuitive “that must mean it’s rare” or “the chance is tiny, so don’t worry.”
But if that “tiny chance” applies to something irreversible like death, disability, or ruin, then whether it’s “tiny” or “large” depends entirely on you. That personal relevance is what Yates helps us reclaim.
So What Should We Do Differently?
Here are three behavioural takeaways inspired by the book:
Ask for context. When you hear a risk or a rate, ask: what’s the baseline? What’s the sample size? What assumptions were made?
Think in terms of personal impact. A risk of 1 in 100,000 might feel negligible — unless you’re that “1.” Framing makes a difference.
Be sceptical of certainty. When someone uses numbers to make bold promises (“We will reduce deaths by 50 %!”), ask: Is that relative or absolute? What trade-offs were made? What’s the underlying model?
Yates argues that a little mathematical literacy isn’t about becoming a mathematician; it’s about being better at living.
Conclusion
We humans like stories. We like narrative arcs with heroes and villains, causes and cures. The problem is that many of those stories ignore the mathematics that quietly shapes them.
In The Math of Life and Death, Kit Yates invites us behind the curtain.
He shows us that the numbers are personal. They matter. They carry weight in our health, finances, decisions, and relationships.
So, until next time, when you see a statistic, a percentage, or an exponential graph, pause. Recognise you’re not merely a passive observer; you’re part of the equation.
Dion Le Roux