In our previous episodes, we examined the velocity of planetary change and the vast mechanical infrastructure driving that acceleration. Today, we confront a more uncomfortable reality: The human mind attempting to comprehend these scales is fundamentally the same organ that evolved to track perhaps thirty members of a nomadic tribe, identify edible plants, and detect predators in tall grass.
We have created planetary-scale problems for ourselves, using cognitive hardware that evolved for village-scale solutions. And most humans never receive the software updates — the education, training, and cognitive restructuring — needed to even comprehend the crisis, let alone respond to it.
This is not a moral failing. This is not about virtue. This is about the neurological reality of what we are: temporarily educated Stone Age brains attempting to operate industrial civilization.
SECTION 1: THE SOPHISTICATION ILLUSION
Let us begin with an uncomfortable fact: You are, for all practical purposes, genetically identical to your ancestors from 100,000 years ago. Not merely similar — the differences are trivial. If we could transport a newborn from a Paleolithic tribe to modern suburbia and raise them with a modern education, they would be indistinguishable from any contemporary human. They would use smartphones, struggle with calculus, worry about their social media presence.
Conversely — and this is the critical point — if you had been born into that Paleolithic tribe, you would believe that thunder was the anger of sky spirits. You would explain illness through possession or moral failing. You would live, think, and die with a mind no different from those ancestors we contemptuously classify as primitive.
Consider the colonial-era fantasy epitomized by Edgar Rice Burroughs' Tarzan novels, which posited the exact opposite of this neurological reality. In Burroughs' imagination, a white English noble infant, raised by apes without human language or education, would somehow "naturally" develop superior intelligence, invent tools, teach himself to read from found books, and rise to rule over both animals and African peoples. This wasn't just fiction — it was a widely-believed racial mythology that shaped colonial policy and educational theory for decades.
The Tarzan mythos represents perhaps the most pernicious form of the sophistication illusion: The belief that cognitive advancement is genetic rather than cultural, that "civilized" people carry superior mental hardware rather than merely updated software. In reality, that fictional Lord Greystoke, raised by apes, would never have developed language beyond grunts and gestures. He would have eaten with his hands, shown no conception of abstract thought, and possessed no more capacity for leadership than his ape siblings.
The actual documented cases of feral children demolish the Tarzan fantasy completely — they show us that without cultural transmission, without education, without the software updates of human society, we are simply primates with unused potential. The colonial project justified itself (cynically, with the usual human hypocrisy and willful blindness) through this biological determinism, claiming that European dominance proved evolutionary superiority. In fact, it demonstrated nothing more than the temporary advantages of accumulated cultural knowledge and military technology — advantages any population can acquire through education, as history has repeatedly proven.
Consider Genie, discovered in Los Angeles in 1970 at age thirteen. She had been isolated in a single room since infancy, strapped to a potty chair, forbidden to make noise. When found, she could not speak. She could not walk normally. She moved like any abused animal would, sniffing objects, unable to focus her eyes beyond a few feet. Despite years of intensive intervention, she never developed normal language or cognitive function.
Or examine Oxana Malaya, found in Ukraine in 1991, having lived with dogs from age three to eight. When discovered, she ran on all fours, barked, and showed her tongue when hot. She had lost human language entirely, communicating through barks and gestures. Even after decades of rehabilitation, her mental capacity remains that of a young child.
These are not anomalies. They are demonstrations of what humans actually are without cultural transmission: Primates with potential. Every capacity you consider fundamentally human — language, abstract reasoning, mathematical thinking, moral reasoning — must be installed through education. None of it is inherent. For hundreds of thousands of years we lived as apes. A couple centuries of scientific thinking later, we pat ourselves endlessly on the back and talk about us “versus” the animals…
This brings us to the Flynn Effect, one of the most revealing phenomena in cognitive science. Throughout the 20th century, IQ scores rose consistently — approximately 3 points per decade in developed nations. This wasn't evolution; human genetics don't change that quickly. It was the result of improved nutrition, universal education, and increasingly abstract thinking demands in daily life.
A predictable development: The Flynn Effect has reversed. Since the 1990s, IQ scores have been declining in Norway, Denmark, Australia, Britain, the Netherlands, Sweden, and Finland. We are not getting "smarter." We were temporarily trained to think in ways that scored higher on specific tests, we took it for granted, and now we're losing even that training.
The implications are staggering. Every generation believes it represents the pinnacle of human development, but we are merely the same primates with temporarily updated software. And that software is not being maintained.
Consider how this manifests in our relationship with science and technology. During World War II, Pacific islanders watched American military forces build airstrips, control towers, and radio equipment. Planes would land, delivering vast quantities of manufactured goods. After the war ended and the bases were abandoned, the islanders built their own bamboo control towers, coconut headphones, and straw airplanes. They performed the same rituals they had observed — marching in formation, standing at bamboo "radios" — expecting cargo to arrive.
We laugh at these cargo cults, but we perform identical behaviors with technology we don't understand. We press buttons on machines whose operations are complete mysteries to us. We trust "algorithms" we cannot explain. We invoke "quantum" as a magical explanation for phenomena we don't comprehend. We are cargo cultists with smartphones.
The Wikipedia Delusion compounds this problem. Studies show that when people have access to search engines, they rate their own intelligence higher than when they don't — even when they don't actually search for anything. The mere possibility of accessing information makes us feel smarter. We conflate the internet's knowledge with our own understanding.
This false confidence scales catastrophically through the Dunning-Kruger effect, the cognitive bias where people with limited knowledge in a domain vastly overestimate their competence precisely because they lack the expertise to recognize their own ignorance. Dunning and Kruger's original studies found that people scoring in the 12th percentile estimated themselves to be in the 62nd percentile. This isn't arrogance — it's neurological. The same knowledge structures needed to perform well in a domain are required to evaluate performance in that domain. Without expertise, you literally cannot see your own incompetence.
The implications for climate comprehension are devastating: The people most confident in their climate opinions are often those with the least actual understanding of atmospheric physics, statistical analysis, or systems dynamics. A person who has never studied feedback loops literally cannot recognize that they don't understand feedback loops. Someone who has never grasped exponential mathematics doesn't know what they're missing when they evaluate CO2 accumulation. The very knowledge required to understand climate change is also required to understand that you don't understand climate change.
This creates a democratic crisis where those with the strongest opinions — who vote, protest, get elected in popularity contests, and subsequently shape policy — are often the least qualified to hold those opinions, while actual climate scientists hedge their statements with uncertainty that gets interpreted as weakness rather than expertise. We've created an information ecosystem where confident ignorance spreads faster than nuanced understanding, not through malice but through the fundamental architecture of human metacognition.
Adrian Ward's research at the University of Texas demonstrated another aspect of this: Participants who could search for information rated their own memories as sharper, even for information they had never searched. We literally cannot distinguish between what we know and what we could theoretically find out.
This is the sophistication illusion: We believe we are advanced beings who understand our world, when we are actually Stone Age brains using tools we don't comprehend, following instructions we can't explain, believing we understand processes that remain complete mysteries to us.
Here’s a sort of hopeful closing note: the implication cuts both ways. If every one of these capacities has to be installed through education rather than inherited, then a society that did education right could, in principle, systematically install the capacities needed to overcome these limitations.
SECTION 2: THE SCALE BLINDNESS
Now let us examine what happens when these temporarily educated Stone Age brains encounter large numbers.
Professional statisticians — people who dedicate their lives and careers to understanding numerical relationships — learn early that human intuition about large numbers is not merely weak; it is fundamentally wrong. They must learn specific techniques, both mathematical and cognitive, to overcome these limitations. And even then, they frequently fail.
Let's begin with a simple demonstration. Imagine a chessboard. Place one grain of rice on the first square. Place two grains on the second square. Four on the third. Eight on the fourth. Continue doubling for all 64 squares.
How much rice do you need?
Most people guess thousands of grains. Maybe millions. The actual answer is 18,446,744,073,709,551,615 grains (eighteen quintillion, four hundred forty-six quadrillion, seven hundred forty-four trillion, seventy-three billion, seven hundred nine million, five hundred fifty-one thousand, six hundred and fifteen). Even as I write it, I literally cannot comprehend it. The end result would require more rice than has been produced in all of human history, a pile weighing approximately 461 billion tons, hundreds of times the entire world's annual rice harvest.
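If you would rather check the arithmetic than take my word for it, here is a minimal Python sketch, assuming a rough figure of 25 milligrams per grain of rice (the grain mass is my assumption, not a precise constant):

```python
# Rice-on-a-chessboard arithmetic: a quick sanity check of the numbers above.
GRAIN_MASS_KG = 25e-6  # assumption: roughly 25 milligrams per grain

total_grains = sum(2**square for square in range(64))   # 1 + 2 + 4 + ... + 2^63
print(f"Total grains: {total_grains:,}")                 # 18,446,744,073,709,551,615

total_tonnes = total_grains * GRAIN_MASS_KG / 1000
print(f"Approximate mass: {total_tonnes:,.0f} tonnes")   # ~461 billion tonnes
```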
This is just a result of neural architecture and normal childhood development. Your brain cannot process exponential growth because nothing in the ancestral environment grew exponentially. Population growth, resource accumulation, territorial expansion — all grew linearly or cyclically in the environments that our cognition evolved through.
But climate change is exponential. CO2 accumulation accelerates. Feedback loops amplify. Ice melt increases absorption, which increases heat, which increases melt. Our brains are incapable of intuiting these relationships.
Consider the practical implications through this comparison:
One million seconds equals 11.5 days — easy-peasy, lemon-squeezy. Makes sense.
One billion seconds equals 31.7 years — holy cows and horses. What a leap.
One trillion seconds equals 31,710 years — instant incomprehension.
Most people use million, billion, and trillion almost interchangeably, as if they were just "big numbers." But the difference between a million and a billion is approximately a billion. When we hear that humans emit 40 billion tons of CO2 annually, our brains process this the same way they would process 40 million or 40 trillion. The numbers are all filed under "big," with no meaningful distinction.
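For the skeptics, the arithmetic behind that comparison is easy to verify. A quick sketch, using 365-day years for simplicity:

```python
# Converting round numbers of seconds into human-scale units of time.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365  # simplification: 365-day years

print(1e6 / SECONDS_PER_DAY)    # ~11.57 days
print(1e9 / SECONDS_PER_YEAR)   # ~31.7 years
print(1e12 / SECONDS_PER_YEAR)  # ~31,710 years
```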
The Monty Hall problem provides another window into our cognitive limitations. Anyone who’s studied psychology will recognize this one as instantly as the name Phineas Gage. So, you're on a game show. Three doors: behind one is a car, behind the other two are goats (part of the premise is that you want the car, not the goats, so just temporarily re-align your priorities). You choose door #1, but it isn’t opened right away. The host, who knows what's behind each door, instead opens door #3, revealing a goat. He asks if you want to switch your choice to door #2.
Should you switch?
The answer is yes — switching doubles your odds from 1/3 to 2/3. But this problem broke the minds of professional mathematicians when it was first publicized. Paul Erdős, one of the most prolific mathematicians in history, refused to accept the correct answer until shown a computer simulation.
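If you don't believe it either, you are in good company, and the remedy is the same one that convinced Erdős: simulate it. A minimal sketch:

```python
import random

# Monte Carlo simulation of the Monty Hall problem.
def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
print("stay:  ", sum(play(False) for _ in range(trials)) / trials)  # ~0.333
print("switch:", sum(play(True) for _ in range(trials)) / trials)   # ~0.667
```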
If our brains fail at three doors and two goats, how can they possibly handle planetary climate systems, so large that accurately modeling them remains a dream, with thousands of variables and feedback loops?
The COVID-19 pandemic provided a real-time demonstration of exponential blindness. Despite daily data, clear mathematics, and constant expert warnings, most people — including national leaders — could not grasp exponential spread until hospitals overflowed. We need visceral, immediate consequences to understand scale. Numbers alone don't work.
In January 2020, epidemiologists were screaming about exponential growth. They showed the math: One case becomes two, becomes four, becomes eight. Within weeks, you have millions. But human brains heard "it's just a few cases" and couldn't process the mathematical inevitability. Even when shown the graphs, people said "it's just like the flu" until bodies accumulated in refrigerated trucks. People outside the hardest-hit regions heard “one- to two-per-cent mortality rate” and thought, so what?
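For illustration only, here is what that inevitability looks like, assuming a doubling time of roughly three days, a commonly cited figure for early, unmitigated spread. The exact value is an assumption; it changes the timing, not the shape of the curve:

```python
# Doubling arithmetic behind "one case becomes two, becomes four..."
# Assumes a ~3-day doubling time, i.e. about 2.33 doublings per week.
cases = 1.0
for week in range(1, 11):
    cases *= 2 ** (7 / 3)
    print(f"Week {week:2d}: ~{int(cases):,} cases")
# By week 10 a single infection has grown to more than ten million cases.
```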
This reveals a crucial limitation: We don't understand through numbers. We understand through stories, images, and immediate sensory experience. A single photograph of a dead child on a beach moves us more than statistics about thousands of drowning refugees. This isn't callousness; it is not a choice we are making — it's neural architecture.
Robin Dunbar's research revealed another critical constraint. The human brain can maintain approximately 150 stable social relationships. This "Dunbar's number" represents the cognitive limit for tracking social dynamics, reciprocal obligations, and interpersonal hierarchies. It's the maximum size of a cohesive social group without formal organizational structures.
In other words, despite global access through the internet, each of us is influenced by a relatively tiny circle of information. And many of the relationships in that circle are with social constructs, not actual people. We can’t think of a media source as made up of thousands of individuals, many of whom disagree with each other and work at cross-purposes, or have wildly different motivations. No, we just hear The Washington Post or The Guardian, and treat it as one individual. “The Guardian said,” rather than, “This particular writer at The Guardian said this, and here’s their total life and career context.”
But climate change requires understanding the collective impact of 8 billion humans. That's 53 million times larger than our cognitive capacity. It's not just that we can't comprehend 8 billion — we can't even comprehend the multiplier needed to reach 8 billion from what we can comprehend.
Our brains evolved to understand our tribe, our valley, our hunting grounds. Maybe the neighboring tribe. Perhaps seasonal patterns over a few years. But century-scale changes across continents involving billions of actors and trillions of interactions? We have no neural equipment for this. It would be like asking a calculator to run a modern video game.
SECTION 3: THE BROKEN DEFAULT SETTINGS
This brings us to perhaps the most damning evidence of our cognitive inadequacy: The entire field of cognitive behavioral therapy, or CBT, exists because human default thinking is so fundamentally untethered from reality that it reliably creates mental illness. And we’re still stuck at the point where we think of CBT as a necessary retroactive treatment for when something goes wrong, instead of what it is: A fundamental, basic need, without which confusion and distress are predictable outcomes.
CBT, along with its variants like Acceptance and Commitment Therapy (ACT) and Dialectical Behavior Therapy (DBT), represents the most empirically validated form of psychotherapy ever developed. Its core insight is simple and devastating: The thoughts that feel most true are often the most wrong.
The foundation of CBT is teaching people to recognize "cognitive distortions" — errors in thinking that feel completely logical but lead to false conclusions. These aren't occasional mistakes. They are the default operating system of the human brain.
Consider the fundamental attribution error. When someone cuts you off in traffic, your immediate thought is "what an asshole." You attribute their behavior to their character. When you cut someone off, you think "I had to get over for my exit." You attribute your behavior to circumstances, and theirs to intention. I cannot overstate that I am not making accusations of personal/moral/character failure when I give these examples — this is how human brains process social information.
Now scale this up to climate change. When we see industrial emissions, we think "greedy corporations." We attribute systemic problems to moral failings of individuals or organizations. We cannot naturally think in terms of emergent systems, market structures, or thermodynamic inevitabilities. Every problem must have a villain, and again there’s that tendency to individualize organizations made up of sometimes millions of people. A group of maybe 10-15 people made the decisions that led to disasters like the Exxon Valdez spill. But we almost never hold those individuals accountable or even learn their names.
Confirmation bias compounds our distortions. In our current information environment, you can find "evidence" for literally any position about climate change. It gives us an easy path to believing what we want, instead of what there’s evidence for. It feels good, just like religion. Want to believe it's a hoax? Thousands of websites agree. Want to believe we're already doomed? Plenty of data for that too. Want to believe technology will save us? Right this way, sir/ma’am/entity, here's your echo chamber.
Our broken cognition combined with infinite information creates complete paralysis. We don't evaluate evidence — we collect ammunition for our existing beliefs. And the entire time, we feel like we're being rational.
The concept of emotional reasoning — another cognitive distortion identified in CBT — scales catastrophically to climate issues. "It's cold today, therefore global warming isn't real" is emotional reasoning at planetary scale. The feeling (cold) becomes evidence for a conclusion about complex atmospheric systems. This isn't stupidity (well… this one is maybe a little stupid) — it's default human cognition. Stone Age cognition.
Mind reading, another classic distortion, manifests as "They don't really believe in climate change, they just want to control us." We assume we know others' motivations, projecting our own fears and biases onto their actions. This makes good-faith discussion impossible. It’s like having a conversation with someone in the throes of a schizophrenic episode. It doesn’t matter what you say — that just is not what they are hearing.
Catastrophizing seems inappropriate to mention when discussing actual catastrophe, but it reveals something crucial: Human brains cannot calibrate threat assessment at scale. We catastrophize about minor personal issues while minimizing planetary ones. The same person who has a panic attack about a five-minute work presentation remains calm during a discussion on civilizational collapse, death camps, World War, or plagues.
Mental filtering means we focus selectively on information that confirms our emotional state. Depressed individuals notice only negative information. Climate optimists see only technological solutions. Climate pessimists see only tipping points. We cannot hold the full picture because our brains actively filter out contradictory data.
Here's the truly damning part: Learning about these cognitive distortions doesn't automatically fix them. CBT requires months and years of deliberate, focused practice to partially overcome default thinking patterns. It involves homework, exercises, constant vigilance against your own thoughts. And most people never do this work.
Even more troubling, to say the least: Success in our current system often requires maintaining these cognitive distortions. The CEO who acknowledges systemic problems rather than claiming personal responsibility for success doesn't remain CEO very long. The politician who admits uncertainty loses to the one who projects false confidence. Our power structures select for cognitive distortion.
SECTION 4: THE DEMOCRACY PARADOX
The democratic implications are a teensy bit doomy-gloomy. Democracy requires citizens to evaluate complex issues and make informed decisions. But if accurate thinking about global issues requires specialized training that most people never receive, how can democratic processes address planetary problems?
This doesn't mean democracy is doomed — it means we need to understand what democracy actually does. Democracy is not a perfect system; it's a rare example of a practical one. Its genius lies not in aggregating wisdom but in distributing stupidity. By forcing decision-making through multiple checkpoints, competing interests, and mandatory delays, democratic systems constrain the capacity of any individual's cognitive biases to destroy everything at once.
As "The Dictator's Handbook" (Bruce Bueno de Mesquita and Alastair Smith) demonstrates in excellent argumentation, democracy works not because voters are wise but because leaders must satisfy broad coalitions to maintain power. Autocrats can survive by satisfying a few key supporters; democratic leaders must at least pretend to serve millions. This structural difference — not moral superiority — explains why democracies tend toward better outcomes despite being populated by the same cognitively limited humans.
But here's what our education never taught us: Democracy is exponentially harder than authoritarianism. Surrendering decision-making to a supreme leader, whether political or religious, is our cognitive default — it's what those Stone Age brains want to do. Every human culture independently invented god-kings, prophets, and divine rulers because outsourcing complex decisions to authority figures feels natural. Catholics hand moral reasoning to the Pope, Muslims to their prophet's teachings, and even secular societies constantly attempt to create new versions of the same surrender — the CEO who'll fix everything, the movement leader who has all the answers, the algorithm that knows best.
Democracy requires us to fight this neurological gravity every single day. It demands participation from people who don't want to participate, education for those who resist learning, and constant vigilance against our own desire to just let someone else handle it. The miracle isn't that democracy struggles with planetary-scale problems — the miracle is that it functions at all given the cognitive architecture it must work with.
Clearly, as we are now seeing, it does not go far enough.
SECTION 5: THE PERMANENCE OF PRIMITIVE THOUGHT
The intentionality bias reveals how deeply these limitations run. Humans evolved to see intention everywhere — it's better to mistake a stick for a snake than a snake for a stick. This hyperactive “agency detection” saved our ancestors but dooms our present.
We cannot help but see climate change as intentional. Either it's a deliberate hoax (someone's trying to control us) or it's evil corporations destroying the planet (someone's choosing profits over survival). We cannot naturally conceptualize emergent systems without intention. Everything must be someone's fault.
This is embedded in our very language, demonstrating both how natural it is and how difficult it might be to choose something different. It makes sense to us that “the rain ruined our run,” or “the universe just hates me today,” or “well, my computer has decided not to work.”
This makes solutions impossible. If climate change is seen as moral failure, the solution must be moral reformation. If it's seen as conspiracy, the solution must be exposing the plot. We cannot see it as what it is: An emergent property of billions of actors following local incentives within a system that lacks global coordination mechanisms.
The sunk cost fallacy operates at civilizational scale. We've invested so much in fossil fuel infrastructure that abandoning it feels like admitting failure. This isn't rational — past investments shouldn't determine future decisions — but it's how human brains work. We throw good money after bad, good years after wasted ones, good futures after doomed presents.
SECTION 6: THE TOOLS WE DON'T USE
Statistical thinking offers tools to overcome these limitations, but they require conscious, deliberate application that runs counter to intuition. I can also personally testify that the mental investment required can feel impossible.
Simpson's Paradox shows how the same data can demonstrate opposite conclusions depending on how it's grouped. A treatment can appear harmful overall but beneficial in every subgroup, or vice versa. Climate data is full of such paradoxes. Local cooling during global warming. Increased snow from higher temperatures.
Our intuitions break completely.
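A toy example makes the paradox concrete. The numbers below are invented purely for illustration; the point is the structure, not the specifics:

```python
# Simpson's Paradox: treatment A beats B in every subgroup,
# yet loses when the subgroups are combined. Invented numbers.
data = {
    "A": {"mild": (81, 87),   "severe": (192, 263)},   # (successes, patients)
    "B": {"mild": (234, 270), "severe": (55, 80)},
}

for treatment, groups in data.items():
    for group, (wins, n) in groups.items():
        print(f"{treatment} / {group}: {wins / n:.0%}")
    total_wins = sum(w for w, _ in groups.values())
    total_n = sum(n for _, n in groups.values())
    print(f"{treatment} overall: {total_wins / total_n:.0%}")
# A wins both subgroups (93% vs 87%, 73% vs 69%) but loses overall (78% vs 83%).
```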
Survivorship bias means we only see successful examples, not failures. We study civilizations that survived, not the hundreds that collapsed. This makes us overconfident about our own survival. The Mayans, Romans, and Easter Islanders also thought they were special. They also had complete mythologies whose premise was that a supernatural intentionality created everything (EVERYTHING) just for them.
Base rate neglect explains why we believe technology will save us. We ignore the base rate of civilizational collapse (which is high) and focus on our feeling of technological capability (also high, but irrelevant to the question). The fact that we can imagine solutions doesn't mean we'll implement them.
The prosecutor's fallacy — confusing conditional probabilities — leads to arguments like "Climate has changed naturally before, therefore current change is natural." This reverses the logical relationship. It's like saying "Innocent people have fingerprints, therefore finding fingerprints proves innocence."
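To make the confusion concrete, here is a toy Bayes calculation. Every probability in it is a placeholder I invented for illustration; the point is only that the probability of the evidence given a hypothesis is not the probability of the hypothesis given the evidence:

```python
# Toy Bayes calculation: P(evidence | hypothesis) != P(hypothesis | evidence).
# All numbers below are invented for illustration, not climate statistics.
p_natural = 0.5                 # prior: natural and human causes treated as equally likely
p_warming_if_natural = 0.01     # rapid warming is rare under natural variability alone
p_warming_if_human = 0.95       # rapid warming is expected given current emissions

p_warming = (p_warming_if_natural * p_natural
             + p_warming_if_human * (1 - p_natural))
p_natural_given_warming = p_warming_if_natural * p_natural / p_warming
print(f"{p_natural_given_warming:.1%}")  # ~1%: "it happened naturally before"
                                         # does not make "it's natural now" likely
```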
These tools exist. Statisticians use them daily. But they require constant vigilance against natural, intuitive thinking. I hope I’ve convinced you of how difficult that is. Even professionals make these errors when they're tired, emotional, or working outside their specialty. And we're asking eight billion people to apply them to the most emotional topic imaginable: Potential civilizational collapse.
CONCLUSION: THE COGNITIVE CARBON LOCK-IN
We have documented the cognitive architecture that makes climate response neurologically impossible at scale.
Stone Age brains that require extensive education to function in modernity, but most never receive that education. Numerical processing systems that cannot distinguish between millions and billions, making planetary-scale problems literally unthinkable. Default thought patterns so broken they constitute mental illness, requiring intensive therapy to partially overcome.
These are not bugs to be fixed. These are the fundamental features of human cognition. We evolved to solve immediate, tribal-scale problems through intuition and emotion. We now face planetary-scale, multi-decade problems requiring statistical thinking and systemic analysis.
The infrastructure we examined in Episode 2 — the Bagger 293, the pipelines, the 100 million barrels per day — was built by minds that cannot truly comprehend what they've built. We are cognitive primitives who have accidentally constructed a planetary machine we lack the neural architecture to understand or control.
Until we acknowledge the depth of our cognitive inadequacy — the lag in our evolved capacities, not as individuals but as a species — we cannot begin to design systems that explicitly compensate for these limitations.
The fantasy of eight billion humans spontaneously developing statistical thinking and overcoming cognitive biases through education is exactly that — a fantasy that itself represents multiple cognitive distortions.
Next time on The Acceleration, we'll examine what kinds of responses remain possible when we accept that human cognition cannot be fixed, only worked around. The solutions, as we'll see, require abandoning our most cherished and idealistic assumptions about democracy, individual agency, and human nature itself.