You're standing at a lever.

On one track, five people are tied down. On the other track, one person. A trolley is coming. If you do nothing, five people die. If you pull the lever, you divert the trolley and one person dies instead.

What do you do?

This is the trolley problem—philosophy's most famous thought experiment, introduced by Philippa Foot in 1967 and refined by Judith Jarvis Thomson in 1985. It's been debated in ethics classrooms for decades. Most people, when surveyed, say they'd pull the lever. Save five, sacrifice one. The math is simple. The morality seems clear.

Now change the numbers.

On one track: five billion people. On the other: two billion.

The lever is still in your hand. You still have to choose. But the math doesn't feel simple anymore, does it?

This is the moral engine of The Genesis Protocol. And the reason it haunts me is that the people pulling the lever believe—genuinely, with evidence they consider rigorous—that they're making the right choice.

The Logic of Triage

The trolley problem assumes a binary: act or don't act, and people die either way. Real moral choices are rarely this clean. But the underlying logic—sacrifice the few to save the many—has a name in practice.

It's called triage.

Every emergency room doctor understands triage. When resources are limited and patients are many, you categorize: who can be saved, who can wait, who is beyond saving. You direct attention where it will do the most good. This means, inevitably, that some patients receive less care than they need. Some die who might have been saved if resources were unlimited.

Triage is brutal. It's also necessary. And no one seriously argues that emergency room doctors are monsters for practicing it.

The question is what happens when you scale triage from an emergency room to a civilization.

Churchill's Dilemma

On November 14, 1940, British intelligence intercepted and decoded German communications indicating a massive bombing raid planned for that night. The target: Coventry.

Churchill faced a choice. He could evacuate Coventry and save thousands of lives—but doing so would reveal to the Germans that Britain had broken the Enigma code. The intelligence advantage that Enigma provided was, by most military assessments, worth more than any single city. It would shorten the war. Save millions.

Or he could let Coventry burn.

The historical record on what Churchill actually knew and decided is debated. But the structure of the dilemma is real and recurs throughout history. Leaders facing catastrophic choices regularly calculate lives against lives, present suffering against future suffering, the visible dead against the statistical dead.

Truman and the atomic bomb. Lincoln and the continuation of the Civil War. Every wartime leader who sent soldiers into battles they knew would produce casualties in service of strategic objectives.

The pattern: someone with authority decides that a smaller catastrophe now prevents a larger catastrophe later. They act. People die. History judges them—sometimes as heroes, sometimes as monsters, often as both simultaneously.

The Seductive Logic

Here's what makes the civilizational trolley problem dangerous: the math works.

Not morally. Mathematically.

If you genuinely believe that human civilization is approaching a collapse event—and there are serious, credentialed scientists who argue this—then the utilitarian calculus becomes seductive. If unchecked growth leads to resource depletion leads to systemic collapse leads to billions of deaths, then a managed reduction in population that preserves genetic diversity and civilizational infrastructure could, on paper, result in fewer total deaths.

The numbers add up. That's the horror of it.
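The calculus the planners find seductive is nothing more exotic than expected-value arithmetic. A minimal sketch of its shape, with every number invented purely for illustration (nothing here models any real projection):

```python
# Toy expected-value comparison behind the "seductive logic."
# All figures are invented for illustration only.

def expected_deaths(p_collapse: float, deaths_if_collapse: float) -> float:
    """Expected deaths (in billions) if no one pulls the lever."""
    return p_collapse * deaths_if_collapse

# A hypothetical planner's inputs:
uncontrolled = expected_deaths(p_collapse=0.8, deaths_if_collapse=5.0)  # 4.0
controlled = 2.0  # the intervention's death toll, treated as certain

print(uncontrolled > controlled)  # True: the spreadsheet says intervene
```

On these invented inputs the intervention "wins." The point is that the structure of the comparison, not any particular number, is what seduces.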

In The Genesis Protocol, the antagonists aren't cackling villains who enjoy human suffering. They're scientists and administrators who've done the math. They've modeled collapse scenarios. They've run the projections. They've looked at population growth curves and resource consumption rates and climate models and pandemic risks and concluded that uncontrolled collapse will kill more people than controlled intervention.

Their plan—THRESHOLD—is monstrous. But their spreadsheets are meticulous.

This is what makes them terrifying. A villain who's wrong is just an obstacle. A villain who might be right about the math is something worse.

Where the Logic Breaks

The utilitarian calculus has a fatal flaw, and it's not where most people think it is.

The standard objection to the trolley problem is emotional: it feels wrong to pull the lever, even if the math says you should. This objection, while psychologically real, isn't philosophically satisfying. Feelings aren't arguments.

The real flaw is epistemic. It's about knowledge.

The trolley problem works because it provides perfect information. You know five people are on one track. You know one person is on the other. You know the trolley is coming. You know the lever will work.

Real-world decisions never have perfect information.

Churchill didn't know for certain that protecting the Enigma secret would shorten the war. Truman didn't know for certain that the bombs would end the Pacific war without a land invasion. Every leader who makes a sacrificial calculation is operating on models, projections, estimates—educated guesses dressed up in the language of certainty.

And here's where it gets dangerous: the more complex the system, the worse the models perform.

Climate models can predict global temperature trends but not regional weather patterns. Economic models can identify systemic risks but not predict which specific shock will trigger a crisis. Population models can project growth curves but not account for technological breakthroughs, cultural shifts, or the thousand variables that make human civilization irreducibly complex.

The THRESHOLD planners in The Genesis Protocol have excellent models. Detailed projections. Rigorous methodology. They're also wrong—not because their math is bad, but because their math can't capture the full complexity of the system they're trying to manage.

No math can. That's the point.
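The epistemic flaw can be put in the same toy arithmetic. If the planner's collapse probability is not a known quantity but could plausibly lie anywhere in a wide band, the "obvious" answer flips in a large fraction of plausible worlds. A sketch under invented numbers:

```python
# Sketch: the same expected-value comparison under epistemic uncertainty.
# All figures are invented for illustration only.
import random

random.seed(0)

DEATHS_IF_COLLAPSE = 5.0   # billions, if collapse comes and nothing was done
CONTROLLED_DEATHS = 2.0    # billions, the intervention's toll, assumed certain

def intervention_looks_better(p_collapse: float) -> bool:
    # The planner intervenes when expected deaths exceed the certain toll.
    return p_collapse * DEATHS_IF_COLLAPSE > CONTROLLED_DEATHS

# The model says p_collapse = 0.8, but suppose the true value could lie
# anywhere in a band the model cannot actually narrow:
samples = [random.uniform(0.1, 0.9) for _ in range(100_000)]
flip_rate = sum(not intervention_looks_better(p) for p in samples) / len(samples)

# In roughly a third of these plausible worlds (any p_collapse < 0.4),
# pulling the lever kills more people than it saves.
print(f"{flip_rate:.2f}")
```

The decision threshold here is p_collapse > 0.4; a model that cannot pin the probability down more tightly than the assumed band cannot justify the lever, no matter how meticulous its spreadsheets.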

The Deeper Problem

There's a second flaw in the civilizational trolley problem, and it's more fundamental than the epistemic one.

The trolley problem asks: should you sacrifice the few to save the many? But it assumes a framework in which someone has the right to make that choice. Someone stands at the lever. Someone decides.

Who gave them the lever?

This is the question The Genesis Protocol keeps circling back to. The THRESHOLD planners have decided that they possess sufficient understanding of human civilization to determine who lives and who dies. They've appointed themselves the operators of a lever that affects every human being on Earth.

Their authority for this decision comes down to a single fact: they're the ones who built the lever.

This is the logic of every authoritarian project in history. We know better. We see further. We understand what you can't. Trust us with the lever.

The book's answer—and mine—is that the lever itself is the problem. Not who pulls it. Not which direction. The premise that any individual or institution should have the power to make civilizational triage decisions is the flaw. Not because the math might be wrong (though it will be). Not because the people at the lever might be corrupt (though they might be). But because the existence of the lever transforms every human being on Earth from a person into a variable.

And people aren't variables.

Morrison's Choice

Without spoiling the novel: one character in The Genesis Protocol, Morrison, embodies this dilemma perfectly.

He's a scientist. A good one. He joined the project because he believed in the research—genuinely, with the kind of conviction that comes from having spent years studying civilizational risk and concluding that the threat is real. He's not evil. He's not greedy. He's afraid.

He's afraid because he's done the math, and the math says collapse is coming. And the only tool anyone's offered him to prevent it requires pulling a lever that will kill billions of people.

His journey through the book is the trolley problem made personal. Not abstract. Not philosophical. A human being standing at an actual lever, with actual lives on both tracks, trying to decide whether the math that brought him here is sufficient justification for what the math demands.

I won't tell you what he decides. But I will tell you this: the decision doesn't come from the math. It comes from somewhere the math can't reach.

Why This Problem Won't Go Away

We are building levers.

Artificial intelligence systems that make decisions affecting millions of people. Genetic engineering technologies that could alter the human genome. Climate interventions—geoengineering proposals—that would modify Earth's atmosphere on a planetary scale. Pandemic response frameworks that determine who gets vaccines first and who waits.

Each of these technologies creates a trolley problem. Each one forces someone to stand at a lever and make choices that affect people who never consented to being on the tracks.

The question isn't whether we'll face civilizational-scale trolley problems. We already are. The question is whether we'll face them with the humility to recognize that our models are incomplete, our knowledge is partial, and the people on the tracks are not variables in an equation.

The trolley problem is a thought experiment. It's supposed to be uncomfortable.

The moment it becomes comfortable—the moment someone picks up the lever and feels confident—that's when you should be afraid.