Synthetic Eden: The Tools That Save Elephants and Reshape Us
This is a guest post from my friend, collaborator, confidant, and mentor, Alexander Titus. If you’ve been reading this newsletter, you’ve seen me reference many of his essays over at The Connected Ideas Project. This post is something special. Buckle in.
For readers who don’t know me: I’m Titus. My work sits at the intersection of biotechnology, artificial intelligence, and public policy. And, increasingly, fiction.
Stephen and I first worked together at Colossal Biosciences, where we built genetic engineering tools for large mammals. Elephants. Mammoths. Species whose survival may depend on our ability to intervene thoughtfully, precisely, and at scale.
It was deeply motivating work. It was also impossible to ignore what else those tools implied.
Which is why Paired Ends feels like exactly the right place to tell this story.
Stephen’s writing consistently explores biology not as something we merely observe, but as something we are now actively partnered with: editing, extending, and redirecting it. His work lives in the seam between intention and consequence: between what we can do, what we should do next, and the tools we need to do it.
That seam—the paired end—is where Synthetic Eden was born.
The elephant in the room at Colossal
At Colossal, the mission was clear and compelling: use cutting-edge genetic engineering to restore lost traits, increase resilience, and give endangered species a fighting chance in a rapidly changing world.
That mission matters. The science is real. And the urgency is undeniable.
But anyone close to that work knows there was always a second conversation humming beneath the first.
The proverbial elephant in the room.
The tools we were building don’t care who they’re used on.
The same platforms that allow us to edit cold tolerance into mammoth-adjacent elephants—precise genome editing, scalable delivery systems, synthetic regulatory circuits—are not meaningfully different from the ones that could be applied to humans.
Not hypothetically.
Not eventually.
Structurally.
Stephen’s Paired Ends essays often circle this exact reality: how techniques developed for one organism can be adapted, accelerated, and taught to more and more people.
While we were working together, I kept returning to a question that felt too charged (and too human!) for a white paper or policy memo:
What happens when these tools stop being optional?
Why fiction, not a position paper
In my professional life, I spend a lot of time in rooms where people argue about guardrails, governance, and risk mitigation. Those conversations matter, but they also have limits.
Rational arguments are terrible at conveying lived consequence.
Fiction lets people inhabit tradeoffs instead of debating them. It creates space to feel the weight of decisions before they’re abstracted into bullet points or regulatory language.
That’s why Synthetic Eden isn’t a manifesto. It’s a story.
The novel opens after Earth is already lost: consumed by a runaway biological catastrophe. Humanity survives by leaving, carrying only a fraction of itself to a new world. The science works. The systems scale. The future is technically viable.
And slowly, almost politely, something else happens.
Control replaces consent.
Optimization replaces agency.
And “for your own good” becomes indistinguishable from “because we can.”
None of the technologies in the book are fantasy. They’re extrapolations, logical next steps from the same real-world tools Stephen writes about regularly. The only thing fiction adds is time and consequence.
Engineering organisms vs. engineering futures
One of the most uncomfortable realizations I had while writing this book was how dramatically our moral framing changes depending on who we’re talking about.
Engineering an elephant to survive the Arctic?
Heroic.
Engineering a human embryo to survive climate collapse?
Suddenly fraught.
But the molecular logic doesn’t change. Only the story we tell ourselves does.
In Synthetic Eden, the central tension isn’t whether the technology works. It does. The tension is whether the people living inside those systems still recognize themselves as participants rather than products.
That’s the connective tissue to Paired Ends. Stephen’s work consistently asks us to notice when observation turns into intervention, and how easy that intervention becomes.
Sitting with the consequences
This isn’t a book with villains twirling mustaches. The people making the hardest decisions are intelligent, well-intentioned, and convinced they’re preventing something worse.
That’s the point.
The most dangerous technologies aren’t built by monsters. They’re built by people who believe they’re being responsible under impossible constraints.
Fiction earns its keep here, not by shouting don’t do this, but by asking:
If this works… then what?
Who decides when it’s enough?
And what parts of being human are we willing to treat as bugs rather than features?
An open invitation
For a very limited time, Sean Platt and I are offering the Kindle edition of Synthetic Eden for $0.00. Not as a marketing gimmick, but as an invitation.
If you read Paired Ends because you care about how biology, technology, and humanity co-evolve, this story was written for you.
You don’t need to agree with it.
You don’t need to like the answers.
You just need to be willing to sit with the questions a little longer than most real-world conversations allow.
Because the tools that can save elephants…
are already teaching us who we might become.
—Titus