Spiritual Bypass, but with GPUs
The "machinal bypass": when AI becomes a shortcut around the work that makes us human, and why some tasks should feel a little hard.
I just read a short PNAS Opinion piece with a phrase that I can’t stop thinking about: machinal bypass. Here’s the link to the short article:
D.M. Kaplan, R. Palitsky, & C.L. Raison, The "machinal bypass" and how we're using AI to avoid ourselves, PNAS 122 (51) e2518999122, DOI: 10.1073/pnas.2518999122.
The authors (a psychiatrist and two clinical psychologists) use it to name “the use of generative AI not just to support human innovation or connection, but to sidestep it altogether.” The framing isn’t really anti-AI. They are trying to put language around a very specific temptation: when something asks for our presence, our uncertainty, our relationship, we reach for the tool that helps us avoid feeling exposed. Their heuristic is simple:
If a task asks that you—with your unique lived conscious experience—be present, then substituting generative AI content for your own is machinal bypass.
They connect this to an older psychological idea I already felt in my bones but never had a name for: spiritual bypass, a term first coined by John Welwood in 1984. Spirituality can be an important resource for coping, insight, and growth. But when it is used to escape difficult emotions or life's complexity, that is spiritual bypass: saying "everything happens for a reason" to someone in the depths of grief instead of doing the harder, essential work of acknowledging and being present with their pain, or refusing an important but difficult conversation while asserting that all one needs to do is pray about it. An illustrative item from the Spiritual Bypass Scale reads: "It is more important for me to be spiritually awakened than to feel emotionally intact." The pattern has been called "avoidance in spiritual drag."

I grew up in a religious community, and I remember noticing that dynamic early, the way a tidy spiritual answer can sometimes function like a trapdoor or cop-out (prime example: "thoughts and prayers" after the daily tragedies we have here in the U.S.). I just didn't know there was a term and an entire body of literature behind it.
The essay argues that AI is a silicon cousin of the same move: “when we want a quick answer […] we can turn to a machine for this type of solution.”
And that’s where it hit home for me. Every time I use AI, especially for writing, I feel the bargain. Yes, I saved time. But I also took a shortcut around the discomfort and the productive struggle that is often the whole point.

In writing especially, I feel a little atrophy every time I use this shortcut. If I let a model draft the paragraph, I skip the searching, the false starts, the slow arrangement of my own thinking. Over time, I worry that my “voice” might become something I curate rather than something I practice.
With code, I am mostly fine letting AI take the boring parts. If it saves me 15 minutes of writing a unit test or wiring up yet another small helper function, great. Writing feels different. In 2026 I want to be more intentional: I can ask AI to push back on an outline, point out where I am being vague, offer a tighter transition, or show me a few alternate phrasings. But I’m not going to let AI produce my first drafts. I want the thinking to stay attached to my own sentences, even if the productive struggle makes me a little slower.
And besides… AI can't experience writer's block. AI can't hate its own writing. AI can't feel like a loser for thinking "you should be writing" while deciding that, hey, the dishwasher is clean, maybe unloading it should come first. I think these struggles are essential parts of the writing process, and part of why AI will never produce great writing on its own.