Why I use AI for thinking—not producing—and what I’m quietly protecting by moving slower than I could.
There’s a particular kind of exhaustion that comes from making things nonstop while feeling further and further away from yourself.
For a long time, I assumed that feeling was a personal failure. A discipline issue. A motivation problem. Something to solve with better systems or clearer goals.
Now I think it’s something else.
I think it’s what happens when pace stops being a choice.
AI as Continuity, Not Output
I use AI every day.
But not in the way most conversations about AI assume.
For me, it’s primarily a private partner in sense-making—a place where thoughts can accumulate without pressure to resolve. Where half-formed ideas can sit beside each other. Where connections can surface slowly, without the demand to immediately become content.
It functions less like a machine for answers and more like continuity.
What it does not do—at least not in my practice—is author finished work for public consumption. Especially not in forms like images, video, or music.
That distinction matters to me more than the tool itself.
Why Some Things Need to Disappear
There are forms of meaning that only emerge because they can’t be endlessly revised.
These forms depend on interruption, loss, and irreversibility. On the fact that something happens once and then becomes memory instead of artifact.
Infinite iteration flattens that.
Not because iteration is bad—but because impermanence is doing real work.
When every early sketch is preserved forever, when every experiment is archived, optimized, and evaluated, something fragile gets crowded out: the right to fail quietly. The right to abandon ideas without explanation. The right to grow past work that no longer represents you.
What I’m protecting isn’t nostalgia for craft, or some performance of authenticity.
I’m protecting transience.
Withholding as an Ethical Act
We live in a culture that remembers everything and rewards constant visibility.
In that environment, withholding starts to look suspicious. Like fear. Or scarcity. Or lack of confidence.
But I’ve come to see restraint differently.
Not as hiding—but as incubation.
Not as delay—but as care.
Some ideas need privacy to become honest. Some versions of ourselves need room to exist without witnesses. Some work deserves the chance to fail completely, to vanish, to be outgrown without leaving a permanent record.
Choosing not to share everything isn’t a tactic.
It’s a value.
Pace Is a Moral Choice
I’ve lived through enough technological “revolutions” to recognize the pattern: each new tool arrives promising to give us our time back.
And each time, culture and capitalism reclaim the surplus.
No one actually gets time back.
So at some point, pace stops being about productivity and starts being about ethics.
I’m intentionally building a life and business organized around sustainable pace, care, and depth—not because it’s optimal, but because it’s humane.
That choice is my quiet resistance to time extraction.
Longer Thinking, Not Faster Publishing
Ironically, this is why I’m excited about large language models.
When used privately and intentionally, they don’t accelerate output—they reclaim mental space. They allow ideas to stretch. To connect across longer arcs of time. To stay unfinished without being lost.
They support longer thinking, not faster publishing.
That distinction feels critical in a moment when everything is pushing toward immediacy.
The Question Beneath the Tools
So the real question, at least for me, isn’t whether AI is good or bad. Or how much to use it. Or how fast to create.
It’s something quieter: what am I protecting by moving slower than I could?
I don’t think there’s a single right answer.
But I do think there’s value in choosing deliberately—especially when the default is acceleration.
That’s where the work gets honest.
And that’s where I’m choosing to stay—for now.