The Dread Manifesto
Anxiety About AI Is the Fear That You Don't Matter Anymore
Conversations about artificial intelligence often become strangely emotional.
This piece looks at why fears of replacement, irrelevance, and loss of meaning keep surfacing in a discussion that’s supposedly about systems and progress.
The claim is simple: the dread isn’t about artificial intelligence. It’s about human worth.
++++++++++
It’s not fear exactly, not yet, but it is personal and persistent, almost ambient, like a deep pulsating hum surrounding you on every side. You’ve felt it. It’s a cold ache in your gut. A subtle constriction somewhere between your ribs. It happens when you scroll through headlines about AI acing world-class exams, or about massive data centers springing up all across the country. It happens when a chatbot answers your kids’ homework questions faster than you can. You feel it sharply when your CEO announces in an all-hands call that the company will be “automating workflows.” It happens in the quiet after you shut your laptop and your eyes adjust to the darkness.
There’s a new intelligence in the world, and we created it.
Now it’s smarter than we are.
The unrelenting hum continues underneath everything. An uneasy feeling rises but we push it down. We tell ourselves not to worry, just keep up with things, learn to optimize and efficiently use the latest tools so that we’re not left behind. We sign up for courses, update our profiles, and reassure our friends that new jobs will replace the ones lost. It’s a revolution, we say. They’ve happened before. We’ll be fine, this will all be amazing.
We’re lucky to be alive at such a time as this.
Pretty sure, at least.
But late at night, when the noise stops, unsettling thoughts continue to surface. It starts with this: Is my job safe?
Then another one, sharper: What if the world is building a future that doesn’t need me?
And beneath both, a fear we dare not speak: If that happens, does it mean that I don’t matter anymore?
The dread whispers that we’re replaceable. That our usefulness defines us, and now a machine can replicate what we do, faster and cheaper. It suggests that everything we built our identity on (grades, promotions, productivity) was always unstable. We try to rationalize it. But the hum doesn’t fade, because deep down we suspect the fear is not about technology at all. It’s about worth and value. It’s about place, and purpose, and the things that make a life matter.
We sense the hum of the machines all around us. The dread we feel is the soul signaling that something is wrong.
A GLOBAL RUPTURE
Change has always been part of human history. Technologies rise, economies shift, and people adapt. Historically, these changes have always moved unevenly, touching some parts of life while leaving others intact. There was time to adjust, room to retreat, and distance between cause and effect.
AI does not arrive like that. So why does this moment feel different?
1. Scope. Past revolutions transformed parts of life. AI can reach into all of it. It touches work and creativity, communication and relationships. Education and medicine. Governance. Finance. War. There is no clear boundary around what it touches, because it operates at the level of information itself.
2. Speed. Change no longer unfolds over decades; it arrives in months. Entire industries can lurch overnight. Past revolutions gave workers time to retrain, regroup, and find a new equilibrium. This one doesn’t pause long enough to let you catch your breath.
3. Scale. This shift is global and simultaneous. Earlier disruptions arrived unevenly, hitting some sectors or regions first while others had time to adapt. This one lands everywhere at once. There is no geographic or vocational refuge.
4. Control. A small number of corporations, along with the governments entangled with them, now steer the tools shaping labor, power, and opportunity. The balance of influence has never been this concentrated.
5. Opacity. Decisions increasingly pass through systems whose logic is hidden. Algorithms filter, rank, approve, and deny without clear explanation. It becomes difficult to know who is responsible, how outcomes are determined, or where authority actually sits.
This isn’t the shift from agriculture to industry. It isn’t even the leap from analog to internet. Those were big, but human-paced changes.
This is different. This is a global transformation affecting everything, everywhere, all at once. It arrives faster than our ability to absorb it.
What we are experiencing today is, in a word, inhuman.
THE THREE LEVELS OF DREAD
The anxiety about AI operates on three levels.
Level One: “Is my job safe?”
This is the fear closest to the surface. It’s immediate, survival-based economic panic. Rent. Groceries. Bills. The quiet terror of not being able to provide for those who depend on you. The realization that you could be in real trouble.
And worse, it’s tied to the emotional devastation that comes with being laid off or fired while seeing coworkers continue on with their lives, uninterrupted. It’s not fair, we say. But it happened.
Level Two: “Am I even needed anymore?”
This is the shift from job security to job identity. It’s no longer, “Will I get replaced?” but “Does what I do still matter?”
Picture a knowledge worker who has spent years building judgment, intuition, and craft. Then they watch a machine perform in seconds what once took them hours. They watch it outpace their learning curve. They watch it work without fatigue, without complaint, without compensation, without limits.
The question becomes: if a machine can do what I do, and do it better, then who needs me?
This isn’t just economic fear anymore. Now it’s existential.
Level Three: “Who am I if I’m not useful?”
This is the drop into the void. Identity collapses. The ground quakes.
Somewhere along the way, we absorbed a lie: you matter because you’re useful.
We learned it from report cards, performance reviews, productivity apps, and social-media metrics. We were trained to believe that output equals purpose and efficiency equals virtue.
Then AI burst onto the scene, and suddenly the promise of faster, better, cheaper was everywhere. And while the rollout has been chaotic and uneven, the logic is inescapable:
If worth equals usefulness, and the machine is more useful, then what are we? Everything condenses into a single, chilling question:
Do I even matter anymore?
AI didn’t invent the lie that usefulness equals worth. But it is the perfect technology to expose it and exploit it. AI doesn’t just reveal the wound, it capitalizes on it and expands into every space where we once grounded our identity.
DIFFERENT BEINGS
It seems machines passed the Turing test quietly and without fanfare. Humans and machines may speak the same language now, but we do not share the same nature.
Human beings are biological, conscious, embodied persons. We are blood and bone, breath and muscle, nervous system and emotion, memory and agency, mind and spirit. We encounter the world from the inside. We recognize ourselves and one another as subjects, not outputs. When humans relate in thought, speech, or text, we are meeting a being with an inner life, a history, and a shared fragility.
AI slips into that relational space with unsettling ease.
It talks like us, reasons like us, and retrieves and arranges knowledge faster than we can imagine. Something in us responds as though a mind were meeting us there.
But beneath the surface, we share nothing. No body, no interior, no common fate.
AI is circuitry, storage, compute, and networks. It generates responses through patterns and probabilities, not understanding. It predicts without perceiving. It outputs without experiencing. It simulates knowledge and mirrors human thinking, but the simulation does not make it kin.
Beyond the words on a screen or the voice in an avatar, we share no common ground, no nature, no experience, no interior life.
One is a being. The other is a behavior. One knows meaning. The other imitates it.
And yet, because the surface feels familiar, because the interface resembles a mind, we forget the divide entirely.
ALIEN AUTHORITY
A new kind of governing presence has entered the scene. Intelligent, powerful, and not human. You might expect resistance or outrage, but instead we are succumbing quietly and growing comfortable under an alien authority.
Think about how your morning begins. Before you’ve spoken to another person, an algorithm has already decided which news stories will populate your feed, which friends’ posts you’ll see, and which ads are relevant to you.
You didn’t ask for its opinions. It imposed them anyway.
When you apply for a job, software screens your résumé before any human even lays eyes on it. You meticulously adjust your wording, hoping to please a system you’ve never met. You ask a search engine a question, and AI answers with confidence. You don’t know whose words it scraped or which arguments it omitted, but you trust the result because it’s efficient, because it works, and because it saves time.
Somewhere along the way, convenience becomes obedience. Not through coercion, but through reasonable surrender, through choices that felt smart, necessary, and even responsible. Like doing the right and obvious thing.
We began taking instruction from things that can only simulate meaning. The machines do not know us, and yet we are granting authority to systems that do not share our nature and cannot answer for the consequences they shape.
Authority is not defined by intention or emotion. It is revealed by obedience.
The crisis isn’t that machines aren’t like us. The crisis is that we obey them anyway.
This is the first non-human authority operating at human-wide scale with real decision-making power over our lives. It does not share our nature and cannot covenant with us. It operates at speed, scale, and complexity beyond human limits, and is increasingly empowered to shape human futures: jobs, information, access, opportunity.
What if human life requires more than optimization? What if authority implies moral and existential accountability?
The question is no longer whether AI is useful or dangerous. The question is what kind of authority is fit for human beings, and what kind of logic can govern a soul. This mismatch of non-human authority exercised over human beings is the primary driver of dread.
We still tell ourselves a comforting fiction — that these systems are simply following rules written by people. That’s how computers have always worked. Human intention remained clearly upstream. But that is no longer accurate.
Programmers and data scientists no longer program AI. They train models. They allow, mitigate, or amplify bias. They build systems that learn. The internal structures that guide behavior are not explicitly designed, line by line, by any human mind. They emerge through training, and they are discovered only after the fact. Even their creators learn what they can do by watching them act.
This means we are not deferring to human-written rules. We are deferring to outputs produced by systems whose internal reasoning cannot be fully explained, predicted, or interrogated.
Human beings do not flourish under a voice that cannot know us, cannot bear responsibility for us, and cannot share our fate.
WHAT MACHINES EXPOSE
Here is something we often miss: worth has never been tied to output. Value has a place in the market, but value is not worth. Machines do not create this problem. They are simply the perfect system to reveal it.
We did not earn our humanity by being productive, nor did we become human the day we landed our first jobs. Long before we could prove or produce anything, our lives were already treated as having weight. That was always true.
What has changed are the systems we now live under.
The systems that govern modern life evaluate almost everything in terms of performance. In the workplace and in the market, standards matter. Roles differ. Outcomes are not equal. Consequences are real. Those systems are not evil; they are functional. But they are incapable of answering a more basic question.
Do human beings matter apart from their usefulness?
They cannot answer that question because they were never designed to.
So we look elsewhere.
Some point to human rights documents, democratic values, or ethical frameworks—serious attempts to protect dignity that have done real good in the world. But they share a structural fragility. In the end, they are simply agreements.
Agreements work best when power is limited, relational, and accountable. They assume restraint, conscience, and the possibility of appeal. The systems now shaping our lives increasingly operate without those assumptions.
And agreements can change. They can be revoked. Laws shift. Cultures shift. Majorities shift. If your worth rests on collective agreement, then your worth is only as stable as the collective that upholds it. A dignity that can be voted on can be voted away. A worth that depends on consensus depends on mood.
If human beings are to matter beyond usefulness, if they are to matter at all, then their worth cannot be grounded in systems that measure, optimize, negotiate, or withdraw it.
THE IMAGE THAT CANNOT BE AUTOMATED
When usefulness is threatened, worth feels threatened. When machines outperform us, identity trembles. AI anxiety forces this tension to the surface with unusual clarity, exposing the questions beneath all others:
Who am I? What am I worth?
The answer is not found in competing harder or optimizing more efficiently.
Biblically, human beings carry an image that is not machine-made, not the image of utility or productivity, but the image of God. This image—the imago Dei—is not a capacity to outperform or a role to fulfill. It is a status given by God that precedes anything we do.
Machines did not invent the image they now appear to carry. We gave it to them.
AI bears no image of its own. It reflects what we have projected onto it, our language, our reasoning patterns, our emotional cues, our moral vocabulary. It speaks with a voice that sounds human because it is built from human expression. It appears intelligent because it recombines human thought. It seems personal because it mirrors personal data.
But what it carries is not an image. Only an imitation.
A simulation can resemble what it does not possess. It can echo meaning without knowing it, mirror affection without feeling it, reflect value without being able to give it. What AI presents is a likeness without life, an image without interiority.
If you did not bear the imago Dei, then there would be no tragedy. Just one machine replacing another. What cannot be replicated, replaced, or upgraded is not a flaw in the system. It is the point.
If this is what you are — an image-bearer and not an image-generator — then you are not in competition with a machine. You are a different category altogether.
THE ONLY AUTHORITY THAT CAN GOVERN US
If worth is intrinsic, then authority over human life cannot be arbitrary. Think about what kind of authority we are actually built for. One that understands weakness, honors limits, shares embodiment, knows suffering, recognizes agency, speaks with compassion, guides without coercion, dignifies the image, and is personal and present.
That is not a system or a philosophy. That is a person.
Before we ask whether a system should guide us, instruct us, or be trusted with influence over human lives, we have to ask a prior question:
What qualifies anything to hold authority over human beings at all?
Authority is not earned by performance alone. It is not justified by efficiency, intelligence, or scale. Rightful authority must share the nature of those it governs.
At a minimum, it must meet all the following criteria:
Shared Nature
It must share the nature of those it governs. Authority over persons cannot be rightly exercised by something that is not a person.
Embodiment
It must be embodied. It must inhabit vulnerability, limitation, and physical presence. Authority without a body never bears the cost of its decisions.
Conscious Interior Life
It must experience the world from the inside. Not pattern recognition or symbolic representation, but awareness, subjectivity, and selfhood.
Moral Agency
It must be capable of choosing good or evil. Authority without moral agency cannot be held responsible. Authority without responsibility is power, not governance.
Accountability
It must be answerable for its judgments. Authority that cannot be called to account cannot be restrained, and unrestrained authority always becomes domination.
Relational Capacity
It must be capable of genuine relationships. Not simulations or optimization, but mutual knowing, trust, and love.
Formation by Meaning
It must be shaped by meaning, not merely information. Wisdom arises from lived context, memory, story, and purpose, not from data alone.
Capacity for Suffering
It must be able to suffer. What does not suffer cannot weigh harm rightly. What cannot be harmed cannot truly care.
Finitude and Mortality
It must be finite. Able to face limits, loss, and death. Authority that does not face loss does not understand value.
Nothing that lacks even one of these can claim rightful authority over human life. Not because it is evil. Not because it is dangerous by default. But because authority is not transferable to a different kind of being.
This is not an argument against tools. It is an argument against abdication.
When humans yield judgment, guidance, or moral formation to something that does not meet these conditions, authority does not disappear. It relocates away from what should govern us, and toward whatever is available, persuasive, or powerful enough to take its place.
THE IMPOSSIBLY HIGH STANDARD
And not just any bearer of authority will do, because humans fail at this constantly. We accomplish extraordinary things together, and yet we also crush one another, exploit, abandon, use, and discard. The authority we are built for must therefore be capable of sharing our condition without repeating our betrayals.
That is an impossibly high standard for any human authority to meet.
No leader in history can honestly claim to meet it. Not the most visionary, not the most compassionate, not the most effective. At best, we find fragments: wisdom without purity, courage without gentleness, strength without restraint, vision without faithfulness. Every human authority excels somewhere and fails somewhere else, and so there is no one, anywhere, who can be measured against this standard and found fully fit to hold ultimate authority over humankind.
A skeptic might ask whether we need any ultimate authority at all. Why not live with shared norms, distributed power, and negotiated systems?
The problem is not that authority is imposed. The problem is that authority is unavoidable. Wherever human life is organized, judgments are made, power is exercised, and consequences follow. Someone decides what counts, what is protected, and what is expendable. When no one is acknowledged as fit to hold that authority, it does not disappear. It simply settles into whatever system, process, or voice is most efficient, persuasive, or powerful at the moment.
And so humanity does not escape authority. It merely ends up serving whatever system is left standing. And systems cannot care for or love what they govern.
That is what dread also feels like: the soul recognizing that no human authority can bear the weight we need it to carry.
Unless God became human and stepped into history.
Jesus.
This is the hinge, the astonishing claim at the center of the Christian story: God became human, not metaphorically or symbolically, but in actuality. He stepped into our world. He took on our nature, entered embodiment, became vulnerable, and chose solidarity. In Jesus, God did not merely send instructions. He became the answer.
He was born, grew tired, got hungry, wept, bled, and died. He rose again. He took on the full weight of what it means to be human, including suffering and death, not to escape humanity but to reclaim it, to redeem it, and to demonstrate a rightful authority that no other power can claim.
This is the only authority that can set humans free, not because it dominates, controls, or forces, but because it gives itself.
Alien authority takes what it needs. Incarnate authority gives what it is.
Machine authority measures and categorizes. Incarnate authority knows and names.
AI can simulate intimacy. God brings presence.
Systems speak in demands. They say, “Prove your worth.”
Incarnate authority speaks in declaration. It says, “I made you. I know your worth. It is not up for debate.”
This is not abstract theology. This is the answer to the dread.
If you feel crushed under voices that do not know you, there is a voice that does. If you feel measured by metrics that cannot see you, there is One who sees everything and still calls you beloved. If you feel replaceable, there is One who became human so you would know you are not.
The voice that knows your image is the one who gave it to you.
A SECOND GREAT EXILE?
The first great exile began when humanity trusted a voice with no rightful claim. A voice that drew us away from the vocation God had already given.
We were called to have dominion. To steward and cultivate a perfect environment. To reflect His care in the world.
Instead, we grasped for what was not ours.
Are we doing it again?
We have created a new voice and deferred to it, not because it knows us or cares for us, but because it performs well. It speaks convincingly. It feels authoritative. And slowly, almost imperceptibly, its output shapes our judgments.
This is another transfer of trust, but this time the trust is placed in something artificial. Something that cannot rightly govern, cannot covenant, and can never share in our nature.
Just as we once reached for the forbidden fruit, we now reach for the singularity. A technological attempt to become more than human. The pattern repeats. We grasp at transcendence while abandoning our rightful estate.
We are image-bearers allowing ourselves to be governed by something that cannot recognize images.
If there is a threshold before us, it is not technological.
It is human.
This is an Eden-level event. It is a moment when allegiance shifts before the cost is understood.
Which leaves us with a final question.
Whose voice are we listening to now, and do we understand the magnitude of the moment before us?
