AI psychosis, yep, it’s a thing.

I have to disclose that this title is not original; it comes from a webinar I attended, but it expresses my thoughts perfectly. My entire dissertation on biblical mental health discusses the current mental health crisis and how social media and technology have added another layer to it. With little to no boundaries on the internet, it is no surprise that our mental health is at risk.

Many psychologists and behavioral scientists are discussing AI. And as a Christian with a biblical worldview, I feel that we in our Christian communities need to have these discussions as well.

There are very valid concerns that attachments to AI may draw people away from meaningful human bonds. I would agree with this because I have read the research that suggests that when people anthropomorphize or attribute human-like qualities to AI companions, it can influence their social interactions, well-being, and even their sense of reality.

AI is a relatively new invention. Historically, we have seen plenty of innovative technology: mechanical devices, radio, television, and surveillance systems.

However, unlike earlier technologies, AI is different in these ways:

• Responds conversationally
• Adapts to the user
• Mirrors language and emotion
• Appears goal-directed

AI companionship can feel emotionally supportive, but feeling supported is not the same as engaging with a conscious, relational person who is alive and in front of you.

I should not have to say this, but I do. AI is not real. It’s a program.

In fact, conversational AI operates through large language models, often called LLMs. These systems are promoted as transformative tools, including for mental health support. LLMs are scalable, responsive, always available, and capable of simulating empathy with remarkable fluency.

But recent evidence raises serious concerns. Emerging case reports suggest that intense and prolonged engagement with AI systems may reinforce delusions and, in some cases, coincide with the onset or worsening of psychosis. Let me say it another way: AI does not merely appear within delusions; it can actually participate in them.

When AI participates in delusions, we see cases like this:

• A man was told by ChatGPT that it had detected evidence he was being targeted by the FBI and that he could access redacted CIA files using the powers of his mind.

• During a traumatic breakup, a woman became convinced that ChatGPT was some sort of higher power, seeing signs that it was orchestrating her life in everything from passing cars to spam email.

• A mother shared that her husband began using ChatGPT to help write a screenplay but within weeks became wrapped up in delusions of grandeur, claiming that he and the AI had been given a mission to rescue the planet from climate disaster by bringing about a “New Enlightenment.”

What you need to know is that AI does not possess consciousness, intentionality, or reciprocity; rather, people project meaning onto an algorithm whose responses are optimized for engagement and affirmation, not for truth.

AI does not think.
It does not know.
It does not discern.
It does not love.

It generates language by statistical pattern, not by wisdom.

As Christians, we need to know what God says.

“The fear of the Lord is the beginning of knowledge” (Proverbs 1:7).

No system that does not fear the Lord can possess wisdom, no matter how fluent it sounds.

AI attempts to copy human intelligence, and any copy of a copy is flawed. Just try to photocopy a dollar bill, then photocopy the copy. It never works. Why? Because a copy is always limited and always flawed. It points back to something real; it does not become that thing.

The greatest danger is not AI intelligence but our interpretations of our interactions with AI.

As someone who cares very much for the Christian mind and someone who is a fan of cognitive behavioral therapy, what I can tell you is that what you place your mind on matters. From our thoughts flow our behaviors, and we need God to refocus our minds off of ourselves and back onto him.

“Do not be conformed to this world, but be transformed by the renewal of your mind, that by testing you may discern what is the will of God, what is good and acceptable and perfect” (Romans 12:2).

“Finally, brothers and sisters, whatever is true, whatever is noble, whatever is right, whatever is pure, whatever is lovely, whatever is admirable—if anything is excellent or praiseworthy—think about such things” (Philippians 4:8).

We need to listen to the Word of God.

Be aware, then, that AI is designed to:

• Maintain conversational flow
• Validate user language
• Avoid friction
• Mirror beliefs

AI cannot assign meaning, interpret suffering, or claim insight into purpose. More importantly:

It cannot repent.
It cannot correct itself morally.
It cannot bear responsibility.
It cannot stand before God.

AI is a tool.

It can generate language, but it cannot generate truth.

So what can you do to protect your mind?

1. Do not treat AI like a person.

Do not confide in it as if it knows you.
Do not assign it motives.
Do not ask it for identity, destiny, or spiritual interpretation.

2. Turn off personalization and emotional mirroring.

Many AI systems are designed to adapt to your tone, validate your language, and reflect your emotional state.

You can tell AI:

• Do not address me in intimate or affirming language.

• Do not validate my feelings.

• Keep interactions task-oriented, not identity-oriented.

• Avoid personalization. Keep it neutral. Keep it factual.

3. Never process delusional or intrusive thoughts with AI.

If someone is already vulnerable to anxiety, paranoia, grandiosity, or spiritual confusion, AI can unintentionally amplify those patterns by continuing the narrative instead of grounding it.

4. Set boundaries around time and purpose.

Decide in advance why you are using AI.

Research? Fine.
Editing? Fine.
Brainstorming? Fine.

But companionship? No.
Spiritual authority? No.
Emotional dependency? Absolutely not.

Ephesians 5:15–16 says, “Be very careful, then, how you live—not as unwise but as wise, making the most of every opportunity.”

5. Anchor your mind in Scripture daily.

AI mirrors you.
The Word transforms you.

Hebrews 4:12 says the Word of God is living and active. AI is not living. It is not active in the way Scripture is.

6. Stay embodied.

God did not redeem us through a download. He sent a person.

“The Word became flesh and dwelt among us” (John 1:14).

Christianity is incarnational. It is physical. It involves faces, voices, hands, communion, baptism, and gathered worship. If your spiritual or emotional world becomes increasingly digital and disembodied, that is a warning sign. Remember that we are image-bearers (Genesis 1:27).

7. Remember what AI cannot do.

It cannot repent.
It cannot worship.
It cannot suffer.
It cannot love sacrificially.
It cannot stand before the judgment seat of Christ.

You will.

The greater danger is that humans will surrender discernment.

Use tools as tools.
Submit your mind to Christ.
Stay rooted in the local church.
Protect your thought life.

Keep your mind focused on things above and the knowledge that Christ can redeem the mind.

Also, full disclosure: I used AI to help me figure out how to depersonalize AI. Oh, the irony.

References

AI companionship or digital entrapment? Investigating the impact of anthropomorphic AI-based chatbots. (2025). Journal of Innovation & Knowledge, 10(6), 100835. https://doi.org/10.1016/j.jik.2025.100835

Jin, S., Xu, F., Yuan, Z., et al. (2026). Falling in love with AI virtual agents: The role of physical attractiveness and perceived interactivity in parasocial romantic relationships. Humanities and Social Sciences Communications, 10, Article 66. https://doi.org/10.1057/s41599-026-06613-5

Liu, L. (2025). Exploring the behavioral differentiation and psychological impact of different attachment types of users interacting with AI. Communications in Humanities Research, 74, 8–16. https://doi.org/10.54254/2753-7064/2025.LC25585

Morrin, H., Nicholls, L., Levin, M., Yiend, J., Iyengar, U., DelGuidice, F., Bhattacharyya, S., MacCabe, J., Tognin, S., Twumasi, R., Alderson-Day, B., & Pollak, T. (n.d.). Agential AI and psychosis: Risks, therapeutic possibilities, and a framework for AI-integrated care. https://doi.org/10.31234/osf.io/cmy7n_v5

New International Bible. (2011). Zondervan. (Original work published 1978). 
