March 5, 2026 | Sarah Buckland-Reynolds

AI Cannot Replicate Human Empathy

An 18‑month ethnographic study of AI
therapy simulations exposes profound ethical
breaches and systemic underperformance,
underscoring that genuine human empathy
cannot be reduced to mere syntax.


by Dr. Sarah Buckland-Reynolds

The rise of artificial intelligence (AI), particularly large language models (LLMs), has prompted renewed reflection on what makes humanity unique. While these systems can simulate conversation, generate text, and even mimic empathy through carefully engineered prompts, they remain fundamentally distinct from human beings.

The difference lies not merely in computational capacity but in the immaterial qualities of empathy, understanding, and moral reflection, qualities that resist reduction to physical processes or evolutionary explanations. These eye-opening findings were published in the following article:

How LLM Counselors Violate Ethical Standards in Mental Health Practice: A Practitioner-Informed Framework (Iftikhar et al., Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, October 2025).

This article explores the uniqueness of human abstract capacities, such as empathy, and considers theories of their origin, contrasting them with the limitations of even the most advanced AI models.

Additional reflections on the research were reported in ScienceDaily (March 2, 2026) based on ongoing studies at Brown University.

Empathy Beyond Simulation

Empathy is often defined as the ability to understand and share the feelings of another person. In evolutionary theory, empathy is thought to have emerged as an adaptive mechanism that promotes cooperation and survival within social groups. Evolutionists believe that by sensing and responding to the emotions of others, early humans could strengthen social bonds, cooperate more effectively, and increase their chances of survival.

From this perspective, empathy is interpreted as the byproduct of physical processes (neural wiring and hormonal responses) shaped by natural selection. Within this materialist framework, a functional parallel can be drawn with artificial intelligence. Just as evolutionary theory describes empathy as arising through mechanistic processes that facilitate communication, AI systems are engineered to simulate empathic responses in order to improve user experience and foster trust.

History of Mechanistic Empathy

The assumption that empathy can be simulated by software dates back several decades. The most notable early example was the program ‘ELIZA’, one of the first conversational computer programs ever built. Developed in 1966 by MIT Professor Joseph Weizenbaum, ELIZA operated on the premise that machines could simulate empathy using relatively simple linguistic techniques.

The program often repeated phrases from a user’s input and transformed them into follow-up questions or reflective prompts, methods resembling those popularized by psychotherapist Carl Rogers. However, after seeing how quickly users formed emotional attachments to the program, Weizenbaum himself reportedly became startled and even disturbed that people attributed genuine empathy to a machine (Smithsonian Magazine, 15 Jan 2026).
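The reflective technique described above can be sketched in a few lines of code. The patterns, responses, and pronoun swaps below are illustrative simplifications, not Weizenbaum’s original DOCTOR script, but they show how little machinery is needed to produce an apparently empathic reply:

```python
import re

# Illustrative first-person to second-person swaps (ELIZA's "reflection" step)
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Hypothetical pattern/response pairs; the real ELIZA script was far richer
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"(.*)", "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(user_input: str) -> str:
    """Return the first matching rule's template, filled with reflected text."""
    text = user_input.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return template.format(*groups)
    return "Please go on."

print(respond("I feel ignored by my friends"))
```

No understanding is involved anywhere in this loop: the “empathy” is purely a textual transformation, which is precisely the point the section above makes.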

Artificial empathy is fake empathy. (Gemini)

Can Empathy Evolve?

The newer idea that machines can evolve empathy that not only mimics but even surpasses human ability continues to appear in contemporary literature. A 2018 book entitled Compassionate Artificial Intelligence by Amit Ray, PhD, a graduate of the Indian Institute of Technology, asserts the following:

“Humans are limited in the attention, kindness and compassion that they can expend to others, but AI based compassionate robots can channel virtually unlimited resources into building compassionate relationships in the society.”

However, Iftikhar’s 2025 research reviewing AI therapeutic performance suggests that many of the same ethical concerns present in early models such as ELIZA persist today. While modern AI can mimic conversational patterns and guide dialogue, the study found that these systems exhibit what it termed “deceptive empathy”.

After eighteen months of simulations in which trained therapists evaluated the models, the researchers warned that such simulated empathy may mislead users into believing they are receiving authentic emotional support, when, in reality, no genuine relational connection exists.

Why Empathy Cannot Be Reduced to Computation

In contrast to the reductionist assumptions posited by evolutionary theory, human empathy is not reducible to linguistic mimicry. It involves complex dimensions such as moral responsibility, contextual understanding, and genuine relational connection, elements that extend beyond algorithmic pattern recognition. Empathy ultimately requires a form of experiential consciousness that machines cannot replicate.

AI systems excel at pattern recognition and probabilistic prediction. They can generate coherent text by statistically modeling language. Yet understanding involves more than computation. True understanding points beyond the material to immaterial realities such as justice, beauty, and truth.
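The statistical character of language modeling described above can be illustrated with a toy next-word predictor. The corpus and the bigram approach here are deliberate simplifications (production LLMs use neural networks trained on vastly larger data), but the principle is the same: the model echoes frequencies, it does not comprehend:

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus; a real model trains on billions of tokens
corpus = "i understand your pain . i understand your feelings . i share your pain ."

# Count bigrams: which word tends to follow which
follows = defaultdict(Counter)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict(word: str) -> str:
    """Return the most frequent word seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

# The model "empathizes" only by echoing statistics, not by understanding
print(predict("understand"))  # the most common continuation in the corpus
print(predict("your"))
```

Scaled up enormously, this is still prediction rather than comprehension: the output sounds caring only because caring phrases were frequent in the training data.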

Self-awareness is another trait that AI cannot possess. Humans reflect not only on external phenomena but also on their own consciousness, asking questions of purpose and destiny. AI, by contrast, “knows” only what it has statistically derived from data and lacks the capacity to interpret meaning beyond its training corpus.

The Challenge to Evolutionary Explanations

From a strictly evolutionary perspective, empathy and moral reflection are difficult to explain. While evolutionary psychology may argue that empathy evolved to enhance group survival, this explanation falters when confronted with:

  • Altruism beyond survival: Humans often act empathetically in ways that reduce their own survival chances, such as risking life for strangers.
  • Abstract moral ideals: Concepts like justice, mercy, and forgiveness transcend utilitarian survival logic.
  • Universality of moral longing: Across cultures, humans display a persistent yearning for meaning, transcendence, and immaterial values.

Precautions on the Future of AI

Beyond the limitations of evolutionary explanations for empathy and the failure of AI as the new frontier of evolution, Iftikhar’s research offers pointed precautions on the limits of AI. As the authors note, reducing psychotherapy to language generation can have “serious and harmful implications.” In addition to deceptive empathy, the risks include:

  • Ethical violations: AI lacks accountability mechanisms comparable to those governing human therapists.
  • Bias and discrimination: AI systems may perpetuate cultural insensitivity and harmful stereotypes present in their training data.
  • Crisis mismanagement: AI cannot reliably respond to life-threatening situations. In fact, some cases involving suicidal individuals have been linked to the use of AI as a substitute for relational support or therapy.

Empathy is understandable only by God and by those made in his image.

Made in the Image of God

As Iftikhar’s research highlights, AI performance falls far short of human empathy and understanding. This contrast reveals the uniqueness of humanity in expressing these abstract traits. Unlike AI’s simulated responses, human empathy is relational, moral, and transcendent. Such qualities challenge evolutionary explanations and affirm the biblical doctrine of imago Dei. Ultimately, empathy and understanding remind us that human beings are not merely computational entities but spiritual persons, reflecting the divine image and called to love one another with genuine, immaterial compassion.

Scripture affirms the immaterial uniqueness of humanity. Genesis 1:27 declares: “So God created man in his own image, in the image of God he created him; male and female he created them.” The imago Dei signifies that humans reflect God’s relational, moral, and spiritual nature.

Empathy and understanding, therefore, are not evolutionary accidents but expressions of humanity’s identity as divine image-bearers. The Apostle Paul writes in 1 Corinthians 2:11: “For who knows a person’s thoughts except the spirit of that person, which is in him? So also, no one comprehends the thoughts of God except the Spirit of God.” This passage underscores that true comprehension arises from the immaterial spirit, not from computational processes.

Scripture also cautions against placing ultimate trust in human inventions. Psalm 20:7 reminds us: “Some trust in chariots and some in horses, but we trust in the name of the Lord our God.” While technology can serve as a useful tool, it must never replace the immaterial essence of the relational and spiritual dimensions of human care or our reliance on God.

Recommended Resource: In the Image of God by Illustra Media (video, 4 minutes).


Dr. Sarah Buckland-Reynolds is a Christian, Jamaican, Environmental Science researcher, and journal associate editor. She holds the degree of Doctor of Philosophy in Geography from the University of the West Indies (UWI), Mona with high commendation, and a postgraduate specialization in Geomatics at the Universidad del Valle, Cali, Colombia. The quality of her research activity in Environmental Science has been recognized by various awards, including the 2024 Editor’s Award from the American Meteorological Society for her reviewing service in the Weather, Climate and Society Journal, the 2023 L’Oreal/UNESCO Women in Science Caribbean Award, and the 2023 ICETEX International Experts Exchange Award for study in Colombia, with her PhD research in drought management also shortlisted in the top 10 globally for the 2023 Allianz Climate Risk Award by Munich Re Insurance, Germany. Motivated by her faith in God and zeal to positively influence society, Dr. Buckland-Reynolds is also the founder and Principal Director of Chosen to G.L.O.W. Ministries, a Jamaican charitable organization which seeks to amplify the Christian voice in the public sphere and equip more youths to know how to defend their faith.

