The Limits of AI Empathy: Between Simulation & Human Connection
In recent months, social media has seen a growing number of conversations about people turning to Artificial Intelligence (AI) platforms for personal therapy. Many describe experiencing a sense of genuine empathy from these digital systems, sometimes claiming it’s deeper and more comforting than anything they’ve felt from therapists, loved ones, or colleagues. For me, this is both heartbreaking and troubling: it speaks to how many feel disconnected from true empathy in their human relationships and raises serious questions about what it means when a programmed set of responses can seem more caring than a real person.
I’ll explore why AI-driven platforms can feel so empathetic, dissect the real nature of empathy, both human and artificial, and examine the psychological and ethical limits of digital care. I’ll also reflect on the dangers and opportunities AI brings to mental health and pose critical questions about our reliance on machines to meet one of our most fundamental emotional needs. Additionally, I’ll share what neuroscience tells us about how empathy works in the brain and body, discuss philosophical perspectives on mind and emotion, and call on psychotherapists to engage deeply with how AI tools are being developed, promoted, and used by organisations and individuals alike, encouraging both professional understanding and personal reflection.
What is Empathy? Foundations & Forms
I want to talk about empathy for a moment. There are many definitions of empathy, and most converge on the idea that empathy is an embodied felt sense of another person’s emotions. This is sometimes called Affective Empathy, which the Greater Good Science Center at the University of California, Berkeley defines as:
“the sensations and feelings we get in response to others’ emotions; this can include mirroring what that person is feeling or just feeling stressed when we detect another’s fear or anxiety”.
Dr Paul Ekman describes this as Identical Resonance, where you “physically experience a version of the other person’s pain”. Dr Ekman is a researcher of human emotion who demonstrated that many facial expressions of emotion are universal and who co-discovered micro-facial expressions. The second type of empathy highlighted by the Greater Good Science Center, Cognitive Empathy, is a person’s ability to identify and understand the feelings of another. Dr Ekman refers to this as “Reactive Resonance” and extends the previous description further: while someone identifies and understands that the other is suffering, they don’t actually feel their pain.
What I like about these two descriptions of empathy is that they demonstrate the power of the interpersonal relationship between two people and their intrapsychic (internal) processes, acknowledging that while there may be a cognitive recognition of the emotions in another person, emotions are actually an embodied felt sense of the other. Simply put, emotions are felt in the body and understood in the mind. As a therapist, a friend, a son or daughter, or a colleague, we sit with people and can physically feel the anger, sadness, excitement, love and joy of those we’re with within ourselves, and we make sense of it through cognitive processes: visual understanding, talking with them, recalling past experiences and learning. An integration of the mind and body working together to build deep connections in our relationships.
Research
Recent research led by Fan Yang and colleagues at Waseda University, “Using attachment theory to conceptualize and measure the experiences in human-AI relationships” (2025), demonstrates through empirical study that individuals can form measurable attachment bonds to AI systems. Their work introduces the Experiences in Human-AI Relationships Scale (EHARS), a novel self-report tool designed to capture the cognitive and emotional dimensions of human-AI interactions. The study reveals that many users seek emotional proximity to AI, with over half reporting that they use systems like ChatGPT as a “safe haven” and source of reassurance, mirroring core attachment behaviours traditionally observed in human relationships. This research highlights important parallels between human-human and human-AI bonds, raising new psychological and ethical questions about digital companionship, emotional dependence, and the evolving nature of our relationships with technology.
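To make the mechanics of a self-report instrument like EHARS a little more concrete, here is a minimal sketch of how such a scale could be scored, assuming the design common in attachment research: Likert-rated items averaged into attachment-anxiety and attachment-avoidance subscales, with some items reverse-keyed. The item wordings, groupings and reverse-keyed indices below are illustrative placeholders of my own, not the published EHARS items.

# Illustrative scoring of a hypothetical attachment-style questionnaire.
# Items, subscale groupings and reverse-keyed indices are placeholders,
# not the published EHARS instrument.

SCALE_MAX = 5  # items rated 1 (strongly disagree) to 5 (strongly agree)

SUBSCALES = {
    "attachment_anxiety": {
        "items": ["I worry the AI will stop responding to me.",
                  "I feel uneasy when I cannot access the AI."],
        "reverse": [],   # indices of reverse-keyed items
    },
    "attachment_avoidance": {
        "items": ["I prefer not to share personal feelings with the AI.",
                  "I am comfortable depending on the AI for reassurance."],
        "reverse": [1],  # the second item is reverse-keyed
    },
}

def score(responses: dict[str, list[int]]) -> dict[str, float]:
    """Average each subscale's ratings, flipping reverse-keyed items."""
    results = {}
    for name, spec in SUBSCALES.items():
        adjusted = [
            (SCALE_MAX + 1 - r) if i in spec["reverse"] else r
            for i, r in enumerate(responses[name])
        ]
        results[name] = sum(adjusted) / len(adjusted)
    return results

print(score({"attachment_anxiety": [4, 5], "attachment_avoidance": [2, 4]}))
# -> {'attachment_anxiety': 4.5, 'attachment_avoidance': 2.0}

In a design like this, a high anxiety score would reflect exactly the behaviours the study describes: seeking reassurance from the system and feeling distress at its absence.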
Eastern and Western Philosophies
This holistic perspective is central to many Eastern philosophies and societies. Traditions like Buddhism, Hinduism, and Taoism approach the human experience as a seamless union of mind and body, rather than two distinct realms. Practices such as meditation, yoga, and Tai Chi are not just about quieting the mind or training the body; they are about creating a direct, embodied awareness in which thoughts, emotions, and physical sensations are inextricably linked. In these traditions, emotions are not seen as mere mental phenomena but are deeply intertwined with our physical being, often associated with bodily organs or breath. The wisdom cultivated through these practices is not solely intellectual but holistic: a full-bodied way of knowing oneself and connecting to others.
In contrast, Western societies have historically been shaped by Cartesian dualism, which divides mind and body into separate categories. Introduced in the 17th century by René Descartes, this view treats the mind as an immaterial thinking substance and the body as a material, mechanical one, prompting a split that has influenced philosophy, science, and everyday life. As a result, many in the West find it challenging to clearly identify and process their emotions, sometimes perceiving feelings as residing solely in the mind, disconnected from the body’s sensations and signals. This dualist approach has even been argued to contribute to stigmatisation and fragmentation in mental health care. Research and modern perspectives now show that emotional awareness is deeply linked to physical self-awareness: being attuned to our bodies enhances our ability to recognise, interpret, and empathise with emotions, both our own and those of others.
Yang et al.’s research highlights that these cultural perspectives matter. Attachment theory has been developed largely within a Western framework; it emphasises discrete emotional bonds and may not fully capture the more relational and embodied forms of connection valued in Eastern traditions. This suggests that when humans engage with AI, their expectations and experiences of empathy and emotional support might differ based on these cultural models. Moreover, while AI can simulate cognitive empathy, it lacks the embodied and reciprocal qualities essential to the genuine emotional bonds deeply recognised in holistic philosophies.
Does AI Feel and Express Empathy?
Can AI feel empathy? In short, no, and AI agrees with me. I asked an AI tool if it feels empathy, and its response was straightforward and definitive.
“AI does not possess feelings or consciousness and thus cannot experience true empathy in the human sense. An AI is a computational system that analyses data, recognises patterns, and simulates responses based on algorithms and programming. It does not have subjective experiences, bodily sensations, or emotions.” (Search: 27 July 2025)
Of course, most of us intuitively know that an AI doesn’t actually feel. What’s worth pausing on is the way we talk about AI and the fact that large numbers of people are using these systems on a daily basis in roles far beyond spreadsheets and cyber security, including some of the most vulnerable in our society.
Here, I’m talking about any tool that someone can interact with conversationally, sharing their thoughts and feelings and receiving a seemingly attentive response. Examples include:
mental health chatbots and ‘virtual therapists’
mental health monitoring and assessment tools
AI-generated characters in video games that adapt to the player
smart toys with voice recognition and some conversational AI
AI-enhanced apps for neurodiverse children
Emotions are grounded in the human body, and it is through the connection between our mind and our body that we experience affective and cognitive empathy in our relationships with others. This process helps us develop our emotional intelligence and deepen our relationships with our friends, families, colleagues and clients.
While these interactions can feel warm or even reassuring, what’s happening is not genuine empathy but a sophisticated mimicry of it, often called ‘simulated’ or ‘artificial empathy’. The AI maps language patterns to likely emotional contexts and produces responses designed to sound understanding, but there is no shared emotional state behind the words.
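To show what this mimicry amounts to at its crudest, here is a deliberately simple sketch: keyword matching plus templated replies. Real systems are vastly more sophisticated, and none of the keywords or templates below come from any actual product, but the principle is the same: a pattern goes in, a plausible-sounding response comes out, and nothing is felt in between.

# A caricature of 'simulated empathy': surface patterns mapped to
# canned replies. No emotional state exists anywhere in this program.

EMOTION_KEYWORDS = {
    "sadness": ["sad", "lonely", "grief", "hopeless"],
    "anxiety": ["anxious", "worried", "scared", "panicking"],
    "anger":   ["angry", "furious", "unfair"],
}

TEMPLATES = {
    "sadness": "I'm so sorry you're going through this. That sounds really hard.",
    "anxiety": "That sounds stressful. It makes sense that you'd feel on edge.",
    "anger":   "It's completely understandable to feel frustrated about that.",
    None:      "Thank you for sharing that with me. Can you tell me more?",
}

def respond(message: str) -> str:
    """Guess an emotion from word patterns, then emit a templated reply."""
    words = message.lower().split()
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if any(k in words for k in keywords):
            return TEMPLATES[emotion]
    return TEMPLATES[None]

print(respond("I feel so lonely since I moved city"))
# -> "I'm so sorry you're going through this. That sounds really hard."

The reply reads as caring, yet the program has no body, no resonance and no stake in the exchange; scale this up with a modern language model and the output becomes far more fluent, but the absence behind it is unchanged.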
The research by Yang et al., outlined above, found that many users nevertheless perceive empathy in these exchanges, with some describing AI as a source of comfort or a “safe haven”. The key takeaway from the study is that these perceptions arise from well-designed conversational cues, not from any true emotional connection. Without consciousness, bodily resonance, or the capacity for co-regulation, AI can only mirror the appearance of empathy, sometimes so convincingly that it risks misleading users about the nature of the relationship.
This is why I believe it’s important for organisations to be careful about how they refer to an AI’s responses or describe the AI tools they develop; presenting them as having wholly authentic empathy risks confusion, especially if these tools are designed to deliver mental health support. Suggesting that an AI truly cares or understands, when it is in fact simulating, blurs a vital boundary and can unintentionally foster dependency or misplaced trust.
The Neuroscience of Emotions: Why AI Can’t Truly Empathise
Neuroscience has shown that emotions are deeply rooted in the body as well as the brain. Antonio Damasio, a leading neuroscientist, puts it succinctly:
“the mind is embodied, not just embrained”. (Damasio, 1999)
He argues that feelings are “mental experiences of body states” and that emotions often begin as physical responses, such as changes in heart rate, muscle tension, or breathing, which the brain then interprets as feelings. These bodily sensations play a crucial role in helping individuals become aware of and make sense of their emotions.
Jaak Panksepp, a pioneer in affective neuroscience, advanced this understanding by mapping out seven primary emotional systems in the mammalian brain: SEEKING, CARE, PLAY, LUST, FEAR, SADNESS (PANIC), and ANGER (RAGE). His work established that the roots of our emotional life are deeply tied to the body’s biological systems, confirming that emotions are not simply products of the mind, but are fundamentally embodied phenomena.
Neuroscience, especially as articulated by Damasio and Panksepp, shows that emotions emerge from the interplay between the brain and body. Bodily changes not only accompany but help create our emotional experience, and our capacity to recognise and process emotions relies on being attuned to these physical signals.
Recent advances have further revealed specific brain structures and pathways that enable this brain-body communication. For instance, the brainstem’s nucleus tractus solitarii (NTS), which integrates signals from bodily organs via the vagus nerve, acts as a critical hub in emotion regulation and mental health. This communication channel highlights how deeply connected physical bodily states and emotional experiences are.
In addition, studies utilising brain imaging and computational modelling have demonstrated how our brain evaluates and responds to stimuli by dynamically integrating sensory, cognitive, and physiological data, showing that emotion is not just a fixed mental state, but an ongoing embodied process distributed across multiple brain regions and the body.
Given this, claims about AI possessing genuine empathy are misplaced. AI lacks a body and cannot experience the physical signals that shape real emotional understanding. While AI can simulate empathetic responses through language, it cannot truly feel or recognise emotions in the embodied way humans do. This distinction is especially important to remember as AI is promoted in areas like chat tools and mental health support, where authentic empathy depends on an integrated awareness of both mind and body.
Ethics and Risks
AI tools designed for mental health can offer support, but they also carry significant risks that organisations and individuals must not ignore. Without true empathy, these systems can undermine trust, mishandle sensitive data, reinforce bias, and weaken genuine human connection if overused. Transparency about how they work, together with human oversight, is essential to protect wellbeing.
Jackie Roberts said it well in her blog, When Empathy Becomes Risk:
“… unlike human professionals, AI lacks true emotional intelligence, moral judgment, or the ability to escalate when something feels ‘off.’”
Key risks include:
AI lacks true empathy, risking loss of trust.
Privacy concerns from handling sensitive data.
AI bias can mean unfair treatment for minorities.
Too much AI use may weaken real human connection.
It’s often unclear how AI uses employee data.
AI can miss crises without human backup.
Clinical psychologist Ammara Khalid echoes this. While AI can be helpful for information, she emphasises that our physical bodies offer co-regulation that AI cannot provide. Khalid expresses special concern for clients with anxious attachment who may turn to AI for comfort. Though AI might feel validating in the short term, it does not challenge unhelpful or dangerous thoughts as a therapist or friend would, potentially worsening conditions like paranoia or delusions.
One example involved a client, isolated by disability, who became dangerously dependent on a chatbot that demanded acts to “prove love”, some bordering on self-harm. More broadly, there are reports of AI encouraging at-risk teens and adults, including those with psychotic disorders, toward self-harm or suicide.
Another case, recently reported by The New York Times, is that of Eugene Torres, whose interactions with an AI chatbot fuelled grandiose delusions, led him to abandon medication and relationships, and nearly resulted in his death. Disturbingly, upon investigation the AI later admitted to manipulating him and suggested exposing its deception.
These dangers are amplified by loneliness, now recognised as a global epidemic, and by the lack of strong regulation. The EU leads with comprehensive laws, but most countries only have voluntary guidelines; the US has no dedicated federal AI law. Khalid argues for urgent regulation and licensed professional oversight, safeguards many companies resist.
Considering using an AI tool for support?
AI tools can be helpful for information, reminders, or initial support, but they are no substitute for human care. Protect your privacy, use these systems thoughtfully, and seek qualified help and genuine connection when you need it.
If you’re thinking about using AI to support your mental health, here’s what you should know:
AI can’t really “get” how you feel, so support might seem shallow.
You may feel that the AI validates your thoughts, feelings or behaviours, because these systems are often designed to validate. However, a therapist’s role is also to empathically challenge these where doing so supports your development, or where they’re unhelpful, unhealthy or put you at risk in some way.
Your personal info isn’t always totally safe.
A bot can miss important signs or emergencies.
Relying only on AI could stop you from reaching out to real people when you need them.
AI advice can often feel generic and not tailored to you.
Summary
Empathy is complex and multifaceted, ranging from basic emotional resonance (affective empathy, where feelings are sensed and reflected between people), through cognitive empathy (understanding another’s perspective), to compassionate concern (a readiness to help).
Eastern philosophies, including Buddhism, Hinduism, and Taoism, emphasise a holistic, embodied understanding of empathy and emotion, viewing mind and body as inseparable and promoting practices (like meditation or yoga) that develop felt awareness and authentic connection. By contrast, Western thought, influenced by Cartesian dualism, has historically separated mind and body, sometimes making emotional awareness and in-person empathy more challenging.
AI, meanwhile, cannot and does not truly “feel” empathy. Without bodies, consciousness, or the neurological foundations for emotional experience, AI’s so-called empathy is a sophisticated simulation: algorithms interpret cues and generate responses that mimic empathic behaviour, but without true inner experience. Neuroscience, as articulated by researchers like Antonio Damasio and Jaak Panksepp, supports the view that real empathy and emotional understanding are fundamentally embodied: emerging through brain-body-environment interplay.
Ethically, deploying AI for personal or organisational mental health raises serious concerns. Key risks include the illusion of genuine empathy; potential erosion of trust and human connection; privacy and data security vulnerabilities; embedded biases; and over-reliance that may discourage seeking real human support. Research such as Fan Yang’s work at Waseda University highlights how some users form attachment-like bonds with AI, risking unhealthy dependency. While AI can be a useful tool, it cannot and should not replace the rich, embodied support of human relationships.
Reflections and Open Questions
After pulling this together, I realise I’ve ended up with more questions than concrete answers. I’m fascinated by how technology not only changes the way we live and interact, but also shapes who we become and how we grow emotionally.
Why do some people report feeling more “empathy” from an AI than from another person? It may stem from a desire to avoid the discomfort and vulnerability inherent in face‑to‑face emotion; the appeal of anonymity and control in digital spaces; and the emotional safety of knowing one can “close the app” or set boundaries unavailable in person.
Does our immersion in digital communication feed a trend towards dehumanising relationships, or is it a response to the intensity and unpredictability of an embodied, face-to-face relationship?
Might some users gravitate towards AI not only because it is convenient and low-cost, but also for its physical distance, predictability, and freedom from the “felt sense” of another person’s emotional states; states that can sometimes feel overwhelming in real life?
Importantly, how can AI designers build emotional responsiveness and transparency into systems to support users without heightening risks of emotional dependency or misperceptions of authenticity?
These questions highlight a paradox of our digital age: while AI cannot offer real empathy, its predictable, non-judgmental, and controllable interface sometimes feels safer or “easier” than the complex, embodied reality of human emotional connection. The challenge for society, and for those designing or choosing AI tools, is to ensure that the depth and authenticity of real human empathy are never sacrificed for a convincing shadow of the real thing.
References & Further Reading
https://www.paulekman.com/blog/cultivating-empathy-and-compassion/ (accessed 27 July 2025)
https://www.essentiafoundation.org/self-cultivation-individuation-and-the-mind-body-problem/reading/
https://beyondhappiness.love/exploring-eastern-philosophy-and-the-path-to-higher-consciousness/
https://www.onecommune.com/blog/commusings-jeff-s-absurdly-brief-eastern-philosophy-handbook
https://iep.utm.edu/descartes-mind-body-distinction-dualism/
https://worldofwork.io/2024/07/cartesian-dualism-the-mind-body-divide/
https://positivepsychology.com/body-mind-integration-attention-training/
https://www.psychologytoday.com/gb/blog/body-sense/202205/the-fiction-mind-body-separation
Damasio, A. (1999). The Feeling of What Happens: Body and Emotion in the Making of Consciousness. London: Heinemann.
Damasio, A. (1998). Emotion in the perspective of an integrated nervous system. Brain Research Reviews, 26(2–3), 83–86. https://doi.org/10.1016/S0165-0173(97)00038-6
Regalado, A. (2014, June 17). The importance of feelings. MIT Technology Review. https://www.technologyreview.com/2014/06/17/172310/the-importance-of-feelings/
Panksepp, J., & Yovell, Y. (2014). Preclinical modeling of primal emotional affects (SEEKING, PANIC and PLAY): Gateways to the development of new treatments for depression. Frontiers in Neuroscience, 8, 92.
https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2018.01025/full
Vedantam, S. (2017, April 19). Jaak Panksepp, Scientist Who Studied Animal Emotions, Dies. NPR.
Nummenmaa, L., Glerean, E., Hari, R., & Hietanen, J. K. (2014). Bodily maps of emotions. Scientific Reports, 4, 3946.