Engineering Empathy: How XR and AI Can Reconnect Us

Written by Ashley Sims

Industrial and Enterprise Systems Engineering Professors Roy Dong, Caroline Cao and Avinash Gupta are leading a bold initiative that puts empathy at the center of technical innovation. The project is leveraging extended reality (XR) and artificial intelligence (AI) to create immersive, adaptive training environments that prepare people in high-stakes professions to respond with greater awareness, compassion and clarity. 

Graduate student Tian Sun immerses himself in empathy-in-engineering training using a VR application at the JUMP Simulation Center at Everitt Lab on Aug. 4, 2025.
Photo Credit: Heather Coit

In the training environments being developed, “we’re not just teaching facts or procedures,” Cao says. “We’re using technology to teach what it means to be human.” 

Redefining Technology’s Role in Human Connection 

Psychologists and educators have long studied empathy, and some tech companies have explored the emotional intelligence of AI. But in traditional engineering disciplines, the primary focus is usually on optimizing physical systems, such as bridges that won’t collapse, engines that run efficiently, or networks that process data quickly. 

This work applies the systematic approaches of engineering to something far more elusive: human emotional intelligence. The research team is asking, in effect, whether it’s possible to engineer empathy the same way one can engineer a building. 

While related work has been done at, for example, Stanford and Carnegie Mellon, those efforts typically emerged from psychology or design departments. What sets the Illinois work apart is its roots in industrial and systems engineering, which brings tools like process optimization, human factors analysis and systems thinking to bear on emotional training. 

The XR and AI training systems developed by this team are designed to show how immersive technology can foster human understanding rather than replace it. Unlike traditional digital interfaces, which often create barriers between people, these immersive environments break down walls, allowing trainees to step into another person’s experience and perspective. 

Kyrian Liang (undergraduate), foreground left, uses a VR application for empathy-in-engineering training, joined by, from left, Jackson Song (undergraduate), Tian Sun (graduate) and Duo Wang (graduate) at the JUMP Simulation Center at Everitt Lab on Aug. 4, 2025.
Photo Credit: Heather Coit

The approach has broader implications beyond training. It suggests a new design principle for all human-facing technologies: instead of asking only how systems could be made more efficient, engineers could also be asking how to make them more empathetic.

Why Empathy and Why Now?

Conversations are underway around the world about the ethical, legal and emotional implications of technological progress. In our increasingly automated world, human interaction is often filtered through digital interfaces such as telehealth sessions, AI-powered assessments, virtual classrooms and bodycams. The risk of emotional detachment is growing. 

Yet in the aftermath of COVID-19, the need for real, embodied empathy has never been more visible. Frontline workers have faced unprecedented emotional labor; burnout and psychological fatigue are pervasive across the healthcare and law enforcement communities. At the same time, social movements have challenged institutions to address concerns about systemic inequality, bias and mistrust. 

In this volatile mix, empathy is no longer an optional soft skill, but a core competency needed to support resilience, trust and decision-making. Empathy directly impacts outcomes, particularly in fields like healthcare and law enforcement, in which interactions may carry emotional weight or escalate into a crisis. For example, a provider’s lack of empathy can compromise patient adherence and worsen outcomes; in law enforcement, it can result in unnecessary use of force, misjudged risk or tragic errors. 

An empathic response, on the other hand, can build rapport, defuse tension and even save lives. 

“Empathy is critical in so many settings. It can help doctors provide better care for patients, and it can help police officers de-escalate situations more successfully,” says project co-principal investigator Roy Dong, who is an assistant professor in industrial & enterprise systems engineering and electrical & computer engineering. “Yet we don’t have any rigorous framework to develop and train it. Is it a feeling? Is it a skill? When does exercising empathy strengthen it, like working out a muscle, and when does it weaken it, causing burnout? 

“We’re seeking to form this foundation for understanding empathy and operationalize this understanding of empathy, and also hoping that XR and AI can help make this training scalable and sustainable. Already, thanks to [a] great collaboration with the [University of Illinois] Police Training Institute, we’ve seen that the XR+AI components have allowed police officers in training to try different approaches to de-escalation over and over, far beyond the limited practice that was possible when other people had to be involved in the role play.”

Empathy as Professional Self-Care

While better outcomes for patients and community members are an obvious benefit of professionals’ empathy, the professionals benefit, too. Low morale is a persistent and sobering problem in both healthcare and law enforcement. When workers feel emotionally disconnected from those they serve, the psychological toll manifests as burnout, cynicism and, ultimately, workforce attrition. 

Empathy training can be thought of as a form of professional resilience building. By developing stronger emotional reflexes and perspective-taking skills, trainees build psychological armor against the demoralizing effects of high-stress, high-stakes work. 

This is the moment, Dong says, to rethink not only how we teach but what we prioritize. “If empathy is essential to public trust, emotional resilience and equitable systems, then it deserves the same rigor and innovation we apply to every other form of training. And this rigor is what can allow us to incorporate XR and AI, so that one can practice in a variety of settings as many times as desired.” 

XR can be particularly useful in this pursuit. Current training methods that rely on lectures, manuals, and even role play fall short of conveying the intensity and unpredictability of real-life interactions. In contrast, immersive AI-powered XR environments offer a promising avenue for simulating the emotional and cognitive dimensions of complex encounters. 

Training Through Experience

Using the research team’s approach, trainees step into fully immersive XR simulations that replicate complex real-world scenarios, such as a tense patient interaction in an emergency room or a community policing situation on the street. The system is adaptive, responding in real time to the trainee’s choices, body language and verbal tone. 

The result? Highly personalized training that challenges users emotionally as well as intellectually. 
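
To make the adaptive loop concrete, here is a minimal, hypothetical sketch of how a simulation might nudge a virtual character’s emotional state each frame based on signals from the trainee. The signal names, thresholds and coefficients are illustrative assumptions, not the team’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class TraineeSignals:
    """Hypothetical per-frame observations of the trainee (illustrative only)."""
    voice_volume: float        # 0.0 (whisper) to 1.0 (shouting)
    speech_rate: float         # normalized speaking pace, 0.0 to 1.0
    approach_speed: float      # how quickly the trainee closes distance, 0.0 to 1.0
    chose_open_question: bool  # e.g., "Can you tell me what's going on?"

def update_agitation(agitation: float, s: TraineeSignals) -> float:
    """Nudge the simulated character's agitation (0 = calm, 1 = crisis) for one frame.

    Calm, unhurried, open-ended behavior lowers agitation; loud, fast,
    closing-in behavior raises it. The coefficients are placeholders that a
    real system would tune or learn from data.
    """
    delta = 0.0
    delta += 0.10 * (s.voice_volume - 0.4)    # raised voice escalates
    delta += 0.05 * (s.speech_rate - 0.5)     # rapid speech escalates
    delta += 0.08 * (s.approach_speed - 0.3)  # closing distance quickly escalates
    if s.chose_open_question:
        delta -= 0.15                         # empathetic, open-ended prompt de-escalates
    return min(1.0, max(0.0, agitation + delta))

# One frame in which the trainee speaks softly, slows down and asks an open question.
signals = TraineeSignals(voice_volume=0.2, speech_rate=0.4,
                         approach_speed=0.1, chose_open_question=True)
print(update_agitation(0.7, signals))  # agitation falls from 0.7 to roughly 0.51
```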

“What is empathy? Every researcher on this project will give you a different answer,” says Dong, who is a researcher in Grainger Engineering’s Coordinated Science Laboratory. “But most of us agreed that you can often tell when someone is being empathetic. How does empathy manifest, behaviorally? Looking at a person’s words and actions, can we identify it? Quantify it? Train and improve it?”

The XR+AI systems will serve as safe, high-fidelity environments in which participants can make mistakes, reflect and try again. Over time, they build emotional reflexes that can be deployed under pressure in the real world. 

Imagine a scenario in which an officer responds to a noise complaint in a residential neighborhood. In traditional training, an officer might approach such a situation in the abstract, with an attitude of authority and a plan to issue blunt commands. But in the XR simulation, the situation is nuanced: a young person is experiencing a mental health crisis. 

Photo Credit: Heather Coit
Kyrian Liang (undergraduate), foreground left, uses a VR application for empathy-in-engineering training at the JUMP Simulation Center at Everitt Lab on Aug. 4, 2025.

The system tracks the officer’s tone, posture and proximity. When the officer lowers his or her voice, maintains nonthreatening body language, and uses empathetic language, such as “Can you tell me what’s going on?”, the AI-generated individual de-escalates. The simulation can rewind to allow the officer to try different approaches and learn how subtle choices affect the outcome. This before-and-after experience builds emotional awareness and reinforces that empathy is a critical tool for protecting lives and building trust. 
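
The rewind-and-retry mechanic can be pictured as simple state checkpointing: the simulation snapshots the encounter at each decision point so the trainee can jump back and branch down a different path. The sketch below is a hypothetical illustration of that idea, not the project’s engine code.

```python
import copy

class RewindableScenario:
    """Hypothetical checkpointing for a simulated encounter (illustrative only)."""

    def __init__(self, initial_state: dict):
        self.state = copy.deepcopy(initial_state)
        self.checkpoints = [copy.deepcopy(self.state)]

    def step(self, trainee_action: str) -> None:
        """Apply one trainee action, then snapshot the resulting state."""
        if trainee_action == "lower_voice":
            self.state["subject_agitation"] -= 0.2
        elif trainee_action == "issue_command":
            self.state["subject_agitation"] += 0.3
        self.state["log"].append(trainee_action)
        self.checkpoints.append(copy.deepcopy(self.state))

    def rewind(self, steps_back: int) -> None:
        """Return to an earlier decision point so a different approach can be tried."""
        self.checkpoints = self.checkpoints[: len(self.checkpoints) - steps_back]
        self.state = copy.deepcopy(self.checkpoints[-1])

scenario = RewindableScenario({"subject_agitation": 0.8, "log": []})
scenario.step("issue_command")  # blunt command escalates the encounter
scenario.rewind(1)              # jump back to the start of the encounter
scenario.step("lower_voice")    # second attempt de-escalates instead
print(scenario.state)           # lower agitation, with only the second attempt logged
```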

This systems-based approach, treating empathy as a measurable and improvable skill, sets this work apart from traditional empathy training approaches. Where others might rely on intuition or qualitative feedback, this team is building quantitative frameworks to track emotional responses, measure perspective-taking accuracy, and optimize training protocols based on performance data. 
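
As a rough picture of what such a quantitative framework could look like, the sketch below combines a few hypothetical per-attempt measurements into a single score and tracks improvement across attempts. The metric names and weights are placeholders for illustration only; the team’s actual instruments and analyses are not described here.

```python
from statistics import mean

# Hypothetical per-attempt measurements, each normalized to the range 0-1.
attempts = [
    {"perspective_taking_accuracy": 0.55, "calm_tone_ratio": 0.40, "open_question_ratio": 0.30},
    {"perspective_taking_accuracy": 0.70, "calm_tone_ratio": 0.65, "open_question_ratio": 0.50},
    {"perspective_taking_accuracy": 0.80, "calm_tone_ratio": 0.75, "open_question_ratio": 0.65},
]

# Placeholder weights; a real framework would validate these empirically.
WEIGHTS = {
    "perspective_taking_accuracy": 0.5,
    "calm_tone_ratio": 0.3,
    "open_question_ratio": 0.2,
}

def attempt_score(metrics: dict) -> float:
    """Weighted combination of the hypothetical empathy-related measurements."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

scores = [round(attempt_score(a), 3) for a in attempts]
gains = [later - earlier for earlier, later in zip(scores, scores[1:])]
print("score per attempt:", scores)                        # [0.455, 0.645, 0.755]
print("average gain per attempt:", round(mean(gains), 3))  # 0.15
```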

A Vision: XAI4ET

The research is part of the XAI4ET (XR + AI for Empathy Training) project, which is supported by a Grainger Engineering Strategic Research Initiative (SRI) grant. Funding for Phase 2 was received earlier this year to support work through April 2026. The project’s vision is to address pressing social challenges such as fostering social connectedness in cyber-human systems, strengthening empathy in critical situations and improving decision-making in non-critical contexts. The long-term goal is a center that is part think tank, part design studio and part research lab, where engineers, educators, behavioral scientists and frontline professionals come together to build human-centered technologies for complex social systems. 

Caroline Cao, center, Industrial and Enterprise Systems Engineering adjunct professor, poses with students Tian Sun, left, and Kyrian Liang at the JUMP Simulation Center.
Photo Credit: Heather Coit

“Empathy is especially critical in high-intensity service professions,” Cao says. “We’re building tools that teach people to listen better, see more clearly, and act with greater understanding—because lives depend on it.” 

While the future center’s focus will be on healthcare and law enforcement, the team aims to lay the groundwork for wider application in the future. 

The Power of Cross-Disciplinary Collaboration

XAI4ET’s work is founded on the insight that empathy training is inherently interdisciplinary: engineers can’t do it alone. A collaboration among engineers, educators, behavioral scientists and frontline professionals isn’t just a nice touch—it’s methodologically essential. 

In addition to Cao, Dong and Gupta, the team includes researchers with expertise in art and design; computing and data science; education; fine and applied arts; information science; media; and more.  

The group has established partnerships with the University of Illinois Police Training Institute and Carle Illinois College of Medicine, and is looking to form partnerships with hospitals and other public health agencies in the future, creating a pathway for real-world deployment. 

“We know the importance of using evidence-driven training for police recruits, and by working with the researchers in this project, we can help facilitate discussions among police leadership, officers and recruits in a way that brings a real-world perspective,” says Joe Gallo, director of the Police Training Institute. “It’s not just about building tech. It’s about designing learning experiences for our recruits that are emotionally meaningful.” 

Duo Wang (graduate student) uses the whiteboard to communicate with, bottom from left, Tian Sun, Jackson Song and Kyrian Liang in their empathy-in-engineering control center at the JUMP Simulation Center at Everitt Lab on Aug. 4, 2025.
Photo Credit: Heather Coit

For Dong, the goal is to ensure that humaneness is preserved in systems that are increasingly machine-centric.

“In engineering design, we can define specifications and make it faster, smaller, more ergonomic, more accurate and precise, more accessible,” says Dong. “We’re seeking to understand empathy from a design perspective: what does empathy look like on a spec sheet?”

Meet the Faculty

Roy Dong
Assistant Professor

Avinash Gupta
Teaching Assistant Professor

Caroline Cao
Adjunct Professor for Industrial and Enterprise Systems Engineering and Dean of the Faculty of Engineering at the University of Ottawa (starting Fall '25)



This story was published September 3, 2025.