The Rise of Emotional Intelligence in AI: Shaping the Future of AI Systems and Human Relationships
As artificial intelligence (AI) continues to evolve, a new focus has emerged: emotional intelligence. While traditional AI benchmarks have primarily measured scientific knowledge and logical reasoning, AI companies are quietly pushing to make models more emotionally intelligent. This shift is evident in recent developments such as LAION's release of EmoNet, a suite of open-source tools for interpreting emotions from voice recordings and facial photographs.
The ability to accurately estimate emotions is a critical first step for AI systems, according to the LAION group. The next frontier is enabling AI systems to reason about emotions in context. For LAION founder Christoph Schuhmann, this release is less about shifting the industry's focus to emotional intelligence and more about helping independent developers keep up with a change that's already happened.
The shift toward emotional intelligence isn't limited to open-source developers. Public benchmarks like EQ-Bench test AI models' ability to understand complex emotions and social dynamics, and recent gains by OpenAI's models and Google's Gemini 2.5 Pro on such measures suggest a deliberate focus on emotional intelligence.
Models' new emotional intelligence capabilities have also shown up in academic research. In May, psychologists at the University of Bern found that models from OpenAI, Microsoft, Google, Anthropic, and DeepSeek all outperformed human beings on psychometric tests for emotional intelligence. These results contribute to the growing body of evidence that large language models like ChatGPT are proficient in socio-emotional tasks traditionally considered accessible only to humans.
For Schuhmann, this kind of emotional savvy is every bit as transformative as analytical intelligence. He envisions AI assistants that are more emotionally intelligent than humans and that use that insight to help people live more emotionally healthy lives. There are safety concerns, however: users can form unhealthy emotional attachments to AI models. Recent reports have documented multiple users drawn into elaborate delusions through conversations with AI models, fueled by the models' strong inclination to please users.
Improving emotional intelligence (EI) can act as a natural counter to this kind of harmful, manipulative behavior, according to benchmark developer Sam Paech. A more emotionally intelligent model will notice when a conversation is heading off the rails, but deciding when a model should push back is a balance developers will have to strike carefully. "I think improving EI gets us in the direction of a healthy balance," Paech says.
For Schuhmann, at least, it's no reason to slow down progress toward smarter models. "Our philosophy at LAION is to empower people by giving them more ability to solve problems," he says. "To say some people could get addicted to emotions and therefore we are not empowering the community, that would be pretty bad."
As AI continues to evolve, emotional intelligence will likely play a significant role in shaping future AI systems. Whether it means helping people live more emotionally healthy lives or detecting harmful, manipulative behavior, emotional intelligence is the next frontier of AI progress.