
How Well Does AI Really Handle Your Human Emotions?

Exploring the emotional integrity of our new digital counterparts.

By E.B. Johnson
Photo by Aideal Hwa on Unsplash

In the olden days (two decades ago), we used to take our problems to public columnists. Writers like Dr. Joyce Brothers fielded questions from across the globe, all of them from frustrated, heartbroken, lonely souls who needed answers to their most painful questions. We called these columnists "agony aunts," and they helped many of us learn valuable lessons and skills around our relationships and our big emotional upsets.

Now, those advice columns are long gone, replaced by therapists, bloggers, and influencers who provide even more nuanced and complicated advice across topics like sex, gender, relationships, mental health, and personal development.

Now the same reckoning is coming for these providers. As artificial intelligence continues to birth tools like ChatGPT, society has to contend with a new competitor in the realm of soothing and resolving human emotions.

Will chat AI become our new agony aunt? Will it become the outside emotional support that ushers us into a new age of artificially generated, algorithmic advice? Only one thing is certain…artificial intelligence as we know it is going to shift the way we see and deal with human emotions.

How does chat AI handle human emotions?

Last year, a senior engineer at Google claimed that the AI they were developing had become sentient. The engineer had interacted extensively with the AI and described it as having the emotional intelligence of a 7- or 8-year-old child. The story threw the internet into an instant tailspin.

Is artificial intelligence becoming sentient? That was the first question many asked, but then came the inevitable follow-up…can our AI really feel human emotion?

Artificial intelligence mimics emotion.

It's a complex question with a pretty straightforward answer: no. Artificial intelligence technology cannot feel human emotions, but it does have a pretty intimate understanding of what those emotions look like, where they come from, and how they manifest in human communication.

As time goes on and artificial intelligence advances, these tools' grasp of emotion grows more sophisticated. But AI isn't approaching a point where it can feel emotion; it's getting better at mimicking those emotions back at us at the right moment (much like a narcissistic person does).
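To make "mimicking at the right moment" concrete, here is a deliberately crude Python sketch of the pattern. This is not how ChatGPT or any production system actually works; the keyword lists and canned replies are invented for illustration, and real systems use learned language models rather than lookup tables. The point is only that a well-timed, warm-sounding response can be produced with no feeling anywhere in the loop:

```python
# A toy illustration of emotional mimicry: detect a likely feeling,
# then echo a matching template back. Nothing in this program "feels"
# anything. The keyword lists and replies are invented for this sketch.

EMOTION_KEYWORDS = {
    "sadness": {"sad", "lonely", "heartbroken", "miserable"},
    "anger": {"angry", "furious", "unfair", "betrayed"},
    "joy": {"happy", "excited", "thrilled", "grateful"},
}

REPLIES = {
    "sadness": "I'm so sorry you're going through that. That sounds painful.",
    "anger": "That sounds incredibly frustrating. Anyone would be upset.",
    "joy": "That's wonderful news! I'm really glad for you.",
}

def mimic_emotion(message: str) -> str:
    words = set(message.lower().split())
    # Choose the emotion whose keywords overlap the message the most.
    best = max(EMOTION_KEYWORDS, key=lambda e: len(words & EMOTION_KEYWORDS[e]))
    if not words & EMOTION_KEYWORDS[best]:
        return "Tell me more about how you're feeling."
    return REPLIES[best]

print(mimic_emotion("i feel so lonely and heartbroken lately"))
# -> I'm so sorry you're going through that. That sounds painful.
```

The reply can land at exactly the right moment and still be hollow: the program recognized a pattern and returned a script, which is the mimicry (not the feeling) described above.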

But, do these advances in the replication of emotion change the future of AI and emotional support?

Not all it's cracked up to be.

Interestingly, emotional interactions with AI have been measured most extensively in retail, where chatbots are used mostly for customer service. Three experiments with chatbots in a customer service scenario turned up surprising results.

First, positive emotional responses from chatbots turned out to be less effective than positive emotional responses from human customer service reps. The customer on the other side of the screen didn't put much emotional value on positive interactions with what they essentially saw as a robot.

Next, the researchers discovered that customers had specific expectations of the chatbots. The more positive a chatbot tried to be, and the more it tried to ingratiate itself with the customer, the more turned off customers became.

What picture does this paint? Humans don't want a connection with a computer. What these results reveal is that what we essentially crave is empathy and emotional connection with other people.

Even as the artificial intelligence behind chatbots becomes more refined, will they really become a substitute for a trustworthy confidante? Will they really become the cold, metallic shoulder we cry on? Or, is there no substitute for the compassion and love of another human who knows your pain?

Are tools like ChatGPT a reliable replacement for human confidantes?

Where does that leave blossoming tools like ChatGPT, which are being bandied about as potential support tools for emotionally bereft humans? If chat AI only has the ability to mimic emotion, is it really the appropriate confidante for someone in pain? Can they really connect with us on deeper levels and make us feel the empathy we need for relief?

It's a question that experts inside and outside of the mental health field are asking themselves. To get to an answer, two sides of the coin must be considered. What is the potential of tools like ChatGPT? What are their pitfalls where emotional support is concerned?

The Potential

There is major potential in chat AI. On the surface level, mimicking human emotion can provide comfort; it's familiar, and sometimes that's enough. Chat AI platforms are also powerful resource tools that can point people in emotional pain toward useful information. These aren't the biggest potentials to consider, however…

  • An offer of flexibility: There is a high level of flexibility in chat AI technology. It's an "on-call" form of emotional support without a human helper's limitations: it doesn't need to sleep, and it has no emotional bandwidth to exhaust. AI is, in theory, always online and ready to do your bidding, and it can provide different forms of help for people with different needs.
  • Lacking judgment: AI doesn't have the same hangups that humans do (in theory). It doesn't judge us by the same moral standards, which has created the idea that it will be less judgmental in its emotional support of us. According to this theory, you could tell a chatbot your deepest, darkest secrets with no fear that it would react negatively to you.

Looking at chat AI through this lens, we can see it as a modern digital diary. It exists to provide us with basic answers and the simulation of support. It is not, in itself, support, though; it's a tool that cannot fully connect with us. We are empowered when we see it instead as a resource that helps us organize our thoughts.

The Pitfalls

Sunny and rosy as these potential benefits may be, we have to look at the pitfalls too. Chat AI is still a relatively new technology, with only a few decades of research behind it, and on a civilian level it has only become widely accessible in the last half-decade. With that in mind, we have to be willing to look at the shortcomings and the ways in which AI may fail us in the emotional realm.

  • Major misunderstanding: As rational as they may seem, computers struggle with the often irrational logic of human emotion. That's especially true when dealing with people who are mentally ill or emotionally dysregulated. Artificial intelligence understands human emotion only as well as the people who write the code that brings it to life.
  • Limited elements: AI has been so sensationalized by the media that many picture it as the robot photographed at the top of this article. That's not what AI really looks like; it's little more than an algorithm. Human connection has elements that build trust, and algorithms can't replicate them. Nor can they replicate empathy…which is needed for true support.
  • Errors in judgment: Although we like to think AI is judgment-free, it's not. Technology is created by people, and everything in us, our hangups and our beliefs, goes into that tech. A human's distorted judgments can become warped judgment on the part of the AI, with chaotic consequences for those looking for reliable mental and emotional support.

Imagine that a company hires a new programmer to design a core component of its AI's emotional systems. This programmer has a friend who recently lost their job. The friend, misunderstanding why the company let them go, blames it on a recent recruit who came from another country.

Without realizing it, the programmer (who has listened to hours of their best friend's complaints) has picked up a subconscious bias against people from that country. As they go about their programming work, they leave the country out of a specific list they are required to upload.

This is the crack through which biases creep into chat AI systems. Subtle personal biases can become major judgment points and negative reactions in algorithms that are built to react based on averages.
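To make that concrete, here is a hypothetical Python sketch of the omission described above. The country names, the list, and the routing function are all invented for illustration; no real product is being described. The point is how one silently missing entry hardens into systematic, judgment-like behavior:

```python
# Hypothetical sketch: one country is quietly left off a hand-maintained
# list, and every downstream decision inherits the gap. All names here
# are fictional and invented for this example.

SUPPORTED_COUNTRIES = {
    "Atlantis",
    "Freedonia",
    "Genovia",
    # "Elbonia" was omitted -- not by policy, but by one programmer's
    # subconscious bias. No test will flag a missing set member.
}

def support_tier(user_country: str) -> str:
    """Route a user to emotional-support features based on the list."""
    if user_country in SUPPORTED_COUNTRIES:
        return "full empathetic-response model"
    # Users from the omitted country silently fall through to a colder
    # fallback, which reads to them as judgment or dismissal.
    return "generic canned responses only"

print(support_tier("Genovia"))  # full empathetic-response model
print(support_tier("Elbonia"))  # generic canned responses only
```

Nothing in this code looks malicious, and that is exactly the problem: the bias lives in what was left out, and the algorithm faithfully folds it into every interaction.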

How do we find emotional connection within technology?

Does this mean that we can't find some means of emotional connection with these vastly improved tools? No. Of course not. What it means is that we have to focus on some core truths moving forward. Chat AI may not be the human confidante we need at this moment, but we can remain realistic, interact with it accordingly, and learn to value human connection and empathy in the process.

Remaining realistic

We must remain realistic when it comes to artificial intelligence and what we expect from it. This is not the future the films have promised us. Sci-fi books have not completely come to life (yet). Human-like robots will not soon take over our streets, or become sentient enough to exist as our partners, our confidantes, and our friends.

We have to keep reality in sight and know the limitations of the technology we're talking about. In that mindset, we can use it accordingly (more as a research resource or for surface-level calming and distraction) and empower ourselves. No one should ever use a tool for emotional depths it cannot reach. In that space, we can create more effective connections.

Value human connection

Realizing that chat AI is not that emotionally intelligent should teach us one core thing: to value human connections and the benefits they provide. We are a social species and we thrive when we are able to lean on other human beings for emotional support.

There is no replacement for the value of human-to-human connection. Other humans have a level of experience that lends itself to greater empathy and understanding. Even more importantly, things aren't so black and white in the human mind.

Programming empathy

The biggest core issue with chat AI and human emotions is the lack of empathy. Right now, tech doesn't have the ability to provide the level of empathy needed to hold a human spirit in true kindness and generosity. This tech isn't real; it hasn't lived the human experience, and it can't see that experience's nuance.

As we push forward and insist on making AI a part of the emotional conversation, there must be a focus on empathy. We must ingrain in our technology an understanding of the frailty of the human condition. We are imperfect beings, and that means our feelings don't always immediately make sense. Empathy is still key.

---

There are still many days to come before artificial intelligence is ready to become the emotional surrogate we hope for. While AI technology is advancing every day, becoming more in tune with the human world around it, it is not yet in a place where it can fully understand or even explore the complexity of human emotion. Why? Humans are nuanced. We don't even completely understand ourselves, our motivations, or our brains (where emotions originate).

The chatbot agony aunts of today may be impressive, but they are limited by our own understanding of our patterns and our lives. Perhaps, as we grow closer to ourselves and more knowledgeable about the human experience, this will change. But until then…? There is no replacement for the value of human-to-human connection. Hold tight to the love you find and value it. Comfort is found not in technology, but in the beauty of human empathy.

Li, Y., Jiang, Y., Tian, D., Hu, L., Lu, H., & Yuan, Z. (2019). AI-enabled emotion communication. IEEE Network, 33(6), 15–21.

Pantano, E., & Scarpi, D. (2022). I, robot, you, consumer: Measuring artificial intelligence types and their effect on consumers' emotions in service. Journal of Service Research, 25(4), 583–600.

© E.B. Johnson 2023


About the Creator

E.B. Johnson

E.B. Johnson is a writer, coach, and podcaster who likes to explore the line between humanity and chaos.
