
How AI Emotion Recognition Can Possibly Cause Harm

by Grace Tennet 8 months ago in artificial intelligence


Photo by Possessed Photography on Unsplash

New technologies emerge almost every day, and sometimes we can't keep up with them. Unfortunately, as technologies evolve, the gap between developers and ethicists widens. When do convenience and usefulness turn into human rights violations?

The artificial intelligence industry is booming: nine out of ten leading companies already invest in AI to increase conversion and boost sales. That is why we, as consumers, should be aware of the dangers the latest technologies may hold. In this post, we discuss how harmful AI emotion recognition can be, whether you are shopping online or communicating with a chatbot.

Is Emotion Recognition That Innocent?

Emotion recognition, as the name suggests, is the process of identifying emotions. The main sources of data are video and audio materials, texts, and physiological indicators from wearables. The technology is applied in numerous fields: to understand what people think, to help children with autism read facial expressions, to make machine-human interaction more organic, and much more. In 2015, Snapchat even patented a method that collects data about crowds and their emotions using selfies. However, emotion recognition has raised concerns among researchers and ethicists, who want to start a public discussion about whether the technology is as harmless as it seems at first glance.

A team of experts has created Emojify, a website where anyone can test emotion recognition using their computer camera. There is a game where you can try to trick the software, and another where you can test the technology in context. One of the most popular challenges is to fake an emotion and see whether the system can find out the truth. The researchers' aim is to raise awareness of emotion recognition and its use.

The Main Criticism

Dr. Alexa Hagerty, a researcher at the University of Cambridge, says that emotion recognition goes beyond simple identification because it claims to read our inner emotions. Potentially, the technology may harm freedom of expression or even enable police discrimination. According to Hagerty, one of the biggest problems is that people are not aware of how these technologies work, or of how far they have penetrated our daily lives: from airport security to hiring procedures.

Emotion recognition is used all over the world. For example, Taigusys, a Shenzhen company, applies the technology in prisons and care homes, while Lucknow, a city in India, is planning to use emotion recognition to detect distress in women who are being harassed. Alexa Hagerty acknowledges the technology's positive aspects, but she is sure that the drawbacks, such as racial bias, may outweigh the benefits.

Public discussion may be a good way forward, and that is what Emojify is here for. It lets anyone try emotion recognition and get familiar with the technology. The site doesn't collect personal data, including the photos you take.

The people behind the technology claim that it reads emotions. In reality, however, it only reads facial movements, compares them against a database, and makes predictions. For example, when you smile, the technology says that you are happy. But is that really so? Enough research has been done to show that the approach is too simplistic: it is possible to fake a smile, anger, or sadness without feeling those emotions. Researchers working on the technology say they know about these limitations, but that is not enough.
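The matching step described above can be sketched in a few lines of Python. This is a deliberately minimal, hypothetical illustration, not any vendor's real model: the facial "action unit" codes and the label database here are invented for the example. The point it demonstrates is the article's criticism: the system matches observed movements against stored patterns, so a posed smile and a genuine one produce the same prediction.

```python
# Hypothetical lookup of facial-movement patterns to emotion labels.
# "AU" codes stand for facial action units (e.g. AU12 = lip-corner
# puller, i.e. a smile); the mapping below is illustrative only.
EMOTION_DATABASE = {
    frozenset({"AU6", "AU12"}): "happy",   # cheek raise + smile
    frozenset({"AU12"}): "happy",          # smile alone also maps to happy
    frozenset({"AU4", "AU15"}): "sad",     # lowered brows + lip corners down
    frozenset({"AU4", "AU5", "AU23"}): "angry",
}

def predict_emotion(observed_units: set) -> str:
    """Match observed facial movements against the database.

    No inner state is measured, so a deliberately faked smile is
    indistinguishable from genuine joy."""
    return EMOTION_DATABASE.get(frozenset(observed_units), "neutral")

# A posed smile yields the same label as a heartfelt one.
print(predict_emotion({"AU12"}))         # "happy"
print(predict_emotion({"AU6", "AU12"}))  # "happy"
```

The design choice worth noticing is that the prediction is a pure pattern lookup: nothing in it distinguishes performed expressions from felt emotions, which is exactly the limitation the researchers point to.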

Other Opinions

The Emojify project is funded by Nesta, a UK innovation foundation for social good, and aims to raise awareness of emotion recognition and to boost public debate. More and more people are starting to realize that we are no longer passive consumers of technology. We are actually living INSIDE these technologies every time we play slot games, make online purchases, use search engines, or carry out other everyday operations.

Vidushi Marda, a senior programme officer at Article 19, an organization that protects human rights, argues that the market for emotion recognition technologies should be paused. She believes the software is not only discriminatory but also violates human rights, and that one of the best things we can all do is discuss the topic, research the harm of emotion recognition, and engage governmental authorities and international organizations with the problem.


Grace Tennet

Kia ora! I'm Grace, a content manager from New Zealand. I'm a big fan of online gambling and want to share my knowledge with others on this page!



© 2021 Creatd, Inc. All Rights Reserved.