How could devices that understand our emotions change our world?
Speech-to-text that can properly annotate sarcasm, facial recognition that can detect sadness or anger, artificial-intelligence-powered therapists: the possibilities are nearly endless in a world with affective computing. Even with years of development still needed, we can easily imagine a world with affective computing at play.
Affective computing sounds like something out of the Terminator franchise, but it is far from a fictional world dominated by machines; it is closer to a future of robots programmed to hug you when you’re feeling blue.
But how do you input emotions into a machine without truly understanding emotions? Paul Ekman, an American psychologist and professor at the University of California, performed a cross-cultural study to test the hypothesis of universal emotions. He categorized emotions into six basic types: anger, disgust, fear, happiness, sadness, and surprise. He later expanded his list to include amusement, contempt, contentment, embarrassment, excitement, guilt, pride in achievement, relief, satisfaction, sensory pleasure, and shame.
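Ekman's six basic emotions map naturally onto the label set an emotion classifier would predict over. A minimal sketch in Python (the names and the `to_label` helper are illustrative, not from any specific library):

```python
from enum import Enum

class BasicEmotion(Enum):
    """Ekman's six basic emotions, usable as classifier labels."""
    ANGER = "anger"
    DISGUST = "disgust"
    FEAR = "fear"
    HAPPINESS = "happiness"
    SADNESS = "sadness"
    SURPRISE = "surprise"

def to_label(prediction: str) -> BasicEmotion:
    """Map a raw model prediction string onto the fixed label set."""
    return BasicEmotion(prediction.lower())
```

A real system might extend the enum with Ekman's later additions, but most facial-expression datasets still label images with just these six (plus "neutral").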
With a framework for how emotions can be recognized universally, the idea of a program that comprehends them doesn't sound too outlandish. Four companies are leading the charge, with Affectiva, NVISO, Cogito, and Sightcorp on the front lines of this emerging and quickly growing industry.
The potential benefits of affective computing are vast: the ability to identify mental health issues, the possibility of creating sentient life, smart homes controlled entirely by gestures, and a major stepping stone toward true artificial intelligence.
The downsides of affective computing are, unfortunately, a side effect of anyone having access to so powerful a technology. Companies could survey people's emotional reactions to advertisements and use that data to serve targeted ads. Hackers could exploit recorded facial patterns to make deepfakes of anyone's face. And many will protest the idea of artificial intelligence because of the stigma surrounding the topic, fueled by popular culture's negative portrayals of it.
Speech and gestures are the best way to read a mood without being told the mood explicitly. We can see patterns in speech shift by tracking average speech rate, accent, stress frequency, loudness, and pitch. Gestures and posture are also telltale signs of mood: lifting the shoulders signals not knowing the answer to a question, while simple acts like waving or clapping convey basic emotions.
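Loudness and pitch are two of the simplest speech features to extract from raw audio. The sketch below, using only NumPy on a synthetic tone, shows the basic idea (a real affect system would use a dedicated speech library and many more features; the function names here are my own):

```python
import numpy as np

def rms_loudness(signal):
    """Root-mean-square amplitude: a rough proxy for perceived loudness."""
    return float(np.sqrt(np.mean(signal ** 2)))

def estimate_pitch(signal, sample_rate, fmin=80.0, fmax=400.0):
    """Estimate fundamental frequency from the autocorrelation peak."""
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lo = int(sample_rate / fmax)   # shortest lag worth considering
    hi = int(sample_rate / fmin)   # longest lag worth considering
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sample_rate / lag

# One second of a steady 220 Hz tone, standing in for a calm voice.
sr = 8000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
```

Tracking how these numbers drift over a conversation, rather than their absolute values, is what would hint at shifts in mood.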
Heart rate monitors can be used to gauge a patient's fear or calm. Facial recognition can detect pain or anguish. The medical field would benefit greatly from being able to understand what a patient is feeling even when the patient cannot properly describe their issue. A.I. could hypothetically be trained to perform surgeries, removing human error from life-saving operations and lowering the number of people who die from medical errors, estimated at 251,000 annually.
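Heart-rate-based affect sensing can be as simple as comparing an observed reading against the patient's resting baseline. A minimal sketch, with thresholds chosen purely for illustration (they are not clinical values):

```python
def stress_level(resting_bpm, observed_bpm):
    """Crude arousal score: relative elevation above resting heart rate.

    Thresholds are illustrative only; a real monitor would calibrate
    per patient and combine heart rate with other signals.
    """
    elevation = (observed_bpm - resting_bpm) / resting_bpm
    if elevation > 0.5:
        return "high"
    if elevation > 0.2:
        return "elevated"
    return "calm"
```

For example, a patient with a resting rate of 60 bpm reading 120 bpm would be flagged as highly aroused, prompting a nurse to check in even if the patient says nothing.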
Communicative technology could also aid learning and support autistic patients. Giving students their own affective technology would let teachers see how students are actually responding to their assigned work, which is especially helpful in online or distance learning. Affective games have also been used in the medical field to study behavior and psychological well-being.
As science propels us further into the future, it is amazing yet terrifying to imagine what our tech-filled future could hold for the human race.