What Is Affective Computing And How Could Emotional Machines Change Our Lives?
2 July 2021
How would your computer respond if you looked frustrated or upset? Could your phone comfort you if you were sad after getting a call? Could your smart home adjust the music, lighting, or other aspects of the environment around you after you’ve had a bad day at work — without being asked?

It may seem far-fetched, but computers that can read your emotions and have some level of “emotional intelligence” are not far off. The field is called affective computing, and it’s being developed for use in many applications.
Affective computing is not a new field, but it is becoming more relevant today, especially when combined with big data, robotics, and machine learning.
Why do we want a computer to empathize with us?
Emotions are a fundamental part of the human experience — but they’ve long been ignored by technology development because they seemed difficult to quantify and because the technology didn’t really exist to read them. This has resulted in sometimes frustrating user experiences.
If you’ve ever gotten angry at the “helpful” wizards in a computer help program, you know what I’m talking about.
Programs are being developed that can register changes in a user’s emotional state from facial expressions and micro-expressions, posture, gestures, tone of voice, speech, and even the rhythm or force of keystrokes and the temperature of the hands. Cameras and other sensors feed this data to deep learning algorithms that estimate what your emotional state might be, so the system can react accordingly.
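To make that pipeline concrete, here is a minimal sketch, in Python with TensorFlow/Keras, of the kind of deep learning classifier such a system might use to map a camera’s face crops to emotion labels. The tiny architecture, the seven emotion labels, and the 48x48 grayscale input are illustrative assumptions borrowed from common public facial-expression datasets, not the design of any product mentioned in this article.

```python
# Illustrative sketch only: a small CNN that maps 48x48 grayscale face crops
# to probabilities over a handful of emotion labels. The labels, input size,
# and architecture are assumptions, not any vendor's actual system.
import numpy as np
import tensorflow as tf

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def build_emotion_classifier() -> tf.keras.Model:
    """A deliberately small convolutional network for face-crop emotion recognition."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(48, 48, 1)),      # one grayscale face crop
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(len(EMOTIONS), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_emotion_classifier()
    # Stand-in for a face crop detected in a camera frame; a real system would
    # first run face detection and normalization on the video stream.
    fake_face = np.random.rand(1, 48, 48, 1).astype("float32")
    probabilities = model.predict(fake_face, verbose=0)[0]
    print(EMOTIONS[int(np.argmax(probabilities))])  # untrained, so effectively random
```

In a real product, a classifier like this would be only one signal among several (voice, posture, keystroke dynamics), fused together before the application decides how to respond.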
And the applications of these tools are practically limitless. E-learning programs could automatically detect when a learner is having difficulty and offer additional explanations or information. E-therapy could deliver psychological health services online, potentially as effectively as in-person counseling.
Companies including the BBC, CBS, Coca-Cola, and Disney are already partnering with Affectiva, a leading facial-expression-recognition company, to test the effectiveness of advertisements and to gauge how viewers react to film trailers and TV shows.
Affectiva is now working with "a very large Japanese car company" to create in-car technology that can sense when you are drowsy or distracted and contact emergency services, a friend, or a family member in an emergency.
Microsoft even recently tested a bra that can sense stress levels and warn women not to overeat!
Other applications are being created to help people on the autism spectrum interact with others. People with autism often have difficulty recognizing the emotions of others, and small wearable devices can alert them to another person’s emotional state, helping them respond appropriately in social situations.
Another medical device can alert the wearer to changes in their biometric data (heart rate, temperature, etc.) in the moments before, during, and after a dangerous epileptic seizure.
Other labs are working on devices that can sense everything from pain to depression, both of which can be difficult to diagnose accurately. The idea is to replace the subjective “pain scale” that asks patients to rate their own pain from 1 to 10 (an obvious problem when one person’s 5 is another person’s 8). There are even programs that could monitor the movements associated with chronic pain and suggest physiotherapy to relieve it.
The age of emotional machines is coming
Just as “artificial intelligence” isn’t the same as human intelligence (computers “think” in fundamentally different ways from the human brain), so too “emotional” machines won’t really be emotional.
But by combining affective computing with machine learning, big data, and robotics, we are approaching a time when machines will at least seem to respond to us with sympathy and other emotional reactions.
Your refrigerator might suggest you skip the ice cream tonight because your stress levels are high. Your car might warn you to drive carefully because you seem tense. Your phone might encourage you to take breaks because you’re getting frustrated.
Robots already exist that can recognize the faces of different family members and respond accordingly. Soon they will be able to recognize your emotions as well and offer helpful suggestions.
And the age of the “emotional” computer will be upon us.