Written by

Bernard Marr

Bernard Marr is a world-renowned futurist, influencer and thought leader in the fields of business and technology, with a passion for using technology for the good of humanity. He is a best-selling author of 20 books, writes a regular column for Forbes and advises and coaches many of the world’s best-known organisations. He has over 2 million social media followers, 1 million newsletter subscribers and was ranked by LinkedIn as one of the top 5 business influencers in the world and the No 1 influencer in the UK.

Bernard’s latest book is ‘Business Trends in Practice: The 25+ Trends That Are Redefining Organisations’


When Machines Know How You’re Feeling: The Rise Of Affective Computing

2 July 2021

The clinical, emotionless computer or robot is a staple of science fiction, but science fact is starting to change: computers are getting much better at understanding emotions.

As we turn to computers, smart devices and robots to take on more and more functions that have long been the exclusive domain of humans, this emotion-detecting technology will become increasingly important. Automated customer service "bots" will be better able to tell whether a customer is getting the help they need. Robot caregivers involved in telemedicine may be able to detect pain or depression even if the patient doesn't explicitly talk about it. One insurance company I am working with is even experimenting with voice analytics that can detect when a caller is lying to their claims handlers.

IBM has developed a Watson 'Tone Analyzer' that can detect sarcasm and a multitude of other emotions in your writing. It also offers an Emotion Analysis API that helps users understand the emotions of the people they're chatting with.
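To make the idea concrete, here is a deliberately simplified, illustrative sketch of what a tone analyser does at its core: mapping the words in a text to emotion categories. This is not IBM's actual API; the word lists and scoring are invented placeholders, and real systems use trained language models rather than fixed lexicons.

```python
# Toy lexicon-based tone scorer (illustrative only, not IBM's Tone Analyzer).
# The tone vocabularies below are invented for demonstration.
TONE_LEXICON = {
    "joy": {"great", "love", "happy", "wonderful", "thanks"},
    "anger": {"terrible", "hate", "useless", "furious", "worst"},
    "sadness": {"sorry", "miss", "lost", "unfortunately", "sad"},
}

def score_tones(text: str) -> dict:
    """Count how many words of each tone appear in the text."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return {tone: len(words & vocab) for tone, vocab in TONE_LEXICON.items()}

print(score_tones("It works great, thanks, I love it!"))
```

A production tone analyser replaces the hand-made lexicon with a model trained on labelled examples, but the interface — text in, emotion scores out — is the same.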

These sorts of advancements are important for computers and robots that hope to seamlessly interact with humans. They may not yet pass the Turing test, but recognizing emotions gets them a step closer.

Affective computing

This particular branch of computer science is known as affective computing: the study and development of systems and devices that can recognize, interpret, process and simulate human experiences, feelings and emotions.

But it's also closely related to deep learning, because complex algorithms are required for the computer to perform facial recognition, detect emotional speech, recognize body gestures and interpret other data points. The computer compares the data input — in this case, a human with whom it is interacting — against its learning database to make a judgement about the person's emotions.
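A hedged sketch of that "compare the input to a learned database" idea is nearest-neighbour matching of feature vectors. The feature vectors and labels below are invented placeholders; in a real system they would be extracted from images or audio by a trained deep network.

```python
import math

# Pretend database: (feature vector, emotion label) pairs "learned" from
# training data. Values here are made up for illustration.
DATABASE = [
    ((0.9, 0.1, 0.0), "happy"),
    ((0.1, 0.8, 0.1), "sad"),
    ((0.0, 0.2, 0.9), "angry"),
]

def classify(features):
    """Return the emotion label of the closest stored example."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(DATABASE, key=lambda entry: dist(features, entry[0]))[1]

print(classify((0.85, 0.15, 0.05)))  # closest to the "happy" example
```

Deep-learning systems replace the explicit distance lookup with learned decision boundaries, but the principle — match new input against patterns seen in training — is the same.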

An Ohio State University team programmed a computer to recognize 21 'compound' emotions, including happily surprised and sadly disgusted. And, in tests, the computer was more accurate at recognizing these subtle emotions than the human subjects were.

That's partly because the computer's pattern recognition capabilities are superior to a human's, and partly because most people use the same facial muscle movements to express the same emotions.

Potential applications

Other than avoiding a HAL 9000 scenario, the potential applications of this technology are enormous.

  • In e-learning situations, the presentation can be adapted to the learner, speeding up before they get bored and slowing down when they are confused.
  • Digital pets and companions, such as Japan's robot companions, will become more common and more lifelike.
  • Psychological health services could benefit from programs that can recognize a patient’s emotional state.
  • Companies could use the technology to conduct market research, analyzing a product tester’s actual emotions rather than simply their statements.
  • The same could be done to judge the impact of advertising or political speeches and statements.
  • Security companies could use the technology to identify individuals in crowds who seem nervous as potential threats.
  • Your computer might even be able to warn you to pause before you send an angry email, change the music track to fit your mood, or disable your car if you’re in an emotionally volatile state.
  • The technology is also being used to help people with autism and other disabilities to interact with the rest of the world.
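The "pause before you send an angry email" idea in the list above can be sketched in a few lines. This is a toy word-count heuristic, not a real affect model; the angry-word list and threshold are invented for illustration.

```python
# Hypothetical angry-email guard (illustrative sketch only).
ANGRY_WORDS = {"furious", "ridiculous", "unacceptable", "incompetent"}

def should_pause_before_sending(email_text: str, threshold: int = 2) -> bool:
    """Suggest a cooling-off pause if too many angry words appear."""
    words = [w.strip(".,!?").lower() for w in email_text.split()]
    anger_score = sum(1 for w in words if w in ANGRY_WORDS)
    return anger_score >= threshold

draft = "This is ridiculous and frankly unacceptable."
print(should_pause_before_sending(draft))  # True
```

A real implementation would score the whole message with a sentiment or emotion model rather than counting keywords, but the decision logic — score the draft, compare against a threshold, then warn — would look much the same.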

Clearly this technology could have many benefits. But, as with any technological advance, there could also be pitfalls. Woe to the person who seems nervous in an airport when he or she is simply running late. And don't let your computer catch you making angry or mocking expressions just after a meeting with your boss.

(If you're not sure how you feel about this, why not check out one of the many emotion recognition apps that will tell you how you're feeling?)


