Written by

Bernard Marr

Bernard Marr is a world-renowned futurist, influencer and thought leader in the fields of business and technology, with a passion for using technology for the good of humanity. He is a best-selling author of 20 books, writes a regular column for Forbes and advises and coaches many of the world’s best-known organisations. He has over 2 million social media followers, 1 million newsletter subscribers and was ranked by LinkedIn as one of the top 5 business influencers in the world and the No 1 influencer in the UK.

Bernard’s latest book is ‘Business Trends in Practice: The 25+ Trends That Are Redefining Organisations’

What is Affective Computing And How Could Emotional Machines Change Our Lives?

2 July 2021

How would your computer respond if you looked frustrated or upset?  Could your phone comfort you if you were sad after getting a call?  Could your smart home adjust the music, lighting, or other aspects of the environment around you after you’ve had a bad day at work — without being asked?

It may seem far-fetched, but computers that can read your emotions and have some level of “emotional intelligence” are not far off.  The field is called affective computing, and it’s being developed for use in many applications.

Affective computing is not a new field, but it is becoming more relevant today, especially when combined with big data, robotics and machine learning.

Why do we want a computer to empathize with us?

Emotions are a fundamental part of the human experience — but they’ve long been ignored by technology development because they seemed difficult to quantify and because the technology didn’t really exist to read them. This has resulted in sometimes frustrating user experiences.

If you’ve ever gotten angry at the “helpful” wizards in a computer help program, you know what I’m talking about.

Programs are being developed that can use facial expressions and micro-expressions, posture, gestures, tone of voice, speech, and even the rhythm and force of keystrokes or the temperature of your hands to register changes in a user’s emotional state. Cameras and other sensors send the input data to deep learning algorithms that estimate what your emotional state might be — and the system then reacts accordingly.
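To make the pipeline concrete, here is a minimal, purely illustrative sketch in Python. The feature names, the centroid values, and the response mapping are all hypothetical assumptions invented for this example — a real system would learn its model from large labelled datasets — but the structure (sensor readings reduced to a feature vector, matched against learned emotion profiles, then mapped to an adaptive response) follows the approach described above.

```python
# Illustrative affective-computing sketch: classify an emotional state from a
# feature vector using nearest-centroid matching, then choose a response.
# All feature names and numbers below are hypothetical, for illustration only.
import math

# Hypothetical per-emotion feature centroids, as if learned from labelled data.
# Features: (keystroke interval in seconds, keystroke force 0-1, hand temp in C)
CENTROIDS = {
    "calm":       (0.30, 0.40, 33.0),
    "frustrated": (0.12, 0.85, 34.5),
    "fatigued":   (0.55, 0.30, 32.0),
}

def classify_emotion(features):
    """Return the emotion whose centroid is closest in Euclidean distance."""
    def dist(centroid):
        return math.sqrt(sum((f - c) ** 2 for f, c in zip(features, centroid)))
    return min(CENTROIDS, key=lambda emotion: dist(CENTROIDS[emotion]))

def respond(emotion):
    """Map the detected state to an adaptive system response."""
    return {
        "calm": "no change",
        "frustrated": "offer help and simplify the interface",
        "fatigued": "suggest a break",
    }[emotion]

reading = (0.14, 0.80, 34.2)   # fast, hard keystrokes and warm hands
state = classify_emotion(reading)
print(state, "->", respond(state))  # prints: frustrated -> offer help and simplify the interface
```

In practice the classifier would be a deep neural network trained on camera and sensor data rather than a hand-tuned distance check, but the input-classify-respond loop is the same.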

And the applications of these tools are practically limitless. E-learning programs could automatically detect when the learner is having difficulty and offer additional explanations or information. E-therapy could help deliver psychological health services online, potentially approaching the effectiveness of in-person counseling.

Companies including the BBC, CBS, Coca-Cola and Disney are already partnering with a leading company in recognizing facial expressions, Affectiva, to test the effectiveness of advertisements, and how viewers react to film trailers and TV shows.

The company is now working with “a very large Japanese car company” to create in-car technology that can sense when you’re drowsy or distracted, and can contact emergency services or a friend or family member in an emergency situation.

Microsoft even recently tested a bra that can sense stress levels and warn women not to overeat!

Other applications are being created to help people on the autism spectrum interact with others. People with autism typically have difficulty recognizing the emotions of others, and small, wearable devices can alert them to another person’s emotional state to help them react and interact in social situations.

Another medical device can alert the wearer to changes in their biometric data (heart rate, temperature, etc.) in the moments before, during, and after a dangerous epileptic seizure.

Additional labs are working on devices that can sense everything from pain to depression — conditions that can be difficult to diagnose accurately. The idea is to replace the subjective “pain scale” that asks patients to rate their own pain from 1–10 (an obvious problem when one person’s 5 is another person’s 8). Other programs could monitor movements associated with chronic pain and suggest physiotherapy to relieve it.

The age of emotional machines is coming

Just as “artificial intelligence” isn’t the same as human intelligence (computers “think” in fundamentally different ways than the human brain), so too “emotional” machines won’t really be emotional.

But by combining affective computing with machine learning, big data, and robotics, we are approaching a time when machines will at least seem to respond to us with sympathy and other emotional responses.

Your refrigerator might suggest you skip the ice cream tonight because your stress levels are high. Your car might warn you to drive carefully because you seem tense.  Your phone might encourage you to take breaks because you’re getting frustrated.

Robots already exist that can recognize the faces of different family members and respond accordingly. Soon they will be able to recognize your emotions as well and offer helpful suggestions.

And the age of the “emotional” computer will be upon us.
