The setting is a busy McDonald's in downtown Boston. Much like any other day at this particular restaurant, a wide range of emotions is being felt and expressed in various forms. An employee can't help but become stressed as he struggles to keep up with the steady stream of incoming orders. An angry customer argues that he has been overcharged. An excited group of children celebrates their classmate's birthday at a table in the far corner. A businesswoman frowns at her work emails whilst grabbing a McChicken lunch.

However, unlike any other day at the restaurant, a special type of camera is watching these people's every movement. It captures even the tiniest shifts in their facial expressions: each twitch, smile or raise of an eyebrow. It feeds these observations into a computer algorithm, which interprets the expression and thereby detects each person's mood.

[Image: McDonald's restaurant (www.lovintrends.com)]

What this algorithm found in its observations at a busy McDonald's might be obvious to most of us: staff get frustrated with unreasonable customers, mothers get angry at their screaming children and people are unhappy waiting in the queue. What is remarkable, however, is how the system recognises even the smallest changes in facial expression and how accurately it identifies the emotion behind them. The technology is not foolproof, and can only sort emotions into seven broad categories (joy, sadness, anger, fear, contempt, surprise and disgust), yet creator Ken Denman claims it is “over 95% accurate across all the major detectors.”

The so-called ‘Emotient’ camera records at 30 frames per second, meaning it picks up on the ‘micro-expressions’ even humans tend to miss. Such expressions are believed to be innate rather than learned, as they are also exhibited by people who are blind. Because the average person does not notice these minuscule changes, Emotient has the potential to become an even better reader of emotions than humans. Could the technology be used in courtrooms to determine the genuineness of a testimony? Can it reveal whether a politician is honest, whether a poker player is bluffing or whether someone is hiding sadness behind their smile?
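
For readers curious about what such a pipeline looks like in code, here is a minimal sketch, not Emotient's actual (proprietary) implementation: it grabs webcam frames with OpenCV, finds faces with a stock Haar cascade, and hands each face crop to a hypothetical `classify_emotion` function standing in for a trained model that would return one of the seven labels.

```python
# Illustrative sketch only: frame capture + face detection + a placeholder
# classifier. The face detector is OpenCV's standard Haar cascade;
# classify_emotion is hypothetical, since Emotient's model is not public.
import cv2

EMOTIONS = ["joy", "sadness", "anger", "fear", "contempt", "surprise", "disgust"]

def classify_emotion(face_pixels):
    """Hypothetical placeholder: a real system would run a trained
    expression-recognition model here and return one of the seven labels."""
    return "joy"

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture(0)  # default webcam, typically ~30 frames per second
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        emotion = classify_emotion(gray[y:y + h, x:x + w])
        print(f"Detected face at ({x}, {y}): {emotion}")
capture.release()
```

At 30 frames per second, even a brief micro-expression spans several frames, which is what lets a system like this catch changes a human observer would miss.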

[Image: face analysis (www.endtimeheadlines.org)]

Emotient was recently purchased by Apple and is currently being used for market research. The technology measures people's reactions to various adverts and feeds those reactions back to marketers. Not particularly exciting.

However, a similar technology is being used elsewhere, and for more interesting purposes. Pepper, the adorable humanoid robot designed by Aldebaran Robotics, reacts to people's facial expressions and tone of voice. If Pepper's owner expresses sadness, Pepper seeks to comfort him or her with friendliness, jokes and games. Besides being a responsive companion, Pepper assists its owner by providing information such as weather forecasts, or by helping calculate maths problems.

Though Pepper has only been programmed to respond in specific ways to specific scenarios (see the sketch below), the technology for robots to ‘learn’, i.e. to program themselves through experience and repetition, is within our reach.
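
To make that distinction concrete, here is an illustrative sketch of the kind of hand-written rule table a companion robot like Pepper follows today. The function and response names are hypothetical and this is not Aldebaran's actual software; the point is only the contrast between a fixed lookup and a mapping that could be adjusted from experience.

```python
# Illustrative sketch only: a hand-coded emotion-to-response table of the kind
# a companion robot might follow. All names are hypothetical.
RESPONSES = {
    "sadness": "tell_joke",      # comfort the owner with friendliness and jokes
    "joy": "suggest_game",
    "anger": "speak_calmly",
}

def respond_to(detected_emotion):
    """Pick a pre-programmed response; fall back to offering information
    (weather forecasts, simple arithmetic) when no rule matches."""
    return RESPONSES.get(detected_emotion, "offer_information")

print(respond_to("sadness"))   # -> tell_joke

# A 'learning' robot would replace the fixed RESPONSES table with a policy
# updated from experience, e.g. reinforcing whichever responses its owner
# reacts to positively.
```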

Moving robots forward into an age where they can develop preferences, and know how to express those preferences in a manner which mimics emotions, has profound implications. At what point will we be forced to consider the tricky ethical questions of what constitutes robot consciousness or even self-awareness?

What rights would sentient robots have? Robots may only be interpreting emotions today, but we could well find that the near future will have them experiencing emotions too.

[Image: robot (www.shutterstock.com)]