

How could Google Glass detect people's emotions?

Some Good Face Time

While Google Glass has been the primary showcase hardware for SHORE, the app can potentially work on computers, tablets and smartphones as well.
Stephen Lam/Getty Images

All of SHORE's calculations happen on the local device. And it all starts with face detection. SHORE detects faces correctly approximately 91.5 percent of the time. It recognizes whether your face is forward, rotated or pointed to one side in profile. It tracks the movement of your eyes, nose, lips and other facial features and checks the position of each against its face database. The system works best when your subject is facing you and within 6 feet (2 meters) or so.
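The idea of checking detected features "against a database" can be pictured as a nearest-template match. Here's a minimal, purely hypothetical sketch (not SHORE's actual code or data): normalized landmark positions for the eyes, nose and mouth corners are compared against stored reference templates, and the closest template wins.

```python
import math

# Hypothetical illustration only, not SHORE's actual algorithm or data.
# Each template holds normalized (x, y) landmark positions for the
# left eye, right eye, nose tip, and the two mouth corners.
TEMPLATES = {
    "frontal": [(0.30, 0.35), (0.70, 0.35), (0.50, 0.55), (0.35, 0.75), (0.65, 0.75)],
    "profile": [(0.55, 0.35), (0.75, 0.38), (0.70, 0.55), (0.60, 0.75), (0.80, 0.74)],
}

def match_pose(landmarks):
    """Return the template whose landmarks are closest on average."""
    def mean_dist(a, b):
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
    return min(TEMPLATES, key=lambda name: mean_dist(landmarks, TEMPLATES[name]))

# Landmarks from a (made-up) detected face, nearly frontal:
detected = [(0.31, 0.34), (0.69, 0.36), (0.50, 0.56), (0.36, 0.74), (0.64, 0.76)]
print(match_pose(detected))  # frontal
```

A real detector works on pixel data and far richer features, but the principle is the same: measure how well the observed geometry fits each stored face model.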

The software uses tried-and-true algorithms to deduce the emotional state of each target. Because it works so quickly, it can even pick up on microexpressions, those flickers of facial expression that last for a fraction of a second and betray even people who are excellent at controlling their body language.


SHORE is best at reading really obvious emotions. People who are truly happy not only have big, toothy grins – they also smile with their eyes. People who are shocked typically have the same wide-eyed, wide-mouthed reactions. SHORE picks up on those cues easily but isn't quite as accurate when it comes to other emotions, like sadness or anger.
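Why obvious emotions are easier can be sketched with a toy classifier. This is a hypothetical illustration, not SHORE's actual method: a big smile plus narrowed eyes maps cleanly to "happy," wide eyes plus an open mouth to "surprised," while subtler states fall through the cracks.

```python
# Hypothetical sketch, not SHORE's actual algorithm: classify an
# expression from a few coarse, normalized (0..1) facial measurements.
def classify_expression(mouth_openness, mouth_corner_raise, eye_openness):
    if mouth_corner_raise > 0.6 and eye_openness < 0.5:
        return "happy"      # toothy grin, eyes narrowed by the smile
    if mouth_openness > 0.7 and eye_openness > 0.8:
        return "surprised"  # wide-open mouth and wide-open eyes
    return "neutral/unclear"  # subtler emotions need richer features

print(classify_expression(0.2, 0.8, 0.4))  # happy
print(classify_expression(0.9, 0.1, 0.9))  # surprised
```

Sadness and anger involve smaller, more ambiguous changes around the brows and mouth, which is exactly where a few coarse thresholds like these stop working.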

It's easy to be a little (or a lot) freaked out by SHORE. If a piece of software can accurately detect your mood, age and gender, why can't it identify you by name? The answer is, well, it probably could. Governments and companies have been using facial recognition technologies for years now to spot terrorists and criminals.

But SHORE doesn't share the images it captures. It doesn't even need a network connection to perform its magic. Instead, it simply uses the Glass's onboard CPU to do its work. That means none of the images go into the cloud or online where they could be used for either good or nefarious purposes. It also means SHORE won't identify people, which in theory, alleviates a major concern about privacy.

Although Fraunhofer chose to showcase SHORE on Google Glass, other devices can use SHORE. Any computer with a simple camera, such as a smartphone or tablet, may eventually be able to install SHORE for emotion detection purposes.

Because of its partnership with Google, SHORE may be one of the most visible emotion detection applications, but it certainly isn't the only one. Many other companies see value in automated emotion identification, and because there's a lot of room for improvement, the competition is tough. Every company has to deal with challenges like poor lighting, CPU processing speeds, device battery life and similar restrictions.

But just as voice recognition software has improved immeasurably in recent years, you can expect that emotion recognition will improve, too. The emotional computing age is coming. Let's hope our machines handle all of our human frailty with the concern and caring that we deserve.

Author's Note: How could Google Glass detect people's emotions?

As emotion recognition applications advance, software of all kinds will transform in weird and wonderful ways. Your smartphone will see that you're sad and display funny pictures of puppies and kittens. When you're angry, it will show soothing pictures of nature and automatically dial your therapist. When you look a bit tipsy, it will order pizza so that you don't drive to pick it up yourself. Or maybe your smart device will actually be rather tone deaf to your needs, a lot like your ex. Either way, you can bet that clever programmers will find all sorts of ways to integrate emotion detection into upcoming apps, for better or worse.

Related Articles


  • Anthony, Sebastian. "Real-Time Emotion Detection with Google Glass an Awesome, Creepy Taste of the Future of Wearable Computers." ExtremeTech. Sept. 4, 2014. (Oct. 2, 2014)
  • Diep, Francie. "A Google Glass App that Detects People's Emotions." Popular Science. Sept. 2, 2014. (Oct. 2, 2014)
  • Dutton, Jack. "This App Lets People Wearing Google Glass See Your True Emotions." Business Insider. Sept. 1, 2014. (Oct. 2, 2014)
  • Fraunhofer press release. "A Visionary World Premiere: Fraunhofer IIS Presents World's First Emotion Detection App on Google Glass." Aug. 27, 2014. (Oct. 2, 2014)
  • Fraunhofer corporate site. "SHORE Software Solutions for Various Applications." (Oct. 2, 2014)
  • Lavars, Nick. "Fraunhofer's Google Glass App Detects Human Emotions in Real Time." Gizmag. Aug. 27, 2014. (Oct. 2, 2014)
  • O'Neil, Lauren. "Could a Google Glass App That Detects Human Emotion Help Those with Autism?" CBC News. Sept. 2, 2014. (Oct. 2, 2014)
  • Thompson, Jeff. "Is Nonverbal Communication a Numbers Game?" Beyond Words in Psychology Today. Sept. 30, 2011. (Oct. 2, 2014)
  • Truong, Alice. "This Google Glass App will Detect Your Emotions, Then Relay Them Back to Retailers." Fast Company. March 6, 2014. (Oct. 2, 2014)