
How could Google Glass detect people's emotions?


Some Good Face Time
While Google Glass has been the primary showcase hardware for SHORE, the app can potentially work on computers, tablets and smartphones as well.
Stephen Lam/Getty Images

All of SHORE's calculations happen on the local device. And it all starts with face detection. SHORE detects faces correctly approximately 91.5 percent of the time. It recognizes whether your face is forward, rotated or pointed to one side in profile. It tracks the movement of your eyes, nose, lips and other facial features and checks the position of each against its face database. The system works best when your subject is facing you and within 6 feet (2 meters) or so.

The software uses tried-and-true algorithms to deduce the emotional state of each target. Because it works so quickly, it can even pick up on microexpressions, those flickers of facial expression that last for a fraction of a second and betray even people who are excellent at controlling their body language.

SHORE is best at reading really obvious emotions. People who are truly happy not only have big, toothy grins – they also smile at the eyes. People who are shocked typically have the same wide-eyed, wide-mouthed reactions. SHORE picks up on those cues easily but isn't quite as accurate when it comes to other emotions, like sadness or anger.
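To make the pipeline concrete, here is a minimal sketch of the kind of logic the article describes: detect a face with sufficient confidence and at close enough range, then classify the expression from a few facial cues. All of the names, fields, and thresholds below are illustrative assumptions for this sketch — SHORE's actual interface is not public in this form.

```python
from dataclasses import dataclass

# Hypothetical observation of one face in a frame. Cue values are
# normalized 0.0-1.0; none of these field names come from SHORE itself.
@dataclass
class FaceObservation:
    confidence: float   # detector confidence in the face itself
    distance_m: float   # estimated subject distance, in meters
    smile: float        # strength of the mouth-smile cue
    eye_crinkle: float  # "smiling at the eyes" cue
    mouth_open: float   # wide-open-mouth cue
    eyes_wide: float    # wide-eyed cue

DETECTION_THRESHOLD = 0.915  # article cites ~91.5 percent detection accuracy
MAX_DISTANCE_M = 2.0         # works best within 6 feet (2 meters) or so

def classify_emotion(obs: FaceObservation) -> str:
    """Return a coarse emotion label, or 'unknown' if detection fails."""
    if obs.confidence < DETECTION_THRESHOLD or obs.distance_m > MAX_DISTANCE_M:
        return "unknown"
    # Genuine happiness needs both a big smile and crinkled eyes.
    if obs.smile > 0.7 and obs.eye_crinkle > 0.5:
        return "happy"
    # Shock: the typical wide-eyed, wide-mouthed reaction.
    if obs.eyes_wide > 0.7 and obs.mouth_open > 0.7:
        return "surprised"
    # Subtler states like sadness or anger are harder to call.
    return "neutral"

# A nearby, clearly smiling subject classifies as happy.
print(classify_emotion(FaceObservation(0.95, 1.2, 0.9, 0.8, 0.1, 0.2)))
```

Note that everything here runs on simple per-frame values — no network, no database of identities — which mirrors the local-only design discussed below.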

It's easy to be a little (or a lot) freaked out by SHORE. If a piece of software can accurately detect your mood, age and gender, why can't it identify you by name? The answer is, well, it probably could. Governments and companies have been using facial recognition technologies for years now to spot terrorists and criminals.

But SHORE doesn't share the images it captures. It doesn't even need a network connection to perform its magic. Instead, it simply uses the Glass's onboard CPU to do its work. That means none of the images go into the cloud or online where they could be used for either good or nefarious purposes. It also means SHORE won't identify people, which in theory, alleviates a major concern about privacy.

Although Fraunhofer chose to showcase SHORE on Google Glass, other devices can use SHORE. Any computer with a simple camera, such as a smartphone or tablet, may eventually be able to install SHORE for emotion detection purposes.

Because of its partnership with Google, SHORE may be one of the most visible emotion detection applications, but it certainly isn't the only one. Many other companies see value in automated emotion identification, and because there's a lot of room for improvement, the competition is tough. Every company has to deal with challenges like poor lighting, CPU processing speeds, device battery life and similar restrictions.

But just as voice recognition software has improved immeasurably in recent years, you can expect that emotion recognition will improve, too. The emotional computing age is coming. Let's hope our machines handle all of our human frailty with the concern and caring that we deserve.