
How could Google Glass detect people's emotions?


SHORE is designed to interpret a wide range of visual cues.
© Fraunhofer IIS/Jens Garbas

We humans mask our intentions with lies, misdirection and misinformation. But one of the most telling aspects of interpersonal communication isn't words. It's body language. Some researchers say that more than half of our communication happens through body language, with tone of voice and spoken words a distant second and third, respectively [source: Thompson].

These days, it's not just people reading body language. Machines are picking up on those nonverbal cues, too, to the point where some can even read our emotions.

Take the SHORE Human Emotion Detector, an app (or "glassware") for Google Glass, Google's wearable computer. SHORE, short for Sophisticated High-Speed Object Recognition Engine, was originally created for object recognition by Germany's Fraunhofer Institute.

To a computer, your face is ultimately just another object, albeit one with all sorts of unique contours and shifting topography. When performing its calculations, all SHORE needs is a simple digital camera like the one found on Google Glass. At around 10 frames per second, it analyzes incoming image data and compares it against a database of 10,000 faces that were used to calibrate the software.

Using those comparisons, along with on-the-fly measurements of your face, SHORE can make a pretty good guess as to whether you're happy, sad, surprised or angry. About 94 percent of the time, SHORE knows if you're male or female. It'll take a stab at guessing your age, too.
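To picture what that processing loop looks like, here's a minimal sketch in Python. It is not Fraunhofer's actual software or API; every function here is a hypothetical stand-in, and the "classification" is a toy rule. It only illustrates the general idea the article describes: grab a camera frame roughly 10 times a second, measure the face, and compare those measurements against calibrated reference data to guess emotion, gender and age.

```python
# Hypothetical sketch of per-frame emotion analysis (not SHORE's real code).
import time
import random

def grab_frame():
    """Stand-in for reading one image from a Glass-style camera."""
    return object()  # placeholder frame

def measure_face(frame):
    """Stand-in for locating a face and extracting simple measurements
    (e.g., mouth curvature, brow position)."""
    return {"mouth_curve": random.uniform(-1, 1),
            "brow_raise": random.uniform(0, 1)}

def classify(features):
    """Toy thresholds standing in for comparison against a calibration
    database like the ~10,000 faces the article mentions."""
    if features["mouth_curve"] > 0.3:
        emotion = "happy"
    elif features["mouth_curve"] < -0.3:
        emotion = "angry" if features["brow_raise"] > 0.5 else "sad"
    else:
        emotion = "surprised" if features["brow_raise"] > 0.7 else "neutral"
    return {"emotion": emotion,
            "gender": random.choice(["male", "female"]),  # illustrative only
            "age_estimate": random.randint(18, 70)}       # illustrative only

if __name__ == "__main__":
    for _ in range(10):           # about one second of analysis
        result = classify(measure_face(grab_frame()))
        print(result)
        time.sleep(0.1)           # roughly 10 frames per second
```

The real system obviously does far more sophisticated image analysis, but the structure, camera frame in, facial measurements out, classification against reference data, is the same basic pipeline.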

The Google Glass display can provide a continuous feed of visual updates with all of the data SHORE produces, and if you want, audio cues are available as well. What you do with these insights is up to you. Maybe that guy really is into you. Or maybe he's the geeky type who really just wants to get his hands on your Google Glass.

Kidding aside, Fraunhofer wants consumers and companies to understand that there are some serious uses for SHORE. People with conditions such as autism and Asperger's often struggle to interpret emotional cues from others. Real-time feedback from software like SHORE may help them fine-tune their own emotional toolbox to better understand interpersonal give and take.

Car makers could integrate SHORE into their vehicles to detect driver drowsiness and sound an alarm to rouse drivers in danger of drifting off at the wheel.

Medical personnel could use SHORE to better identify physical pain in patients. SHORE may even detect psychological distress like depression, which is notoriously difficult to spot in many people. In assisted living situations, SHORE could keep a tireless eye on patients to ensure that they're safe.

And of course, there's a money-making side to SHORE. Marketing companies could deploy the app to gauge consumers' reactions to, say, a product commercial or movie trailer, and get a better sense of how effective an advertising campaign might be.
