How Do You Feel About This Tomato? Emotient Can Tell.

The new Pixar film “Inside Out” posits that everyone’s brain is piloted by a team of emotions. When a character needs to be cheered up, one of her emotions walks up to a console, pushes a button and — Joy! She smiles.

Okay, so maybe Pixar embellished the science a tad. But a machine-learning startup called Emotient is working to detect and analyze how people are feeling, even when they don’t know it.

Emotient’s software reads the expressions of either individuals or crowds of people and then crunches what it sees into data for clients, such as advertisers. The company recently shared with Re/code what it found when it tested this racy Carl’s Jr. ad via 270 over-the-Web screenings.

“With the men, the joy is starting to rise with the suggestive tomato, followed by the melons,” said lead scientist Marni Bartlett. “With women, their response is different. You’ll look at that red curve, the top line is anger, followed by disgust.”

https://www.youtube.com/watch?v=4WTA_8waxTo

Bartlett was pointing to two graphs showing how frequently certain emotions appeared over the course of the ad’s 52 seconds. Most men’s faces were indicating “joy” at the moment Charlotte McKinney bites into the hamburger, while most women were expressing “anger” — both of which, Emotient CEO Ken Denman said, could have indicated success.

“Any advertisement can be a good advertisement, even if the reaction is negative,” he said. “We’re trying to predict memorability and likability. Memorability is a really big deal in the ad space.”

The company also boasts that it can read expressions “at scale.” In other words, it can record video of a crowd, process their emotions frame by frame, and after a few hours of processing graph exactly how that crowd was feeling over time.
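The pipeline described above — score each detected face in each video frame, then aggregate the per-face scores into a crowd-level emotion timeline — can be sketched roughly as follows. This is an illustrative stand-in, not Emotient’s actual software: the `score_face` function here is a stub for a real expression classifier, and the emotion labels are assumptions.

```python
from collections import defaultdict

EMOTIONS = ["joy", "anger", "disgust", "surprise"]

def score_face(face):
    # Stand-in for a real expression classifier: returns a
    # probability per emotion for one detected face. Here the
    # "face" is just a dict of precomputed scores.
    return {e: face.get(e, 0.0) for e in EMOTIONS}

def aggregate_frames(frames):
    """For each frame (a list of detected faces), average the
    per-face emotion scores into one crowd-level score per emotion,
    producing a timeline that can be graphed over the video."""
    timeline = []
    for faces in frames:
        totals = defaultdict(float)
        for face in faces:
            for emotion, p in score_face(face).items():
                totals[emotion] += p
        n = max(len(faces), 1)  # avoid dividing by zero on empty frames
        timeline.append({e: totals[e] / n for e in EMOTIONS})
    return timeline

# A tiny two-frame "crowd": joy rises between frame 1 and frame 2.
frames = [
    [{"joy": 0.2, "anger": 0.1}, {"joy": 0.4}],
    [{"joy": 0.9}, {"joy": 0.7, "disgust": 0.2}],
]
timeline = aggregate_frames(frames)
```

Graphing each emotion’s averaged score against the frame timestamps yields curves like the ones Bartlett described, with one line per emotion.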

At a recent Golden State Warriors game, it used an $800 1080p camera to record from 300 feet across the stadium; at a set zoom level, the camera could distinguish 70 to 100 faces, while a better 4K camera could have captured closer to 400, Denman said. He noted that signs at the stadium informed attendees they could be video-recorded, and that the company does no facial recognition.

Bartlett said women preferred the Warriors’ Jr. Jam Squad to the cheerleaders, while only men were interested in the testosterone-y “get ready for battle!” sequence that played before the game began. While that may sound obvious in hindsight, the company’s pitch is that advertisers might care more about when their messages are playing if they’re following something that turned off half the audience.

Emotient also hopes to apply its technology to medical uses, added Bartlett, who is also a research professor at UC San Diego. Doctors could more quickly tell if their patients are in pain, or children with autism might play a game that tests their ability to correlate the word “happy” with making a happy face.

“There’s a high incidence for people with pain who don’t know [how] or don’t want to ask for help,” she said. “Kids are scared to speak up to nurses, or adults are unwilling for various social reasons. This provides a more continuous, standardized measure of pain.”

If you found yourself in pain after watching that hamburger ad, though, there’s probably not much a doctor can do for you.

This article originally appeared on Recode.net.
