By Jeffrey Kluger
March 19, 2015

If you’re trying to tell a lie or keep a secret, your face is not your friend. The human face may have been built for certain basic functions (eating, breathing, seeing), but the 43 separate muscles that keep it in motion mean it is constantly communicating too. Every eyebrow lift, forehead furrow and mouth twitch means something. That’s bad news if you’re bluffing, but it’s good news for a growing small-business sector that uses facial analysis to figure out whether an ad campaign or a TV pilot is landing with consumers.

Affectiva, a 30-person operation in Waltham, Mass., is the most visible of these companies. The six-year-old firm has amassed 1,400 clients, including Unilever, Kellogg’s and CBS. In the age of precise online and mobile metrics, most marketing chiefs are tiring of squishy focus-group and consumer-poll results; they want hard data. Rana el Kaliouby, Affectiva’s chief science officer and co-founder, wants to provide it to them.

A decade ago, el Kaliouby, who earned a computer-science Ph.D. at Cambridge University and did postdoctoral work at MIT, began collecting video samples of faces with the goal of helping autistic children. “Autistic kids have a hard time reading faces,” she says, “so the plan was to design a system that tells them that the person they’re talking to is smiling, say, or looks confused, so maybe they want to explain themselves.”

In 2006, a grant from the National Science Foundation brought her to the MIT Media Lab to continue her work. Industry groups regularly visit the lab in the hope of discovering new technology, and el Kaliouby’s research intrigued them. “They asked, ‘Have you thought of applying it to Procter & Gamble or Fox testing a product or TV lineup?'” she recalls. In 2009 she and Rosalind Picard, her MIT professor, spun out Affectiva to do just that.

For a starting fee of $2,500, which climbs depending on whether a 30-second commercial or a one-hour pilot is being tested, Affectiva makes its software available to marketers. Subjects watch a video on a computer screen while the pinhole camera in the computer watches them back. Volunteers always know when they’re being recorded, which doesn’t materially affect the results. Engagement, boredom, amusement, displeasure and more are tracked and analyzed, with changing degrees of each displayed on real-time fever charts. (The venture-backed company is not yet profitable.)

The database Affectiva uses to conduct those analyses is made up of more than 2.5 million facial video samples, each of which runs for 45 seconds at a rate of 14 frames per second. “We have 7 billion emotional data points [to use for comparison],” says el Kaliouby. The software corrects for variables including gender, culture and age, all of which can be important. “Women tend to smile more than men,” el Kaliouby says, “and they smile longer too. Older people tend to be more expressive than younger people.” Europeans and Americans give away more than Asians do, she adds.
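A quick back-of-the-envelope check shows how those figures fit together. This sketch only multiplies out the numbers quoted above; the per-frame figure it derives is an inference, not something Affectiva states.

```python
# Figures quoted in the article.
clips = 2_500_000           # facial video samples ("more than")
seconds_per_clip = 45       # each clip's duration
frames_per_second = 14      # capture rate
data_points = 7_000_000_000 # "7 billion emotional data points"

# Total frames across the database.
total_frames = clips * seconds_per_clip * frames_per_second
print(total_frames)  # 1575000000, i.e. roughly 1.6 billion frames

# Implied density: about 4.4 emotional data points per frame,
# consistent with several expression measures being logged per frame.
print(data_points / total_frames)
```

So the 7 billion data points work out to a handful of measurements per video frame, which is plausible if each frame is scored on several dimensions (engagement, amusement, displeasure and so on) at once.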

This method of data collection has proved popular. Startup nViso, in Switzerland, employs technology similar to Affectiva’s. And Emotient, based in San Diego, collects its data “in the wild,” as CEO Ken Denman puts it, by using software to study groups of people, such as shoppers in malls or crowds in arenas, to see how they’re reacting to what they’re seeing.

Market testing is only the lowest-hanging fruit. El Kaliouby envisions diversifying into political polling and analysis, as well as helping teachers of online courses assess student engagement. Autism and other cognitive and psychological conditions remain on her radar.

There are some potential growth areas that are more controversial: law enforcement, lie detection and airport security, for example. For both Emotient and Affectiva they’re no-go zones. “When we first started,” says el Kaliouby, “we articulated our values for the company and determined that subjects would always have to opt in, so for that reason we don’t want to be in security.” That, of course, leaves that space open to new competitors.

Write to Jeffrey Kluger at jeffrey.kluger@time.com.

This appears in the March 30, 2015 issue of TIME.
