Sure, it’s convenient to have voice-controlled bots that can turn on our music or dictate emails, but the real dream for artificial intelligence has always been to create an emotionally intelligent machine. Only when A.I. tech can accurately read human emotions will we truly see the full potential for this technology to change our lives in fundamental ways.
We may be on the cusp of major A.I. breakthroughs, but culturally we have been fascinated by its possibilities at least since 1872, when Samuel Butler published the satire Erewhon to popular acclaim. “There is no security against the ultimate development of mechanical consciousness,” he wrote. And while A.I. has functioned as a sci-fi trope for good versus evil and human versus machine ever since, it’s only become more relevant as our technology has become more sophisticated. The ethics of A.I. are beginning to pose real and important questions now that our technology is catching up to our imagination.
It’s this convergence of the real and the impossible that makes A.I. so gripping. Take the instant popularity of the new HBO hit Westworld, which poses the creepy questions we’ll soon have to ask ourselves in real life. Questions like, can we empathize with robots? What nuances make a robot seem human? And have we reached the ultimate development of mechanical consciousness that Samuel Butler warned about more than a century ago?
In many ways, A.I. technology is still catching up to the sci-fi realities presented in shows like Westworld. But thanks to the convergence of A.I. and emotion recognition software, we are closer than ever before. These three startups are at the forefront of bringing emotionally intelligent machines to life, stepping irrevocably across a line that has always separated fiction and reality.
Affectiva

By combining emotion research with big data and machine learning, Affectiva emerged from MIT’s Media Lab to lead the field in software that can accurately read a wide range of facial expressions. Affectiva claims the world’s largest emotion database to date, with more than 4.5 million faces analyzed across 75 countries. The company has filed more than 30 patents and secured seven so far.
Think about it: virtually any app on your phone could be enhanced by Affectiva’s analytics, from dating apps that could read your interest not by a swipe but by a smile, to games that self-adjust based on the player’s level of engagement. Thanks to Affectiva, we’ll become even more addicted to our phones. The company is also working on software that can read emotion through voice, which would have a host of practical applications like better customer service interfaces. For now the technology is still making the rounds on the tech conference circuit, but we can expect Affectiva to partner with advertisers, app developers, and more. After all, it’s all about the user experience and creating technology that feels even more personalized and indispensable.
Lightwave

Lightwave is an “applied neuroscience platform” that promises golden data to marketers keen to understand audience engagement during sporting events, movies, and more. By tracking real-time biometric data from wearable wristbands embedded with sensors, Lightwave measures engagement at peak moments. Founded by 29-year-old former DJ Rana June, Lightwave grew out of her frustration at not being able to read her audience’s engagement in real time. Major companies including Pepsi and Unilever have already used Lightwave to measure fan engagement at events like the NCAA basketball championship. 20th Century Fox also used Lightwave to track audience reactions to specific moments throughout The Revenant, giving the movie industry far more precise feedback than the self-reported metrics that were previously the benchmark for audience engagement.
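Lightwave’s actual pipeline isn’t public, but the basic idea of flagging “peak moments” in a biometric stream is easy to picture. The sketch below is purely illustrative: it marks the moments where a reading (say, heart rate from a wristband sensor) spikes well above its recent rolling baseline. The function name, window size, and threshold are all invented for the example.

```python
# Hypothetical sketch only -- not Lightwave's real method.
# Flags "peak moments" where a biometric reading jumps above
# its trailing rolling-average baseline.

def peak_moments(samples, window=5, threshold=1.15):
    """Return indices where a reading exceeds threshold x the
    average of the previous `window` readings."""
    peaks = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if samples[i] > threshold * baseline:
            peaks.append(i)
    return peaks

# One heart-rate sample per second; the jump at index 6
# might correspond to a buzzer-beater or a jump scare.
hr = [72, 71, 73, 72, 74, 73, 95, 96, 80, 75]
print(peak_moments(hr))  # → [6, 7]
```

A real system would have to smooth sensor noise and aggregate across thousands of wearers before calling anything a crowd-wide peak, but the core signal is the same: a sharp deviation from each viewer’s own baseline.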
Emotient

Earlier this year Apple bought the startup Emotient, which uses facial recognition technology to read emotions, in a bid to take the lead in the emotion recognition race. While we’re still waiting to see how Apple will integrate Emotient’s technology, it could be used to help advertisers read consumer reactions, or to help doctors interpret pain levels in patients. Emotient can accurately read the emotional responses of individual faces in a large crowd, which could provide valuable feedback for marketing insights in a number of different fields.
Reading the Future of Feelings
Affectiva, Lightwave, and Emotient are just a few of the startups trying to solve the puzzle of how big data, artificial intelligence, and facial recognition algorithms can accurately interpret human emotion. The winning combination will likely pair facial recognition software with biometric sensors for more nuanced data feedback. It’s not hard to see how this would transform marketing, healthcare, and other industries. Once this technology goes mainstream, we’ll be one step closer to crossing the invisible line that has long separated sci-fi’s version of A.I. from the very real future ahead of us.
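What might that “winning combination” look like in practice? Here is a deliberately simplified sketch, with invented names, weights, and ranges, of fusing a facial-expression score with a biometric arousal reading into a single engagement index. No vendor mentioned above has published a model like this; it only illustrates why two signals beat one.

```python
# Invented illustration, not any company's actual model.
# Blends a facial-expression score with a biometric arousal
# score into one normalized engagement index.

def normalize(value, lo, hi):
    """Map a raw sensor reading into [0, 1], clipped to the range."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def engagement_index(face_score, arousal, w_face=0.6, w_bio=0.4):
    """Weighted blend of two signals already scaled to [0, 1]."""
    return w_face * face_score + w_bio * arousal

# Example: a strong smile (0.9 from a face model) plus a heart
# rate of 88 bpm on an assumed 60-110 bpm range.
arousal = normalize(88, 60, 110)
print(round(engagement_index(0.9, arousal), 2))  # → 0.76
```

The point of the blend is disambiguation: a smile with a flat pulse may be politeness, while a smile with elevated arousal is more likely genuine excitement, which is exactly the nuance self-reported metrics miss.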