Your Phone Might Soon Be Able to Tell You If You're In Love

Your date kisses you goodnight. You turn to walk home. Your date heads in the opposite direction. You pull out your smartphone. You whisper, "I had a great time." Your smartphone runs a calculation that lasts only a millisecond. When it's done, the smartphone says aloud, "You are in love."

Although this scenario reads more like a Black Mirror episode than an honest assessment of future technology, a new report by Gartner Research predicts that we might be only a few years away from similar artificial intelligence (AI)-based assessments. By 2022, your personal device will know more about your emotional state than your own family will, according to the "Predicts 2018: Personal Devices" report.

By taking the facial recognition software that unlocks your smartphone with a glance and combining it with voice analysis, your smartphone will be able to match your frowns, smiles, and tears with associated voice timbre to determine your true emotional state. This Emotion AI is in development by major companies including Amazon, Apple, and Google, as well as smaller vendors including Affectiva, audEERING, and Eyeris.

They are all experimenting with Emotion AI to find ways to let everyday objects "detect, analyze, process, and respond to people's emotional states and moods," according to the report. Affectiva, audEERING, and Eyeris are specifically focused on turning your car into an emotion detector that monitors your behavior behind the wheel in order to offer assistance, encourage safe driving, or enhance the ride experience.

For business-to-business (B2B) tech, the possibilities are endless. Imagine being able to predict a customer's mood before he or she ever speaks to a customer service rep. What if customer relationship management (CRM) software could predict a prospect's likelihood to buy before a sales rep makes a pitch? By pulling emotional information from personal devices and bringing that data to the cloud, we'll be able to achieve what brands have tried but never had the tech to accomplish: 100-percent, unadulterated customer sentiment insight.

Here's How It Works

According to the report, Emotion AI detects 11 main emotions (anger, anxiousness, disgust, fear, happiness, jealousy, love, sadness, shame, surprise, and a neutral state) via facial expressions, intonation, and voice. The first wave of this tech will be driven by virtual personal assistants (VPAs) (think Alexa and Cortana). Soon, these systems will be able to add emotional intelligence for better context and an enhanced service experience, the report states. As of today, Google, IBM, and Microsoft are the primary tech behemoths investing in this area.
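The report doesn't describe any vendor's internals, but the multimodal approach it outlines (facial expression plus vocal intonation) is often implemented as "late fusion": each modality produces confidence scores over the same emotion labels, and the scores are blended before picking a winner. The sketch below is purely illustrative; the emotion list comes from the report, while the weighting scheme and function names are assumptions.

```python
# Illustrative sketch of late-fusion emotion detection; the weighting
# and score format are hypothetical, not any vendor's actual method.
EMOTIONS = ["anger", "anxiousness", "disgust", "fear", "happiness",
            "jealousy", "love", "sadness", "shame", "surprise", "neutral"]

def fuse_emotions(face_scores, voice_scores, face_weight=0.6):
    """Blend per-modality confidence scores and return the top emotion."""
    fused = {}
    for emotion in EMOTIONS:
        f = face_scores.get(emotion, 0.0)   # from facial recognition
        v = voice_scores.get(emotion, 0.0)  # from voice/intonation analysis
        fused[emotion] = face_weight * f + (1 - face_weight) * v
    return max(fused, key=fused.get)

# A smile on camera plus a warm vocal timbre resolves to happiness.
face = {"happiness": 0.7, "neutral": 0.3}
voice = {"happiness": 0.5, "surprise": 0.4}
print(fuse_emotions(face, voice))  # happiness
```

Weighting the camera more heavily than the microphone is just one design choice; a real system would learn the fusion from data rather than hard-code it.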

"None of the VPAs have this capability yet," said Annette Zimmermann, Vice President at Gartner Research. "But knowing Google and their capabilities, they are not far from it yet. Google and Amazon are probably the first to implement this. Microsoft has done some work on this so they have the capabilities as well. Apple with Siri is a bit lagging, hence I think they will be later with this."

The second stage of Emotion AI will allow products such as educational software, video games, diagnostic software, athletic and health-performance tools, and autonomous cars to adjust experiences based on a user's emotional state. For example, the company Affectiva "has already been developing educational software that monitors a child's emotion during solving a problem," said Zimmermann. "Depending on if the child gets frustrated because the task is too difficult, the software could adjust the level of difficulty." Other possibilities include an autonomous car sensing your fear and reducing its speed, or video games altering their difficulty level based on whether you're looking for a challenge or mindless, casual gameplay.
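The adaptive-difficulty loop Zimmermann describes can be sketched as a simple control rule: back off when frustration spikes, ramp up when the user is engaged but unchallenged. The thresholds and signal names below are hypothetical illustrations, not anything from Affectiva's actual software.

```python
# Hypothetical sketch of emotion-driven difficulty adjustment.
# Inputs are assumed to be 0.0-1.0 scores from an emotion detector.
def adjust_difficulty(level, frustration, engagement):
    """Lower difficulty when frustration runs high; raise it when the
    user is engaged but unchallenged. Thresholds are illustrative."""
    if frustration > 0.7:
        return max(1, level - 1)   # task is too hard; ease off
    if engagement > 0.6 and frustration < 0.2:
        return level + 1           # user is cruising; add challenge
    return level                   # otherwise hold steady

print(adjust_difficulty(5, frustration=0.8, engagement=0.4))  # 4
print(adjust_difficulty(5, frustration=0.1, engagement=0.9))  # 6
```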

What's Next?

Today's brands spend billions of dollars monitoring social media websites such as Facebook and Twitter to determine what the general public feels about their products. Unfortunately, some of those tools are designed only to distinguish between "positive" and "negative" sentiments, while others are more advanced with "attributing to more nuanced emotional states yet still with an aggregate view," the report states.
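The gap the report describes is concrete: a polarity tool collapses everything to positive/negative, while a nuanced tool still only reports an aggregate over richer labels. A minimal sketch of both approaches, with an illustrative toy word lexicon of my own invention:

```python
# Toy contrast between coarse polarity scoring and an aggregate
# view over richer emotion labels. The lexicon is illustrative only.
from collections import Counter

POSITIVE = {"love", "great", "happy", "excellent"}
NEGATIVE = {"hate", "awful", "sad", "broken"}

def polarity(text):
    """Crude positive/negative/neutral call from word overlap."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def aggregate_emotions(labels):
    """Roll per-post emotion labels up into a brand-level aggregate."""
    return Counter(labels).most_common()

print(polarity("I love this phone"))  # positive
print(aggregate_emotions(["happiness", "anger", "happiness"]))
```

Neither view tells a brand how an individual customer feels in the moment, which is the per-person insight Emotion AI is pitched to add.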

Emotion AI will be better able to measure direct and indirect feedback while providing deeper insight into attitudes toward brands and products. As companies consult tech vendors, certain issues will need to be worked out. For example, different cultures respond in different ways to different stimuli, so one person's anger could be read as excitement in one culture while being read as fear in another.

"There are differences based on cultures, but the companies who train these AI systems, like audEERING and Affectiva, can accommodate for those differences. So, in the end, the system can recognize the correct emotions independent of age, gender, and culture," said Zimmermann.
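One way to picture the cultural calibration Zimmermann mentions is a baseline cue-to-emotion mapping with per-culture overrides learned from training data. Everything below is a hypothetical illustration; the cue names, culture keys, and mappings are invented for the example.

```python
# Hypothetical sketch: a baseline interpretation of a raw cue,
# overridden per culture. All mappings here are invented examples.
BASELINE = {"raised_voice": "anger"}
CULTURE_OVERRIDES = {
    "culture_a": {"raised_voice": "excitement"},
    "culture_b": {"raised_voice": "fear"},
}

def interpret(cue, culture=None):
    """Map a detected cue to an emotion, applying cultural overrides."""
    mapping = dict(BASELINE)
    mapping.update(CULTURE_OVERRIDES.get(culture, {}))
    return mapping.get(cue, "neutral")

print(interpret("raised_voice"))               # anger
print(interpret("raised_voice", "culture_a"))  # excitement
```

A production system would learn these adjustments statistically rather than from a lookup table, but the principle is the same: the same raw signal can legitimately mean different things in different populations.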

Security will also need to be monitored closely. Imagine being able to hack a system to determine what gives a person deep shame or humiliation. You'd then be able to use that information to blackmail or embarrass someone. "You want the data that is used to train these systems to be completely safe and anonymized," said Zimmermann. "The companies I talked to adhere to these security measures. We need to trust these companies that they safeguard the data in the right way."

Moving forward, Gartner suggests tech providers such as Affectiva and audEERING "take a consultative approach in thinking about foundational concepts of which emotional models to use and work with their customers—most of which will be new to even the concept of Emotion AI," according to the report.

For brands, Gartner suggests companies "add Emotion AI to [their] conversational system via computer vision or audio technology by using ready APIs [application programming interfaces] available from emotional AI vendors, such as audEERING, to enhance the user experience," and "help organizations to implement effective [voice of the customer] programs with consultative efforts and not just the technology that enables emotion analysis."

This article originally appeared on PCMag.com.