
Emotion recognition technology shows promise for telemedicine

Emotion recognition technology can help in telemedicine and in day-to-day life, allowing people to understand what others are feeling even when they aren't saying it.

Almost every smartphone and tablet offers face recognition through its camera, so what could possibly be better than that? Try emotion recognition. New technology can now read your emotions from facial cues.

In the world of telemedicine, where patients are evaluated over mobile platforms, a medical professional's ability to discern what a patient is feeling can be useful in the healing process.

Telemedicine vendors with an interest in moving into specialized areas like telepsychiatry, such as Doctor on Demand, could especially benefit from emotion recognition technology: it can help a psychiatrist or psychologist understand what a patient is feeling even when the patient isn't physically present or isn't articulating those emotions.

Affectiva's emotion recognition technology uses a laptop, tablet or smartphone camera to capture the user's facial cues. Its algorithm reads those cues to identify, in real time, the emotion the user is experiencing.

Companies with a vested interest in understanding their target customers could also benefit from emotion recognition, gaining the ability to gauge customer and audience reactions to specific content. That capability can help answer a range of questions, from whether a specific type of therapy is working to whether customers' response to a campaign matches the anticipated reaction.

In a TED Talk, Affectiva CSO and co-founder Rana el Kaliouby said, “Our emotions influence every aspect of our lives, from our health and how we learn to how we do business and make decisions…Today’s technology has lots of cognitive intelligence but no emotional intelligence.”

Affectiva's algorithm reads a user's Action Units (for example, a lip curling up into a smile or a furrowed eyebrow) and infers an emotion, drawing on 12 billion emotion data points from 2.9 million face videos collected in 75 countries around the world.
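To make the idea concrete, here is a minimal sketch of how detected Action Units might be mapped to emotion labels. It is purely illustrative and is not Affectiva's actual algorithm or SDK: the Action Unit codes follow the standard Facial Action Coding System (FACS), but the rule table and the classify_emotion helper are hypothetical.

```python
# Illustrative sketch only -- not Affectiva's SDK. Assumes Action Units
# have already been detected in a video frame by an upstream face tracker.
# AU codes follow the Facial Action Coding System (FACS): e.g. AU12 is the
# "lip corner puller" (a smile) and AU4 is the "brow lowerer" (a furrow).
# The rule table below is hypothetical.

EMOTION_RULES = {
    frozenset({"AU6", "AU12"}): "joy",             # cheek raiser + lip corner puller
    frozenset({"AU4"}): "concentration or anger",  # brow lowerer (furrow)
    frozenset({"AU1", "AU4", "AU15"}): "sadness",  # inner brow raise + furrow + lip corner depressor
}

def classify_emotion(active_units: set) -> str:
    """Return the emotion whose rule matches the most detected Action Units."""
    best_label, best_overlap = "neutral", 0
    for rule_units, label in EMOTION_RULES.items():
        # A rule fires only when every Action Unit it requires is present.
        if rule_units <= active_units and len(rule_units) > best_overlap:
            best_label, best_overlap = label, len(rule_units)
    return best_label

# Example: a frame in which the detector found raised cheeks and a smile.
print(classify_emotion({"AU6", "AU12"}))  # -> joy
print(classify_emotion({"AU4"}))          # -> concentration or anger
```

In a production system like Affectiva's, the mapping is learned from millions of labeled face videos rather than hand-written rules, but the inputs and outputs are conceptually the same: detected Action Units in, an emotion label out.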

As for emotion-enabled technology in the healthcare world, el Kaliouby explained in the TED Talk: "Emotion-enabled wearable glasses can help those who are visually impaired read the faces of others and can help individuals on the autism spectrum interpret emotion."

Some health tech companies, such as Cogito, have developed ways of assessing a person's emotional state through voice patterns. Sharecare acquired Feingold earlier this year to use artificial intelligence to assess emotion. Ginger.io has used the ubiquity of mobile phones to track users' activity, such as how many text messages they send, to determine whether or not they are depressed.

It's interesting to consider the impact this technology could have if it were ever to become mainstream. Individuals could use it to better understand the people they come into contact with in their day-to-day lives.

Featured Photo: Flickr user GollyGforce
