7 Startups


There has been a lot of talk lately about how machines just won’t be able to capture that “human element” of emotions, or “emotional intelligence” as it is often called. Building an emotional quotient, or EQ, as a layer on top of AI is referred to as affective computing, a topic we have covered before. The first step towards AI being able to demonstrate emotional intelligence is that it needs to see emotions in our behaviour, hear them in our voices, and sense our anxieties. To do this, AI must be able to extract emotional cues or data from us through means like eye tracking, galvanic skin response, voice and written word analysis, brain activity via EEG, facial mapping, and even gait analysis. Here are 7 startups attempting to seamlessly integrate AI, machine learning, and emotional intelligence.
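To make the idea of combining such cues a bit more concrete, here is a minimal Python sketch of one plausible fusion step, blending per-modality scores into a single arousal estimate. The modality names, weights, and score ranges are purely illustrative assumptions, not any particular startup's method.

```python
# Minimal illustrative sketch: fusing emotion scores from several modalities.
# All modality names, weights, and scores are hypothetical assumptions,
# not any particular startup's method.

MODALITY_WEIGHTS = {
    "voice": 0.30,   # vocal intonation analysis
    "text": 0.20,    # written word analysis
    "face": 0.30,    # facial mapping
    "skin": 0.10,    # galvanic skin response
    "eeg": 0.10,     # brain activity
}

def fuse_arousal(scores: dict) -> float:
    """Combine per-modality arousal scores (each in [0, 1]) into one estimate.

    Missing modalities are ignored and the remaining weights are renormalized.
    """
    available = {m: s for m, s in scores.items() if m in MODALITY_WEIGHTS}
    if not available:
        raise ValueError("no supported modality scores provided")
    total_weight = sum(MODALITY_WEIGHTS[m] for m in available)
    return sum(MODALITY_WEIGHTS[m] * s for m, s in available.items()) / total_weight

if __name__ == "__main__":
    # Example reading: fairly agitated voice and face, calm text.
    print(fuse_arousal({"voice": 0.8, "face": 0.7, "text": 0.2}))
```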

Beyond Verbal

The Israeli startup uses a simple microphone and a set of questions to elicit responses from which it identifies 11 emotions that describe a person’s character traits. This form of emotional analytics “listens” to vocal intonations as the words are spoken. The technology is aimed at marketing professionals running customer service experience and communication campaigns.
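Beyond Verbal hasn't published how its models work, but the general idea of analyzing vocal intonation can be sketched with open-source tools: pull pitch and loudness contours out of a recording and map them to a rough arousal estimate. The librosa calls below are real; the feature choices, thresholds, and the leap from prosody to "emotions" are illustrative assumptions only.

```python
import numpy as np
import librosa

def intonation_features(path: str) -> dict:
    """Extract simple pitch/energy statistics from a speech recording.

    These are generic prosodic features, not Beyond Verbal's actual model.
    """
    y, sr = librosa.load(path, sr=16000)
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    rms = librosa.feature.rms(y=y)[0]
    return {
        "pitch_mean_hz": float(np.nanmean(f0)),
        "pitch_var": float(np.nanvar(f0)),
        "energy_mean": float(rms.mean()),
    }

def rough_arousal(feats: dict) -> str:
    # Hypothetical threshold for illustration only: lively pitch movement and
    # louder speech are loosely associated with higher arousal.
    score = feats["pitch_var"] * feats["energy_mean"]
    return "high arousal" if score > 1.0 else "low arousal"
```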

nViso SA

nViso is a privately-held Swiss startup that originally set out to develop an automatic prediction process for accurately categorizing patients needing tracheal intubation (a plastic tube down the throat) for surgeries involving general anesthesia, a very costly procedure.

Receptiviti

Toronto-based startup Receptiviti is developing a proprietary technology it calls Linguistic Inquiry and Word Count, or LIWC2015, which looks at what you write and then uses it to gather insights about your character, emotions, level of deception, and even how you make decisions.
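The real LIWC2015 dictionaries are proprietary and far larger, but the core mechanic is simple to sketch: count how many words in a text fall into predefined psychological categories and normalize by text length. The tiny category lists below are placeholders, not Receptiviti's actual dictionary.

```python
import re

# Placeholder category word lists; the real LIWC2015 dictionaries are
# proprietary and far larger.
CATEGORIES = {
    "positive_emotion": {"happy", "love", "great", "good", "excited"},
    "negative_emotion": {"sad", "angry", "hate", "worried", "bad"},
    "certainty": {"always", "never", "definitely", "absolutely"},
}

def liwc_style_scores(text: str) -> dict:
    """Return, per category, the percentage of words that match the category."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return {name: 0.0 for name in CATEGORIES}
    return {
        name: 100.0 * sum(w in vocab for w in words) / len(words)
        for name, vocab in CATEGORIES.items()
    }

print(liwc_style_scores("I always said I was happy, but I was worried."))
```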

BRAIQ

The New York startup BRAIQ develops a technology that detects how you feel about that autonomous car that’s carting you around. While most self-driving vehicle design philosophies center on how the vehicle interacts with its environment, BRAIQ is looking into how autonomous vehicles should interact with their passengers.
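BRAIQ hasn't disclosed how it senses passenger comfort, so the sketch below is only one hypothetical way to frame the problem: reduce whatever passenger signals are available to a discomfort score and soften the driving style when it rises. The signal names, weights, and driving parameters are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class PassengerSignals:
    # Hypothetical sensor readings; BRAIQ's actual inputs are not public.
    heart_rate_bpm: float
    grip_force_norm: float      # 0 (relaxed) to 1 (white-knuckle)
    seat_pressure_shift: float  # 0 (sitting still) to 1 (squirming)

def discomfort_score(s: PassengerSignals, resting_hr: float = 70.0) -> float:
    """Collapse the signals into a 0..1 discomfort estimate (illustrative weights)."""
    hr_term = max(0.0, min(1.0, (s.heart_rate_bpm - resting_hr) / 50.0))
    return 0.4 * hr_term + 0.4 * s.grip_force_norm + 0.2 * s.seat_pressure_shift

def adjust_driving(discomfort: float, base_follow_gap_s: float = 1.5) -> dict:
    """A nervous passenger gets a longer following gap and gentler braking."""
    return {
        "follow_gap_s": base_follow_gap_s * (1.0 + discomfort),
        "max_decel_g": 0.30 * (1.0 - 0.5 * discomfort),
    }

signals = PassengerSignals(heart_rate_bpm=95, grip_force_norm=0.8, seat_pressure_shift=0.3)
print(adjust_driving(discomfort_score(signals)))
```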

NuraLogix Corporation

The Toronto startup NuraLogix is developing a technology that can read human emotions and can even tell if you’re lying. Its technique for “reading” human emotional state, called Transdermal Optical Imaging™ (TOI™), uses a conventional video camera to extract information from the blood flow beneath the skin of the face.
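Transdermal Optical Imaging itself is proprietary, but the broader family of camera-based techniques it resembles, remote photoplethysmography, can be sketched: average the green channel over a detected face region across video frames, then band-pass filter the resulting trace around plausible heart rates. The OpenCV and SciPy calls are real; treating this as NuraLogix's method or as an emotion readout would be a stretch, it only shows the general idea.

```python
import cv2
import numpy as np
from scipy.signal import butter, filtfilt

def green_channel_trace(video_path: str, max_frames: int = 300):
    """Average the green channel over a detected face region, frame by frame."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    trace = []
    while len(trace) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = faces[0]
        roi = frame[y:y + h, x:x + w]
        trace.append(roi[:, :, 1].mean())  # green channel (index 1 in BGR)
    cap.release()
    return np.array(trace), fps

def pulse_band_signal(trace: np.ndarray, fps: float) -> np.ndarray:
    """Band-pass 0.7-4 Hz (roughly 42-240 bpm) to isolate pulse-related variation."""
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    return filtfilt(b, a, trace - trace.mean())
```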

Emoshape

Founded in New York, the startup Emoshape is developing a microchip it calls the Emotion Processing Unit (EPU). The chip aims to give intelligent objects like robots, self-driving cars, affective toys, and even ordinary consumer electronics the ability to interact emotionally with humans.
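Emoshape hasn't published the EPU's interface, so the sketch below only illustrates the concept in software: a device keeps a small emotion-state vector that gets nudged by appraised stimuli and decays back toward neutral over time. The emotion list, decay rate, and appraisal values are invented for illustration.

```python
import time

# Hypothetical emotion dimensions; the real EPU's state model is not public.
EMOTIONS = ("joy", "fear", "anger", "surprise")

class EmotionState:
    """Toy appraisal loop: stimuli nudge the state, which decays toward neutral."""

    def __init__(self, decay_per_second: float = 0.2):
        self.levels = {e: 0.0 for e in EMOTIONS}
        self.decay = decay_per_second
        self._last = time.monotonic()

    def _apply_decay(self) -> None:
        now = time.monotonic()
        factor = max(0.0, 1.0 - self.decay * (now - self._last))
        self.levels = {e: v * factor for e, v in self.levels.items()}
        self._last = now

    def appraise(self, stimulus: dict) -> None:
        """stimulus maps emotion names to deltas in [-1, 1] (illustrative)."""
        self._apply_decay()
        for emotion, delta in stimulus.items():
            if emotion in self.levels:
                self.levels[emotion] = min(1.0, max(0.0, self.levels[emotion] + delta))

    def dominant(self) -> str:
        self._apply_decay()
        return max(self.levels, key=self.levels.get)

state = EmotionState()
state.appraise({"joy": 0.6, "surprise": 0.3})  # e.g. the toy recognizes its owner
print(state.dominant())
```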


CrowdEmotion

CrowdEmotion is a privately-held London-based startup with a technology that gauges human emotions visually by mapping the movements of 43 muscles on the human face. CrowdEmotion currently offers the technology over the cloud: you can upload a video to their website and it will give you a running commentary on the sort of emotions you’re exhibiting […]
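CrowdEmotion's 43-muscle facial model runs in their cloud, so there is nothing to reimplement here, but the "running commentary" step, turning a stream of per-frame emotion labels into a readable timeline, is easy to sketch. The labels, frame rate, and window length below are assumptions standing in for whatever a facial-expression classifier would return.

```python
from collections import Counter

def running_commentary(frame_labels: list, fps: float = 25.0,
                       window_seconds: float = 2.0) -> list:
    """Summarize per-frame emotion labels into the dominant emotion per time window.

    frame_labels would come from a facial-expression classifier (e.g. a cloud API);
    this function only does the aggregation into a timeline.
    """
    window = max(1, int(fps * window_seconds))
    commentary = []
    for start in range(0, len(frame_labels), window):
        chunk = frame_labels[start:start + window]
        dominant, _ = Counter(chunk).most_common(1)[0]
        commentary.append(f"{start / fps:.1f}s: mostly {dominant}")
    return commentary

# Example with made-up labels standing in for classifier output.
labels = ["neutral"] * 40 + ["happy"] * 60 + ["surprised"] * 25
print("\n".join(running_commentary(labels)))
```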
