Next Generation AI: Emotional Recognition



The 4th annual World Internet Conference was held this past winter in Wuzhen, China. It featured business leaders from around the world, including Apple’s Tim Cook, Google’s Sundar Pichai and Alibaba’s Jack Ma, who met to discuss the latest internet developments. Tech companies also showcased their most recent products and advancements.

Alibaba demonstrated perhaps the most interesting development at the conference: a prototype supermarket called Tmall. The store was completely unmanned. Customers could pay for purchases automatically, with no need for cash or a credit card. They simply chose items and left the store. Payment was made electronically through sensors and cell-phone technology, so the store had no check-out lines.

While the new payment technology was impressive, Tmall also featured cutting-edge “emotional recognition” and eye-tracking technology. Alibaba mounted video cameras throughout the store. Those cameras used complex algorithms to analyze customers’ emotions while they shopped, and the customers’ emotional responses to specific items were then used for pricing. Tmall charged lower prices on items that made customers happy. For instance, a customer who smiled while looking at a handbag got a discount on it.
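The pricing logic described above can be sketched in a few lines. This is a purely illustrative reconstruction, not Alibaba's actual system; the function names, discount rate, and confidence threshold are all assumptions.

```python
# Hypothetical sketch of emotion-linked pricing: lower the price when a
# camera's emotion reading is "joy" with high confidence. All names and
# numbers here are illustrative, not from Alibaba's system.

def discounted_price(base_price: float, emotion: str, confidence: float,
                     discount_rate: float = 0.10, threshold: float = 0.8) -> float:
    """Apply a discount when a 'joy' reading exceeds the confidence threshold."""
    if emotion == "joy" and confidence >= threshold:
        return round(base_price * (1 - discount_rate), 2)
    return base_price

print(discounted_price(50.0, "joy", 0.92))      # smiling shopper: 45.0
print(discounted_price(50.0, "neutral", 0.95))  # no smile: 50.0
```

In practice the confidence threshold matters: a low-confidence smile reading should not trigger a discount, which is why the sketch gates on both the label and its confidence.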

Emotional recognition (also called facial coding) is an emerging branch of artificial intelligence, and AI researchers around the world have been perfecting the technology. Massachusetts-based Affectiva is a leader in the field. Affectiva provides information on emotional responses to digital sources (e.g. websites, apps, etc.). The company’s software development kit (“SDK”) allows website and app operators to obtain visitors’ reactions in real time. Alternatively, visitors’ faces can be recorded as they navigate a site or app, and the recordings are then sent to Affectiva for expert analysis.

Affectiva claims that its products can “[a]dd emotion awareness to apps, games and other products. Allow devices to respond to users’ emotions in real-time.” Users only need an optical sensor or a standard webcam. Facial landmarks (including the position of an individual’s eyebrows, nose and mouth) are first analyzed. Affectiva engineers, as well as behavioral scientists, know that facial landmarks change depending on emotion. Once analyzed, faces are compared to a database of 5+ million other faces from 75 nations. Algorithms analyze faces for emotions, expressions, age, gender and ethnicity. Seven basic emotions can be detected: anger, contempt, disgust, fear, joy, sadness and surprise. Affectiva claims a success rate “in the high 90th percentile.” Both individuals and groups of individuals can be analyzed.
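The landmark-then-compare pipeline above can be illustrated with a toy classifier. Real systems such as Affectiva's use millions of reference faces and learned models; here the two features (an eyebrow-raise score and a mouth-curve score), the reference values, and every name are invented for illustration only.

```python
# Toy sketch of the described pipeline: reduce facial landmarks to a few
# features, then match them against a labeled reference set to pick one of
# the seven basic emotions. Feature values below are made up.
import math

# Illustrative reference "database": (eyebrow_raise, mouth_curve) per emotion.
REFERENCE = {
    "anger":    (-0.6, -0.4),
    "contempt": (-0.1, -0.2),
    "disgust":  (-0.4, -0.5),
    "fear":     ( 0.7, -0.3),
    "joy":      ( 0.2,  0.8),
    "sadness":  (-0.3, -0.7),
    "surprise": ( 0.9,  0.1),
}

def classify(eyebrow_raise: float, mouth_curve: float) -> str:
    """Nearest-neighbor match of landmark features to a reference emotion."""
    return min(REFERENCE,
               key=lambda e: math.dist(REFERENCE[e], (eyebrow_raise, mouth_curve)))

print(classify(0.25, 0.75))   # raised mouth corners match "joy"
print(classify(-0.35, -0.65)) # drooping features match "sadness"
```

A production system would extract dozens of landmarks per frame and feed them to a trained model rather than a two-feature nearest-neighbor lookup, but the shape of the computation (features in, one of seven labels out) is the same.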

Emotional recognition lets developers customize websites, apps and games depending on users’ emotional responses. Developers can effectively “see into the mind” of users and determine their likes and dislikes in real time.

Programs detecting “speech emotion” are also under development. Speech capabilities include detecting changes in tone, loudness, tempo and voice quality. Once again, Affectiva is a leader in “speech emotion” technology, and it hopes to soon offer products combining both emotional recognition and speech emotion.
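Two of the signal features mentioned above, loudness and tempo, can be estimated with very little code. This is a minimal sketch of the kind of inputs a speech-emotion model consumes; the function names and example values are illustrative, not from any vendor's API.

```python
# Minimal sketch of two speech-emotion input features: loudness estimated
# from raw audio samples, and tempo estimated from word timestamps.
# All names and numbers are illustrative.
import math

def loudness_rms(samples: list[float]) -> float:
    """Root-mean-square amplitude, a standard proxy for loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def speaking_rate(word_timestamps: list[float]) -> float:
    """Words per second, a simple proxy for speech tempo."""
    duration = word_timestamps[-1] - word_timestamps[0]
    return (len(word_timestamps) - 1) / duration if duration > 0 else 0.0

quiet = [0.01, -0.02, 0.015, -0.01]
loud = [0.5, -0.6, 0.55, -0.45]
print(loudness_rms(loud) > loudness_rms(quiet))  # True: raised voice
print(speaking_rate([0.0, 0.4, 0.9, 1.5, 2.0])) # 2.0 words per second
```

A real system would track how these features change over the course of an utterance, since a sudden rise in loudness or tempo is often more informative than their absolute values.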

*** Please upvote this post if you found it informative. We'd love to continue offering interesting and useful posts. ***