Affective Computing

One of the more interesting technologies now developing is called affective computing. It's about analyzing observations of human faces, voices, eye movements and the like to understand human emotions: what pleases or displeases people, or merely catches their attention. It combines deep learning, analytics, sensors and artificial intelligence.
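To make the pipeline concrete, here is a toy sketch of the idea: sensor observations of a face are reduced to a few features, which are then mapped to a coarse emotional label. The feature names, thresholds, and labels below are illustrative assumptions of mine, not any vendor's actual API; real systems use deep learning on raw camera frames rather than hand-written rules.

```python
from dataclasses import dataclass

@dataclass
class FaceObservation:
    """Hypothetical features a camera-based system might extract."""
    smile_ratio: float    # 0.0 (neutral mouth) .. 1.0 (broad smile)
    brow_raise: float     # 0.0 (relaxed) .. 1.0 (fully raised)
    gaze_on_screen: bool  # whether eye tracking says the viewer is looking

def classify_emotion(obs: FaceObservation) -> str:
    """Toy rule-based mapping from facial features to a coarse state."""
    if not obs.gaze_on_screen:
        return "disengaged"        # not even watching
    if obs.smile_ratio > 0.6:
        return "pleased"           # strong smile
    if obs.brow_raise > 0.6:
        return "surprised"         # raised brows suggest attention
    return "neutral"

# Example: a viewer smiling while looking at the screen
print(classify_emotion(FaceObservation(0.8, 0.2, True)))  # pleased
```

The point of the sketch is only the shape of the problem: turning physical observations into labels a marketer or a robot could act on.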

While interest in affective computing hasn't been widespread, it may be nearing its moment in the limelight. One indication: a couple of days ago, the front page of the New York Times featured a story about its use for television and advertising, titled "For Marketers, TV Sets Are an Invaluable Pair of Eyes."

But the companies featured in the Times article are neither the only ones nor the first to develop and apply affective computing. IBM published a booklet on the subject in 2001. Before that, in 1995, Professor Rosalind Picard of MIT coined the term "affective computing"; she also created the affective computing group in the MIT Media Lab.

In a video, "The Future of StoryTelling," she describes what is essentially the back story to the New York Times article. In no particular order, companies working with this technology today include Affectiva, Realeyes, Emotient, Beyond Verbal, Sension, tACC, nViso, CrowdEmotion, PointGrab, Eyeris, gestigon, Intel RealSense, SoftKinetic, Elliptic Labs, Microsoft's VIBE Lab and Kairos.

Affectiva, which Professor Picard co-founded, offers an SDK that reads the emotions of people at home or in the office using only web cams. Here's a video that shows their commercially available product at work: https://www.youtube.com/watch?v=mFrSFMnskI4

Similarly, Realeyes offers a commercial product that analyzes people's reactions to what they see on their screens. Here's their video about real-time facial coding: https://www.youtube.com/watch?v=3WF4eG1s44U&list=PL1F3V-C5KJZAxl8OGF0NjbG8WHTTE6_hX

The two previous products have obvious applications to web marketing and content. So much so that some predict a future in which affective technology creates an "emotion economy."

But affective computing has longer-term applications, especially in robotics. As human-like robots begin to be sold as personal assistants and companions, particularly for an aging population in Asia, they will need the kind of emotional intelligence about humans that other human beings mostly already have. That's likely to be where we will see some of the most impactful uses of affective computing.

Over the last couple of years, Japan's SoftBank has developed Pepper, which it describes as a "social robot" because it aims to recognize human emotions and display its own. Here's a video from the French software company behind Pepper: https://www.youtube.com/watch?v=nQFgGS8AAN0

There are others doing the same thing. At Nanyang Technological University in Singapore, another social robot, called Nadine, is being developed. See https://www.youtube.com/watch?v=pXg33S3U_Oc

Both these social robots and affective computing overall still need much development, but you can already sense the importance of this technology.

© 2017 Norman Jacknis, All Rights Reserved

[http://njacknis.tumblr.com/post/157863647250/affective-computing]
