The rise of artificial intelligence and cognitive computing is creating a new workforce of machines that simulate human thought and are transforming industries. At the leading edge of this emerging technology is a field of sensory computing known as emotive computing, also called affective computing.
This is not about teaching robots to have emotions. Rather, it is about teaching them to recognize human emotions from observable signals, such as facial expressions and tone of voice, and then to react appropriately based on an evaluation of how the person is feeling. Robots may actually be more effective than humans in this role: their judgment is not clouded by emotion, and they can use intelligent technology to detect responses a human observer might miss.
In the last three years, new businesses have emerged pioneering facial recognition technology in the classroom. Companies like SensorStar Labs use cameras to capture student responses, which feed into algorithms that identify whether a student's attention is wandering. The system, called EngageSense, measures smiles, frowns and audio to classify student engagement.
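To make the idea concrete, the kind of pipeline described above, per-student signals combined into an engagement label, can be sketched in a few lines. This is a toy illustration only: the signal names, weights and thresholds are assumptions for the sake of example, not SensorStar Labs' actual algorithm.

```python
def engagement_score(smile: float, frown: float, audio_activity: float) -> float:
    """Combine per-student signals (each normalized to 0..1) into one score.

    The weights below are illustrative assumptions, not EngageSense's.
    """
    # Smiles and on-task audio raise the score; frowns lower it.
    score = 0.5 + 0.4 * smile - 0.3 * frown + 0.2 * audio_activity
    return min(1.0, max(0.0, score))


def classify(score: float) -> str:
    """Bucket a raw score into a coarse engagement label."""
    if score >= 0.7:
        return "engaged"
    if score >= 0.4:
        return "neutral"
    return "disengaged"
```

For example, a student with a strong smile signal and active on-task audio (`classify(engagement_score(0.8, 0.0, 0.6))`) would be labelled "engaged" under these assumed weights.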
Psychologist Paul Ekman has taken this to a whole new level, cataloging more than 5,000 facial movements to help identify human emotions. His research is powering new companies like Emotient Inc, Affectiva Inc and Eyeris, each using a combination of psychology and data mining to detect micro-expressions and classify human reactions.
So far the technology has focused on aiding federal law enforcement and market research, but San Diego researchers are also trialling it in healthcare, to measure children's pain levels after surgery.
Applying this in the classroom means teachers can gather more in-depth data to measure understanding. It can be used at a one-to-one level, but also to assess class engagement as a whole in response to varying teaching methods, showing teachers where additional support may be required.