The device “quietly observes your behaviour” while you use the computer or phone as usual.
It does not require users to explicitly state what they are feeling, enter any extra information or wear any special gear.
For example, the team was able to measure a user’s heart rate simply by monitoring subtle changes in the colour of the user’s forehead. “The system does not grab other data that might be available through the phone – such as the user’s location,” said Jiebo Luo, professor of computer science.
The researchers were able to analyse the video data to extract a number of “clues”, such as heart rate, blinking rate, pupil radius and head movement rate.
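The article does not say how the forehead-colour signal is turned into a heart rate, but a common approach in remote photoplethysmography is to average one colour channel over the forehead region in each frame and take the dominant frequency of the resulting signal. A minimal sketch, assuming the per-frame green-channel means have already been extracted from the video:

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (beats per minute) from the per-frame mean
    green-channel intensity of a forehead region of interest.

    Illustrative only, not the researchers' method: finds the dominant
    frequency of the detrended signal within a plausible heart-rate
    band (0.75-4 Hz, i.e. 45-240 bpm)."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()            # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    band = (freqs >= 0.75) & (freqs <= 4.0)    # plausible pulse range
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0                  # Hz -> bpm
```

In practice the forehead region would come from a face tracker, and the raw signal would need filtering against motion and lighting changes; this sketch shows only the frequency-estimation step.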
At the same time, the program also analysed what the users posted on Twitter, what they read, how fast they scrolled, their keystroke rate and their mouse-click rate.
“Not every input is treated equally though: what a user tweets, for example, is given more weight than what the user reads because it is a direct expression of what that user is thinking and feeling,” Luo noted.
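The weighting Luo describes can be sketched as a weighted average of per-signal sentiment scores. The specific weights and signal names below are hypothetical; the article says only that a user’s own tweets count for more than what the user reads:

```python
def fuse_sentiment(scores, weights):
    """Combine per-signal sentiment scores (each in [-1, 1]) into one
    overall score by weighted average over the signals present.
    Weights are illustrative assumptions, not the paper's values."""
    total = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total

# Hypothetical weighting in which authored tweets dominate.
WEIGHTS = {"tweeted": 3.0, "read": 1.0, "scroll_rate": 1.0,
           "keystroke_rate": 1.0, "mouse_clicks": 1.0}
```

With these weights, a strongly positive tweet outvotes negative reading material: `fuse_sentiment({"tweeted": 1.0, "read": -1.0}, WEIGHTS)` gives 0.5, on the positive side.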
For the study, they enrolled 27 participants and “sent them messages, real tweets, with sentiment to induce their emotion.”
This allowed them to gauge how subjects reacted after seeing or reading material considered to be positive or negative. The program currently classifies emotions only as positive, neutral or negative.
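Reducing a continuous sentiment score to those three classes can be as simple as thresholding; the margin below is an illustrative choice, not a value reported by the researchers:

```python
def label_emotion(score, margin=0.2):
    """Map a sentiment score in [-1, 1] to the study's three classes.
    The 0.2 margin is an assumed, illustrative threshold."""
    if score > margin:
        return "positive"
    if score < -margin:
        return "negative"
    return "neutral"
```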
Luo says he hopes to make the program more sensitive by teaching it to further distinguish a negative emotion as, for example, sadness or anger. The paper is set to be presented at the Association for the Advancement of Artificial Intelligence (AAAI) conference in Austin, Texas.