
Smartphone That Knows You’re Angry


Researchers at Samsung have developed a smartphone that can detect people’s emotions. Rather than relying on specialized sensors or cameras, the phone infers a user’s emotional state from how he or she uses the device.

For example, it monitors inputs such as the speed at which a user types, how often the backspace or special-symbol keys are pressed, and how much the device shakes.

These measures let the phone postulate whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology’s Intelligence Group in South Korea. Lee led the work on the new system.

He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one’s mental state, which the software’s machine-learning algorithms can detect with an accuracy of 67.5 percent.
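To make the idea concrete, here is a minimal sketch of how typing behavior of the kind described above could be summarized into features for a classifier. The event structure, feature names, and calculations are illustrative assumptions, not Samsung’s actual implementation.

```python
# Hypothetical sketch: turning raw typing events into the behavioral signals
# mentioned in the article (typing speed, backspace use, device shake).
# All names and fields here are assumptions for illustration only.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class KeyEvent:
    timestamp: float        # seconds since the typing session began
    key: str                # character typed, "BACKSPACE", or a symbol key
    shake_magnitude: float  # accelerometer reading at the moment of the keypress

def extract_features(events: List[KeyEvent]) -> Dict[str, float]:
    """Summarize one typing session into a small feature vector."""
    if len(events) < 2:
        return {}
    duration = events[-1].timestamp - events[0].timestamp
    keys_per_second = len(events) / duration if duration > 0 else 0.0
    backspace_rate = sum(e.key == "BACKSPACE" for e in events) / len(events)
    symbol_rate = sum(not e.key.isalnum() for e in events) / len(events)
    avg_shake = sum(e.shake_magnitude for e in events) / len(events)
    return {
        "typing_speed": keys_per_second,
        "backspace_rate": backspace_rate,
        "symbol_rate": symbol_rate,
        "device_shake": avg_shake,
    }
```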

The prototype system, presented in Las Vegas at the Consumer Communications and Networking Conference, is designed to work as part of a Twitter client on an Android-based Samsung Galaxy S II. It lets people in a social network see symbols alongside tweets that indicate each person’s emotional state.

But there are many more potential applications, Lee says. The system could trigger different ringtones on a phone to convey the caller’s emotional state, or cheer up someone who’s feeling low. The smartphone might show a funny cartoon to make the user feel better, Lee says.

Further down the line, this sort of emotion detection is likely to have broader appeal. “Emotion-recognition technology will be an entry point for elaborate context-aware systems for future consumer electronics devices and services,” Lee says. “If we know the emotion of each user, we can provide more personalized services.”

Samsung’s system has to be trained to work with each individual user. During this stage, whenever the user tweets something, the system records a number of easily obtained variables, including actions that might reflect the user’s emotional state as well as contextual cues, such as the weather or lighting conditions, that can affect mood, Lee says.

The subject also records his or her emotion at the time of each tweet.  This is all fed into a type of probabilistic machine-learning algorithm known as a Bayesian network, which analyzes the data to identify correlations between different emotions and the user’s behavior and context.
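As a rough illustration of that training step, the sketch below fits a Gaussian naive Bayes model, one of the simplest forms of Bayesian network, to feature vectors paired with self-reported emotion labels. The scikit-learn library, the feature values, and the labels are assumptions made for this example; they are not data or code from the Samsung study.

```python
# Minimal sketch of the training loop described in the article, assuming
# scikit-learn's Gaussian naive Bayes as a stand-in for the Bayesian network.
from sklearn.naive_bayes import GaussianNB

# Each row: [typing_speed, backspace_rate, symbol_rate, device_shake, ambient_light]
# Values are invented for illustration; labels are the user's self-reported emotion.
X_train = [
    [4.1, 0.02, 0.01, 0.3, 0.8],  # logged while the user reported feeling happy
    [2.0, 0.15, 0.05, 1.2, 0.4],  # ... angry
    [1.5, 0.08, 0.02, 0.5, 0.2],  # ... sad
]
y_train = ["happy", "angry", "sad"]

model = GaussianNB()
model.fit(X_train, y_train)

# After training, a new typing session can be classified and the predicted
# emotion attached to the outgoing tweet.
print(model.predict([[3.8, 0.03, 0.01, 0.4, 0.7]]))  # e.g. ['happy']
```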

The accuracy is still pretty low, Lee says, but the technology is at a very early experimental stage and has only been tested using inputs from a single user. Samsung won’t say whether it plans to commercialize the technology, but Lee says that with more training data the results can be greatly improved. “Through this, we will be able to discover new features related to the emotional states of users, or ways to predict other affective phenomena like the mood, personality, or attitude of users,” he says. (NYT)




Source: Manila Bulletin