Have you ever gotten mad at your GPS for answering back in a calm, cheerful voice while you drive around frustrated because you're lost? What if the navigation system could sense when you were angry? That is what scientists are trying to do now.
In a Cairo school basement, 24 women analyzed facial expressions on their computers, training them to recognize tense expressions such as anger, sadness, and frustration. At Cambridge University, a robot named Charles sat in a simulator, crinkling its eyebrows and trying to look interested or confused.
Now studies are being run in classrooms to read students' emotions. Researchers want to see if a computer system can tell whether a student is interested in learning or getting bored. Such a system could also help a teacher notice when his or her students are losing interest in a topic and decide when it is time to change the lesson plan to be more engaging.
This technology could also be useful in airports. Current security systems cannot tell what someone might be hiding, such as their intentions in the airport or on a flight. A system that could flag a person acting suspiciously or nervously might help airports cut down on crime.
The technology could also help autistic children. If a machine can help these children read, one day it might read the child instead: perhaps it would notice the child getting frustrated and offer support. And since some autistic children cannot speak, these machines could help give voice to their emotions.
Rosalind Picard, who directs a research group at MIT, said she has been working on this project for more than two decades. She is trying to translate emotions into 1's and 0's, the language of machines.
In an earlier project, she and her collaborator Rana el Kaliouby designed glasses for people with Asperger syndrome, a mild variant of autism. The glasses warned wearers when they were boring the person they were talking to.
Their recent project directs the system to track 24 different points on the face to detect emotion. They needed thousands of examples of facial expressions so the system could tell the many different faces and expressions apart without confusion. For example, someone can smile sarcastically with gritted teeth, or someone can smile genuinely.
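The basic idea, training on many labeled examples of facial measurements so new faces can be matched to the closest known expression, can be sketched in a few lines. This is a toy illustration, not the researchers' actual model: the feature names, the training numbers, and the nearest-centroid approach are all assumptions made up for the example; only the notion of measuring points on the face comes from the article.

```python
import math

# Toy training set: (features, label). The features are hypothetical
# measurements one might derive from tracked facial points, e.g.
# (mouth_corner_lift, eye_crinkle). All values here are invented.
TRAINING = [
    ((0.8, 0.9), "genuine smile"),   # eyes crinkle along with the mouth
    ((0.7, 0.8), "genuine smile"),
    ((0.8, 0.1), "sarcastic smile"), # mouth lifts but the eyes stay flat
    ((0.9, 0.2), "sarcastic smile"),
    ((0.1, 0.0), "neutral"),
    ((0.0, 0.1), "neutral"),
]

def centroid(points):
    """Average each feature across a list of feature tuples."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def train(examples):
    """Build a nearest-centroid classifier: one average face per label."""
    by_label = {}
    for features, label in examples:
        by_label.setdefault(label, []).append(features)
    return {label: centroid(pts) for label, pts in by_label.items()}

def classify(model, features):
    """Return the label whose centroid is closest to the new face."""
    return min(model, key=lambda lbl: math.dist(model[lbl], features))

model = train(TRAINING)
print(classify(model, (0.85, 0.15)))  # mouth up, eyes flat -> "sarcastic smile"
```

With thousands of real examples instead of six made-up ones, the same match-to-the-closest-known-expression idea is what lets a system separate a sarcastic smile from a genuine one.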
One reason a computer that detects emotions could be helpful is that it might one day provide better learning experiences. Right now, nothing can detect whether a child is bored or befuddled. Such a system could give the bored child more challenging problems and simpler problems to the one who is stressed.
Computers operate in a basic language of 1's and 0's. Scientists are now figuring out how to encode our emotions in that language so computers can gain a better understanding of what we feel.
Are you ready to let robots or computers detect your mood? Do you think this can help us understand the people around us better?
Article used: http://www.nytimes.com/2012/10/16/science/affective-programming-grows-in-effort-to-read-faces.html?pagewanted=2&ref=technology