A computer is being taught to interpret human emotions based
on lip patterns, according to research published in the International Journal of
Artificial Intelligence and Soft Computing. The system could improve the way we
interact with computers and perhaps allow disabled people to use computer-based
communications devices, such as voice synthesizers, more effectively and more
efficiently.
Karthigayan Muthukaruppan of Manipal International University
in Selangor, Malaysia, and co-workers have developed a system that uses a genetic
algorithm, one that improves with each iteration, to match irregular
ellipse fitting equations to the shape of the human mouth displaying different
emotions. They have used photos of individuals from South-East Asia and Japan
to train a computer to recognize the six commonly accepted human emotions -
happiness, sadness, fear, anger, disgust, and surprise - and a neutral expression.
The algorithm analyzes the upper and lower lips as two separate ellipses.
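The paper's exact fitting procedure isn't reproduced here, but the core idea - using a genetic algorithm to evolve ellipse parameters toward a lip contour - can be sketched as follows. The point set, the (center, semi-axes) parameterization, and all GA settings below are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def ellipse_error(params, points):
    """Mean deviation of points from the implicit ellipse equation
    ((x-cx)/a)^2 + ((y-cy)/b)^2 = 1. Zero means a perfect fit."""
    cx, cy, a, b = params
    return sum(abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1.0)
               for x, y in points) / len(points)

def fit_ellipse_ga(points, generations=250, pop_size=60, seed=0):
    """Evolve ellipse parameters (cx, cy, a, b) to fit a point contour."""
    rng = random.Random(seed)

    def random_individual():
        # Hypothetical search ranges chosen to cover the sample contour.
        return [rng.uniform(-1, 1), rng.uniform(-1, 1),
                rng.uniform(0.1, 3.0), rng.uniform(0.1, 3.0)]

    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the best quarter of the population as elites.
        pop.sort(key=lambda p: ellipse_error(p, points))
        elite = pop[: pop_size // 4]
        children = []
        while len(elite) + len(children) < pop_size:
            # Crossover: average the genes of two elite parents.
            p1, p2 = rng.sample(elite, 2)
            child = [(g1 + g2) / 2 for g1, g2 in zip(p1, p2)]
            # Mutation: perturb one randomly chosen gene.
            child[rng.randrange(4)] += rng.gauss(0, 0.05)
            # Keep semi-axes strictly positive.
            child[2] = max(child[2], 1e-3)
            child[3] = max(child[3], 1e-3)
            children.append(child)
        pop = elite + children

    pop.sort(key=lambda p: ellipse_error(p, points))
    return pop[0]

# Synthetic "upper lip" arc: the top half of an ellipse with
# center (0, 0), semi-axes a = 2 and b = 0.5.
points = [(2 * math.cos(t), 0.5 * math.sin(t))
          for t in (i * math.pi / 20 for i in range(21))]
best = fit_ellipse_ga(points)
```

In this sketch each mouth contour gets its own fitted ellipse, matching the article's description of the upper and lower lips being modeled separately; a full system would repeat the fit per lip and feed the resulting parameters to an emotion classifier.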
