Researchers at the Massachusetts Institute of Technology (MIT) in the United States have developed a computer program that distinguishes natural from fake smiles, outperforming human observers at identifying smiles of frustration.
To do this, the system uses an algorithm that tracks the movements of the muscles involved in facial expressions, distinguishing spontaneous gestures from deliberate ones. The technology could be applied in the education of children with autism.
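As a rough illustration of the idea (not the researchers' actual method), an algorithm of this kind might reduce each webcam recording to a smile-intensity time series and summarize it with simple temporal features. The function and feature names below are hypothetical:

```python
# Toy sketch: summarize a smile-intensity time series (one value per video
# frame, 0.0 = neutral, 1.0 = full smile) with simple temporal features.
# Names and choices are illustrative, not taken from the MIT system.

def smile_features(intensities, fps=15):
    """Return peak intensity, time to peak (s), and time above half-peak (s)."""
    peak = max(intensities)
    peak_idx = intensities.index(peak)
    time_to_peak = peak_idx / fps          # how quickly the smile builds
    frames_above = sum(1 for v in intensities if v >= peak / 2)
    duration = frames_above / fps          # how long the smile persists
    return {"peak": peak, "time_to_peak": time_to_peak, "duration": duration}
```

Features like these capture the dynamics of an expression rather than a single snapshot, which is what lets an automated system see differences a static photograph would hide.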
The psychology behind the act of smiling is very complex, especially when a smile is used to conceal or misrepresent something. Simulating one is far from simple, which highlights the difficulty of producing an expression when the corresponding emotion is not actually felt.
However, some people are skilled at this, and their expressions can pass as genuine to human eyes, though not to a suitably trained machine.
This has been demonstrated by a team of researchers from the Massachusetts Institute of Technology (MIT) in the United States, which carried out a study with a group of volunteers to test an automated system capable of distinguishing spontaneous, natural smiles from false or forced expressions.
As MIT explains in a statement, the research also confirmed that most people do not expect a smile in response to frustration, since it is an involuntary expression. The new algorithm differentiates smiles of joy from those caused by frustration with much greater precision than human observers.
“The goal is to help people in oral communication,” says Ehsan Hoque, a student in the Affective Computing Group at the MIT Media Lab and author of the research, whose findings were recently published in an IEEE journal.
The scientists created two test conditions to elicit two opposing affective states: frustration and joy. In the first experiment, participants were asked to act out joy or frustration while recalling appropriate situations; in the second, the researchers tried to provoke these states naturally. All sessions were recorded with a webcam.
To elicit frustration, subjects had to fill in a long online form that was automatically erased when they pressed the send button. The majority (90 percent) smiled in frustration at this, yet when asked to feign a frustrated expression, the same percentage did not smile. Similarly, genuine smiles of joy were produced when subjects were shown a video of a very cute baby, and acted smiles were collected without any emotional provocation.
These same volunteers were then asked to interpret images of people genuinely smiling or feigning a smile of joy or frustration. While human classification of frustrated smiles was rather haphazard, the algorithm achieved 92 percent accuracy. This is precisely what differentiates the study from previous work, which was usually based on feigned expressions of emotion and could therefore lead to erroneous results.
“The simulated cases are much easier to classify, but when it comes to interpreting images of real responses, people tend to respond randomly, guessing correctly in only 50 percent of cases,” said Hoque. He attributes the machine’s better performance to the fact that it focuses exclusively on the mechanical features of the smile, whereas people tend to pay attention to other signals.
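A machine that attends only to the mechanics of a smile could, for instance, separate the two expressions by how quickly the smile rises. The rule below (a fast-rising smile signals frustration) is a deliberate simplification for illustration; a real system would learn such temporal patterns from labeled recordings rather than use a fixed threshold:

```python
# Toy classifier contrasting two smile types purely by temporal shape.
# The onset-speed rule and threshold are invented for illustration and
# are not the decision rule of the MIT system.

def classify_smile(intensities, fps=15, onset_threshold=0.5):
    """Label a smile-intensity time series as 'frustration' or 'joy'."""
    peak = max(intensities)
    peak_idx = intensities.index(peak)
    time_to_peak = peak_idx / fps  # seconds from start to maximum intensity
    # Smiles that shoot up almost instantly are labeled frustration here;
    # smiles that build gradually are labeled joy.
    return "frustration" if time_to_peak < onset_threshold else "joy"
```

For example, a series that jumps straight to its peak in the first frame would be labeled frustration, while one that climbs steadily over several seconds would be labeled joy.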