Reprint of my article published by HealthyPlace.com on January 28, 2014.
More often than not, when people see others yawning, they find themselves yawning as well. This phenomenon is known as social yawning, and it involves a deeper set of emotions: yawning in this scenario reflects a person's empathy for another. Such an instinctual display of empathy usually strengthens the social group and the relationships among individuals. However, recent research shows that contagious yawning is not always the case for people on the autism spectrum (ASD).
Research offers many explanations for the difficulty in perceiving emotions that is typical of the ASD population. The most dominant one is that autistic children tend to confuse the expressions being displayed and therefore find it difficult to interpret them successfully.
In 2011, I was visiting the MIT Media Lab and met Dr. Rosalind Picard, an MIT professor who leads a number of research projects on assistive technologies for people with autism. Dr. Picard tells us that many autistic children are brilliant at reading facial expressions if they analyze them on a computer or observe another person from a distance. The distinction, however, arises when we try to measure face-to-face interaction: an autistic child focuses hard on comprehending what we are saying when we talk to them and therefore ignores our facial expressions.
To help autistic children counter these challenges, Picard and her team at the MIT Media Lab are developing special assistive technology for expression analysis. The software uses six affective-cognitive mental states defined by Professor Baron-Cohen from the University of Cambridge: Agreeing, Concentrating, Disagreeing, Interested, Thinking and Unsure. The technology tracks facial points, monitors face transitions, records head poses and extracts facial features. As the facial expressions change, the software keeps recording the degree of each emotion seen in the different expressions. Professor Picard emphasizes the importance of dynamic analysis of face transitions: static facial expressions are not always representative of the expressed emotion, and it is the history of face transitions that gives us cues to deciphering another person. For example, if someone looks confused because they didn't understand or missed something in our speech, we might mistakenly perceive their facial expression as disagreement with our statements.
"Emotional Intelligence, Technology & Autism", Rosalind Picard, MIT
It turns out that, based on the dynamic analysis of facial transitions, the computer can easily detect what a person is feeling. When tested on different categories of contexts and behaviours, the computer software developed at the MIT Media Lab proved more successful at recognizing facial transitions than people in general. This technology is a scientific breakthrough and marks a significant step towards the availability of mainstream assistive tools for individuals with autism.
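To make the idea of dynamic analysis concrete, here is a minimal toy sketch of classifying emotion from a history of frames rather than from a single static expression. The class names, scores and logic are entirely hypothetical illustrations, not the MIT Media Lab software; the only detail taken from the article is the list of six affective-cognitive states.

```python
from collections import deque

# The six affective-cognitive states defined by Professor Baron-Cohen,
# as described in the article.
STATES = ["agreeing", "concentrating", "disagreeing",
          "interested", "thinking", "unsure"]

class DynamicExpressionAnalyzer:
    """Toy illustration (hypothetical, not the actual MIT software):
    decide on an emotion by averaging per-frame scores over a sliding
    window, so one misleading static frame does not dominate."""

    def __init__(self, window_size=5):
        # Keep only the most recent `window_size` frames of scores.
        self.history = deque(maxlen=window_size)

    def add_frame(self, scores):
        """scores: dict mapping a state name to a 0..1 confidence
        for one video frame (here, just made-up numbers)."""
        self.history.append(scores)

    def current_state(self):
        """Return the state with the highest average score across
        the window, i.e. across the history of face transitions."""
        if not self.history:
            return None
        totals = {s: 0.0 for s in STATES}
        for frame in self.history:
            for s in STATES:
                totals[s] += frame.get(s, 0.0)
        return max(totals, key=totals.get)

analyzer = DynamicExpressionAnalyzer(window_size=5)
# Four frames that mostly read as "unsure" (the confused listener
# from the article's example)...
for _ in range(4):
    analyzer.add_frame({"unsure": 0.8, "disagreeing": 0.1})
# ...plus one frame that, viewed statically, looks like disagreement.
analyzer.add_frame({"disagreeing": 0.9})
print(analyzer.current_state())  # the window still reads "unsure"
```

The single "disagreeing" frame is the static snapshot that would mislead a human observer; averaging over the transition history recovers the more likely state.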
Dr. Mari Davies and Dr. Susan Bookheimer, neuropsychology researchers from the University of California, Los Angeles, conducted a study comparing the brain activity of 16 typically developing children and 16 high-functioning autistic children. While undergoing functional magnetic resonance imaging (fMRI), the children were shown a series of faces with angry, fearful, happy and neutral expressions. Half of the faces had their eyes averted; the other half stared directly back at the children.
It was found that the ventrolateral prefrontal cortex (VLPFC), the part of the brain which evaluates emotions, became active in the typically developing children when the direct-gaze faces came up and quieted down when the averted-gaze faces were displayed. The autistic children, however, showed no reaction to either set of faces. This suggests that autistic children do not perceive any difference in emotion whether a face stares back at them or looks away from them.
Recognizing emotions is second nature to typically developing children; for autistic children, however, it is a very difficult process. Yet autistic children are often able to recognize simple emotions. In a study conducted by Professor Baron-Cohen, it was found that autistic children could make out faces that showed happy or sad emotions but had difficulty identifying faces carrying expressions of surprise or fear.
According to Dr. Angelique Hendriks from Radboud University, the reason for this deficiency could be weak central coherence. This term refers to the inability of autistic children to combine the pieces of information or signals they receive into one coherent picture. As a result, they treat different pieces of information separately and are unable to connect and relate them to the situation at hand.
In her PhD research at Macquarie University, Dr. Ellie Wilson tested whether autistic children can match images of faces to real-life people. The study demonstrated that the key difference from neurotypical children lies in the way autistic children move their eyes around the face. Training might improve their recognition skills, though the results from the few training studies conducted so far haven't been particularly convincing.
Among the many challenges faced by autistic children, the inability to read facial expressions is one of the most serious and pressing. Researchers and technologists are working together to develop tools that will aid the learning of autistic children and help them navigate the social world.