January 11, 2016

Will computers ever truly understand what we’re saying?

Asimo, built by Honda and among the world’s most advanced robots, can run, climb
stairs and converse, but it still confuses a hand raised to ask a question with a
hand raised to take a photo. Courtesy of Honda.

From Apple’s Siri to Honda’s robot Asimo, machines seem to be getting better and better at communicating with humans.

But some neuroscientists caution that today’s computers will never truly understand what we’re saying because they do not take into account the context of a conversation the way people do.

Specifically, say University of California, Berkeley, postdoctoral fellow Arjen Stolk and his Dutch colleagues, machines don’t develop a shared understanding of the people, place and situation – often including a long social history – that is key to human communication. Without such common ground, a computer cannot help but be confused.

“People tend to think of communication as an exchange of linguistic signs or gestures, forgetting that much of communication is about the social context, about who you are communicating with,” Stolk said.

As two people conversing rely more and more on previously shared concepts,
the same area of their brains – the right superior temporal gyrus
– becomes more active (blue is activity in the communicator, orange in the interpreter).
This suggests that this brain region is key to mutual understanding as people
continually update their shared sense of the conversation’s context.

The word “bank,” for example, would be interpreted one way if you’re holding a credit card but a different way if you’re holding a fishing pole. Without context, making a “V” with two fingers could mean victory, the number two, or “these are the two fingers I broke.”
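The point can be sketched as a toy lookup table (illustrative only; the function and sense labels are hypothetical and not drawn from any real system): without a context key, the same word or gesture has no single correct interpretation.

```python
# Toy sketch of context-dependent interpretation. The senses and
# contexts below are hypothetical illustrations of the article's examples.

def interpret(signal, context):
    """Return a plausible sense of a word or gesture given its context."""
    senses = {
        ("bank", "holding a credit card"): "financial institution",
        ("bank", "holding a fishing pole"): "edge of a river",
        ("V sign", "end of a match"): "victory",
        ("V sign", "ordering at a bar"): "the number two",
    }
    # With no matching context, the signal stays ambiguous -- the
    # situation the article argues today's machines are stuck in.
    return senses.get((signal, context), "ambiguous")

print(interpret("bank", "holding a credit card"))   # financial institution
print(interpret("bank", "holding a fishing pole"))  # edge of a river
print(interpret("bank", None))                      # ambiguous
```

The sketch deliberately hard-codes its "common ground": the article's argument is that humans build this table dynamically from shared history, which is exactly what such a static lookup cannot do.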

“All these subtleties are quite crucial to understanding one another,” Stolk said, perhaps more so than the words and signals that computers and many neuroscientists focus on as the key to communication. “In fact, we can understand one another without language, without words and signs that already have a shared meaning.”

A game in which players try to communicate the rules without talking or
even seeing one another helps neuroscientists isolate the parts of
the brain responsible for mutual understanding.

Babies and parents, not to mention strangers lacking a common language, communicate effectively all the time, based solely on gestures and a shared context they build up over even a short time.
