(September 21, 2015) Robots are increasingly
being considered for use in highly tense civilian encounters to minimize person-to-person
contact and danger to peacekeeping personnel.
Trust, along with physical qualities and cultural considerations, is an
essential factor in the effectiveness of these robotic peacekeepers. New
research to be presented at the HFES 2015 Annual Meeting in Los Angeles in
October examines how social cues shape trust in human-robot
interaction.
Joachim Meyer, coauthor
of “Manners Matter: Trust in Robotic Peacekeepers” and a professor in Tel Aviv
University’s Department of Industrial Engineering, notes that “interactions between machines and people
should follow rules of behavior similar to the rules used in human-to-human
interaction. Robots are not seen as mindless technology; rather, they are
considered agents with intentions.”
Meyer and coauthor Ohad
Inbar asked 30 participants to report their first impressions of a humanoid
peacekeeping robot that addressed individuals with varying levels of
politeness. The scenario they evaluated depicted a robot tasked with
inspecting people attempting to enter a building.