Julie Carpenter Studies Human-Robot Attachment Among Military Personnel

Too busy to vacuum your living room? Let a Roomba do it. Don't want to risk a soldier's life to disable an explosive? Let a robot do it.

It's becoming more common to have robots sub in for humans to do dirty or sometimes dangerous work. But researchers are finding that in some cases, people have started to treat robots like pets or friends, or even as an extension of themselves. That raises a question: if a soldier attaches human- or animal-like characteristics to a field robot, can it affect how they use it? What if they "care" too much about the robot to send it into a dangerous situation?

That's what Julie Carpenter, who just received her UW doctorate in education, wanted to know. She interviewed Explosive Ordnance Disposal military personnel - highly trained soldiers who use robots to disarm explosives - about how they feel about the robots they work with every day. Part of her research involved determining if the relationship these soldiers have with field robots could affect their decision-making ability and, therefore, mission outcomes. In short, even though the robot isn't human, how would a soldier feel if their robot got damaged or blown up?

What Carpenter found is that troops' relationships with robots continue to evolve as the technology changes. Soldiers told her that attachment to their robots didn't affect their performance, yet they acknowledged feeling a range of emotions, such as frustration, anger and even sadness, when their field robot was destroyed. That makes Carpenter wonder whether battlefield outcomes could be compromised by human-robot attachment, or by the feeling of self-extension into the robot that some operators described. She hopes the military considers these issues when designing the next generation of field robots.

Carpenter, who is now turning her dissertation into a book on human-robot interactions, interviewed 23 explosive ordnance personnel - 22 men and one woman - from all over the United States and from every branch of the military.

These troops are trained to defuse chemical, biological, radiological and nuclear weapons, as well as roadside bombs. They provide security for high-ranking officials, including the president, and are a critical part of security at large international events. The soldiers rely on robots to detect, inspect and sometimes disarm explosives, and to do advance scouting and reconnaissance. The robots are thought of as important tools to lessen the risk to human lives.

Some soldiers told Carpenter they could tell who was operating the robot by how it moved. In fact, some robot operators reported they saw their robots as an extension of themselves and felt frustrated with technical limitations or mechanical issues because it reflected badly on them.

The pros of using robots are obvious: They minimize the risk to human life; they're impervious to chemical and biological weapons; they don't have emotions to get in the way of the task at hand; and they don't get tired like humans do. But robots sometimes have technical issues or break down, and they lack humanlike mobility, so it's sometimes more effective for soldiers to work directly with explosive devices.

Researchers have previously documented just how attached people can get to inanimate objects, be it a car or a child's teddy bear. While the personnel in Carpenter's study all defined the robots as mechanical tools, they also often anthropomorphized them, assigning the robots human- or animal-like attributes, including gender, and displaying a kind of empathy toward the machines.

"They were very clear it was a tool, but at the same time, patterns in their responses indicated they sometimes interacted with the robots in ways similar to a human or pet," Carpenter said.

Many of the soldiers she talked to named their robots, usually after a celebrity or current wife or girlfriend (never an ex). Some even painted the robot's name on the side. Even so, the soldiers told Carpenter the chance of the robot being destroyed did not affect their decision-making over whether to send their robot into harm's way.

Soldiers told Carpenter their first reaction to a robot being blown up was anger at losing an expensive piece of equipment, but some also described a feeling of loss.

"They would say they were angry when a robot became disabled because it is an important tool, but then they would add 'poor little guy,' or they'd say they had a funeral for it," Carpenter said. "These robots are critical tools they maintain, rely on, and use daily. They are also tools that happen to move around and act as a stand-in for a team member, keeping Explosive Ordnance Disposal personnel at a safer distance from harm."

The robots these soldiers currently use don't look at all like a person or animal, but the military is moving toward more humanlike and animal-like robots, which would be more agile and better able to climb stairs, maneuver in narrow spaces, and cross challenging natural terrain. Carpenter wonders how that human or animal-like appearance will affect soldiers' ability to make rational decisions, especially if a soldier begins to treat the robot with affection akin to a pet or partner.

"You don't want someone to hesitate using one of these robots if they have feelings toward the robot that goes beyond a tool," she said. "If you feel emotionally attached to something, it will affect your decision-making."

http://www.washington.edu
