The New York Police Department experimented with a doglike robot, nicknamed Digidog. The robot saw action in only a few cases, including a hostage situation in the Bronx and an incident at a public housing building in Manhattan.
Once word of Digidog got out, a backlash followed from the public and, eventually, public officials. Some deemed the robot's cost excessive; some feared its use would undermine civil liberties; others thought it was just plain creepy.
In April the NYPD abruptly terminated its lease and stopped using the robot. Nevertheless, several other police departments in the U.S. have been testing similar models. “Spot has been particularly resourceful in tackling dull, dirty and dangerous tasks,” a Boston Dynamics spokesperson told Scientific American. “Public safety initiatives, including police departments, often face dangerous work, such as inspecting a bomb, rummaging through remnants of an explosion or fire, or even deescalating a potentially dangerous situation.”
What factors affect our feelings toward robots? Why, for example, is a popular robotic seal called PARO adored while Digidog is disliked?
The robot's design, the context in which it is deployed, and how users engage with it all come into play. PARO seems to boost morale and give people more opportunities for social activity. Boston Dynamics built its robots very differently: they have no face, no fur, nothing squishy about them. A thing's design shapes how people react to it.
Context is also very important. The same type of Boston Dynamics robot that got into trouble with the New York [City] Police Department drew great sympathy just [a few] years earlier, when Boston Dynamics engineers posted a video of themselves kicking the robot. People responded to those videos with a flood of emotion, saying things like "Poor Spot." The police's Digidog did not evoke that same response because of the context in which it was used. Finally, there is how users actually employ these things. A robot can be designed for a particular purpose, but if its users fail to use it that way, it can turn into something else entirely.
Are there particular aspects of robots that make humans anxious?
Fundamentally, robots move, and movement elicits strong expectations in us about what an object will do. In the 1940s [psychologists] Fritz Heider and Marianne Simmel ran studies using a short film of very simple animated shapes. Test subjects who watched it attributed a different personality to [a] triangle than to [a] square. Yet the shapes themselves did not differ in personality; what differentiated them was their movement. Robots' movements through physical space carry a great deal of meaning for us because they convey information about social positioning and expectations.
What was it about this particular robot that provoked the public backlash against the NYPD?
Again, a number of factors were involved. One is the design itself. If you've seen pictures of the robot, I think you'll agree it looks quite imposing, not much different from a science-fiction robot. And beyond its imposing profile, the way it moves through space and navigates often strikes human observers as creepy.
There’s also the context of use. The NYPD used this robot, very famously now, [at] a public housing project. That use of the robot in that place, I think, was a really poor choice on the part of the NYPD—because already you’re talking about police officers entering a public housing facility, now with this massive technological object, and that [exacerbates the] very big power imbalance that’s already there.
Thirdly, timing is everything. These developments occurred alongside increased public scrutiny of policing practices, including the militarization of police, and [how] the police have responded to minority populations in very different ways from how they have responded to white populations.
In an episode of Black Mirror, robotic dogs hunt humans, and some critics invoked science fiction like this when criticizing Digidog. How do stories shape our reaction to technology?
Addressing the question of science fiction is really crucial. Our word "robot" comes from the Czech word robota. The very notion of a robot is thus integrally linked to science fiction, and it's impossible to separate the two, because that's where the idea first originated.
The public also arrives with expectations about robots shaped by science fiction, because we encounter robots there before we encounter them in social reality. Roboticists call the use of science fiction to explain what they are building "science-fiction prototyping," and they benefit greatly from it because it lets them show their audiences what they are designing and why.
Will robots ever be accepted for use in law enforcement?
I believe a new scenario is emerging here, and whether to integrate these machines will be a vital part of police departments' decision making. Had Digidog been used to rescue someone from a burning building rather than brought to a housing project to support police action, the response would have been very different. The same is true if the robot had been used as a bomb-disposal unit. My conclusion is that not only will the robot's design affect how police interact with their communities; so will the timing, context, and positioning of how and where police use it.
Bushwick, Sophie. “The NYPD’s Robot Dog Was a Really Bad Idea: Here’s What Went Wrong.” Scientific American, https://www.scientificamerican.com/article/the-nypds-robot-dog-was-a-really-bad-idea-heres-what-went-wrong/. Accessed 21 September 2021.