‘K-9’ is a robot reminiscent of familiar characters like R2-D2 from Star Wars or EVE from WALL-E. Although far more imposing at 5’3” and 400 pounds, K-9 has the same white, rounded figure, lumbering along at 3 MPH. However, K-9 also serves a much more authoritarian purpose than its friendly movie counterparts. Knightscope, the company that developed and rents out K-9, was co-founded by Stacy Stephens, a former police officer who imagines the place of robots in the future of security and surveillance in unsettling terms:
“It’s just about having an authoritative presence…When I was a police officer one of the first things they teach you is you have to have a use of force, and that first use of force is a commanding presence.”
Public organizations in cities such as San Francisco and Washington, D.C., have recently begun using these security robots to patrol their streets and ward off the houseless. Renting a robot, at approximately $6 an hour, is seen as an attractive cost-saving alternative to hiring a human security guard.
However, the use of robotic security came under increased scrutiny last month when the San Francisco SPCA rented K-9 units and suggested that the robots were an effective deterrent against the local houseless population. “We weren’t able to use the sidewalks at all when there’s needles and tents and bikes, so from a walking standpoint I find the robot much easier to navigate than an encampment,” Jennifer Scarlett, the S.F. SPCA’s president, told the San Francisco Business Times. Scarlett’s classist statements brought the issue of robotic security to national attention, sparking widespread debate over who has the right to public spaces.
Choosing Empathy Over Fear
The problems with Knightscope and robotic policing are not too far afield from the problems with overpolicing in general. The SPCA cited recent break-ins and vandalism as a primary driver for piloting the Knightscope program, demonstrating how policing is often more concerned with the rights of private property than with the safety and wellbeing of local houseless citizens. Knightscope capitalizes on stoking our fear of the other, rather than attempting to address any of the underlying drivers of crime and houselessness (poverty, institutional racism, displacement, etc.).
The conflict between the SPCA and the local houseless population is emblematic of San Francisco’s tendency to court wealth through its technology-driven economy and the ensuing gentrification that is pushing many long-time residents out of their homes and into the streets. According to former Housing and Urban Development secretary Shaun Donovan, it costs approximately $40,000 per year in taxpayer money for a houseless person to be on the streets. Choosing to pay for highly advanced robots that harass the houseless, rather than using that money to fix the issues that lead to poverty, reflects the twisted logic of capitalism. Giving people the food, shelter and care that they need to survive could literally pay for itself. This makes the increasing use of Knightscope all the more disturbing.
On its own website, Knightscope claims that its security robots will not replace security guards, but the fact is that these robots carry out a function that would otherwise have been filled by human beings. Moreover, Knightscope robots can be rented for less than half the hourly rate of a security guard, a clear example of good jobs being outcompeted by machines. In fact, according to a recent article in Fortune, robots are predicted to take nearly 40% of all American jobs by the early 2030s. As a result, this nation faces an increasingly urgent choice: whether to leave livelihoods for our children, or to rob future generations for the sake of literal corporate machines. Knightscope’s robotic security does not just hurt the houseless; the whole community suffers.
Of course, there is also a third option. Robots that replace human jobs are largely detrimental because of capitalism, which elevates the few elites who control the robots at the expense of everyone else. The same robots under socialism would be a cause for celebration, as they would free us from labor that machines can now perform. Notably, however, this does not negate the threat of Knightscope’s robotic security. Whether over-policing is carried out by people or machines, it is an issue that has proven time and again to be classist, racist, and ultimately, dangerous to everyone involved.
Using autonomous robots to monitor the public at all times also reveals a complete lack of public trust. As Stephens admitted, the purpose of the robot is not to intervene in any criminal activity, but simply to create an atmosphere of constant surveillance: an implementation of a 21st-century panopticon. In the words of Stephen Fry, “You are who you are when nobody’s watching.” Thus, without privacy, we lose the ability to truly know ourselves and to develop our identities as free individuals. Knightscope may be a mere symptom of America’s issues with over-surveillance, but it seems impossible that even the most innocent person could have their actions constantly recorded by law enforcement, under the threat of arrest, without being negatively affected by an environment of relentless suspicion. When we foster a culture of fear rather than empathy, we begin perceiving the world in false binaries: the innocent versus the criminal, those who need to be protected versus those who present a threat. It is within these stark dichotomies that houseless people, often the most vulnerable in our society, become criminalized for simply trying to live.
To make matters worse, Knightscope’s track record is checkered at best. In 2016, for example, one robot knocked down a toddler and ran over his foot, apparently without noticing what it had done. Then, in 2017, another robot committed “suicide” by driving straight into a fountain. As easy as it would be to mock Knightscope for these malfunctions, its mistakes have put people in very real danger. Robotic security cannot be reasoned with, it cannot be made error-free, and ultimately, that means it cannot be trusted.
Human vs. the Machine
Thus, beyond the obvious issues of classism, Knightscope presents a grave threat to American livelihoods, community trust, and even our most personal identities. As a result, the real question is this: how do we defend ourselves? The best solution, of course, is to get rid of robotic security altogether. As citizens, we have an obligation to stand up against the criminalization and mistreatment of our houseless neighbors. Anything less makes us complicit bystanders to injustice.
Calling your representatives to demand legislation that restricts and opposes robotic security is a good place to start. The rise of the surveillance state has virtually eliminated the right to privacy already, and without regulation, the advent of artificial intelligence may prove to be the finishing blow. Indeed, the City of San Francisco has already ordered the S.F. SPCA to stop using these security robots altogether or face a fine of $1,000 per day for operating in a public right of way without a permit. However, permits are not difficult to acquire, and still more regulation will be necessary as robotic surveillance continues to evolve.
But even if reason fails, we do not have to accept this dystopia without a fight. While this article certainly cannot advocate for anything illegal, it is worth acknowledging the option of civil disobedience. After all, if cities will not listen to citizens’ demands to keep robotic security out of public spaces, then citizens may feel they have little choice but to remove the robots on their own. In fact, people have already begun taking such actions, with several acts of vandalism committed against K-9.
Of course, simply smashing a 400-lb robot is easier said than done. The robots are equipped with 360-degree vision and the ability to recognize license plates and identify nearby mobile devices. The solution, therefore, is to use these strengths against them. For example, rather than directly attacking a robot, which would be difficult to destroy—or even knock down—before the police arrived, one could instead opt to target the charging stations that are scattered throughout the city. The robots’ ability to identify mobile devices also makes them potentially vulnerable to hacking, though this would not be advisable for anyone other than an expert who knows what they’re doing. Finally, the robots are slow, with a maximum speed of approximately 3 MPH, and no offensive capabilities beyond calling for help. As a result, the simplest way to dispatch a robot would be by attacking the sensor systems—for instance, by spray-painting the cameras or smashing through these vulnerable points with a screwdriver.
Now, in a more literal sense than ever, it may be time to rage against the machine.