When the First Industrial Revolution was taking off in England, many factories began replacing the men, women and children who worked there with automated machines. These machines were expensive but, over time, produced far more for far less, and thousands were put out of work as a result. In response, a group of workers acting under the name of a 'Captain Ludd' broke into factories and destroyed the machines that had taken their jobs. These people were called Luddites, and they were eventually tried on charges that would equate, in the modern day, to terrorism. Now, over two centuries later, Luddites have re-emerged, not necessarily in name, but certainly in practice.
Violence against robots is on the rise, and this is proving problematic not only for the creators of these robots but also for the general public at large. For the creators, the rising violence is scaring consumers off investing in new robots. Take, for example, San Francisco, where a pet shelter acquired a security robot to reduce the number of break-ins. The result was the robot being kidnapped and having barbecue sauce smeared all over it. Worse still was the rise in threats towards the shelter when news of the kidnapping came out: hundreds of messages encouraging violence and vandalism towards a pet shelter, simply because it had a security robot. It finally pushed the shelter's president to return the robot to its manufacturer. Nor is this the only example of Knightscope's security robots being targeted by members of the public; the company has received reports of everything from childish vandalism to a drunken Californian man tackling one of its robots to the ground and taking it out of commission.
This string of occurrences, while disconcerting, highlights the socio-economic fears that humans hold about robots, and is understandable from a certain perspective: these robots were replacing the job of a security guard. Troublingly, though, this is not the only kind of violence directed at robots. In 2014, Canadian researchers created hitchBOT, a hitchhiking robot, and sent it out across North America as a social experiment. The robot crossed Canada perfectly fine, but within weeks of entering the United States it was dismembered and left at the side of the road. This points to a disturbing hostility that humanity holds towards robots, one that has left social scientists and technology experts wondering why humans are so violent towards them.
As mentioned earlier, one of the most understandable explanations is the socio-economic threat that robots pose. Researchers studying technology and employment have estimated that within roughly two decades 47% of all jobs in the US could be automated, as everything from manufacturing workers to truckers, warehouse workers to tax preparers, and journalists to doctors is threatened by a tide of robotic workers willing to do these jobs with far greater efficiency than humans and at far less long-term cost to companies. These changes are already under way: Uber is developing self-driving cars, Amazon uses robots to move packages around its warehouses and, closer to home, Jaguar Land Rover has large portions of its car manufacturing done by machines. These aren't far-off visions of the future; they are happening now, and robots present the greatest economic threat to the human workforce since the Great Depression (when the US unemployment rate peaked at 24.9%). It is no wonder that people, especially those in jobs easily replaced by robots, feel threatened and lash out at robots wherever they can. We are hurtling towards a future where job advertisements will state 'Humans need not apply'.
However, there is another theory to explain the rising violence towards robots: we are already conditioned to be violent towards machines. One holdover from Luddite ideology is that we hit machines all the time. Think of when your phone doesn't work, the printer won't print or your car breaks down. We hit the machine much as we might hit a human who annoyed us, but unlike with humans there are no repercussions for hitting a machine, so it feels more acceptable and we do it more often. So does its being a robot make it any different from hitting and attacking any other machine, or is there something fundamentally different about robots as opposed to other technology?
In short, should robots have rights? This would be one solution to human-on-robot violence, as it would ensure that robot workers, and specifically AI-driven robots, were guaranteed the same protections as humans. It would reduce the rising violence, as people could then be prosecuted for, say, manslaughter, or 'robot-slaughter', when a robot is destroyed. However, such a decision would generate significant backlash in political and religious communities across the world, as it would essentially state that robots are on a par with humans, and for many that would be a step too far.
There is a different solution to reducing violence against robots, one that has been shown to work in studies and that relies, ironically, on our own genetic programming. The solution: make robots look more human. Studies have shown that simply putting eyes on a donation box makes people donate more, because the eyes make us biologically kinder. The robots mentioned earlier that were attacked looked nothing like a human and had no real 'eyes', and perhaps that is where their creators went wrong. Although a human-looking robot might be less efficient, this would provide a cheaper and quicker alternative for reducing the rising violence against robots.
Hopefully humanity will finally be able to escape the shadow of the Luddites and take steps towards the new frontier of robotics: a future where humanity's greatest problems, such as insufficient healthcare, food shortages and climate change, can be solved by harnessing the power of robots for good and without prejudice.
George Bailey is the Science and Technology editor of the Warwick Globalist and a third-year history student currently studying at Northern Arizona University.