When machines turn against us

Who is to blame when autonomous machines and vehicles go awry and injure or kill humans? A landmark case in the US could decide how this technology evolves in the future.

Mark Anthony
3 min read · Jan 24, 2022

Just after midnight on 29 December 2019, a Honda Civic pulled up to an intersection in the Los Angeles suburb of Gardena. The traffic light was green. As the car proceeded through the intersection, another car exited a freeway, ran through a red light, and crashed into the Honda. Both the driver and the passenger in the Honda Civic were killed instantly.

Tragic though the incident was, the death of two people in a road traffic accident is hardly unusual. But what marks this case out as unique is that the car that ran the red light was a Tesla Model S being driven in Autopilot mode.

The defendant in the case — limousine driver Kevin George Aziz Riad — is thought to be the first person to be charged with a felony in the US for a fatal crash involving a motorist using a partially automated driving system.

For a time, it appeared that prosecutors might pursue car maker Tesla, but that now seems unlikely to happen. Instead, in what is set to be a landmark case, the driver has been charged with two counts of vehicular manslaughter. He has pleaded not guilty.

All of this came to light in the same week that the organisers of the massive Bauma exhibition in Munich announced a major multi-lingual seminar dedicated to autonomous machinery for construction and demolition.

The concept of autonomous machines is not new. Autonomous Caterpillar trucks have hauled well over three billion tonnes of material in mines across the world. And they have done so without incident. The notion of autonomous machines was also a key element of my novel Demolition 2051: not because I thought the technology might one day exist, but because it already exists right now.

The difference, of course, is that those massive Caterpillar trucks are operating on vast and remote mines where there are no pedestrians and very few people. An autonomous machine operating on a demolition or construction site would not have the luxury of such splendid isolation.

The technology required for machines to conduct — say — a top-down demolition exists right now. And just this week, it was reported that a robotic paving slab cutting process developed by Eurovia UK is nearly ready to be used on site.

Unquestionably, such technology brings with it huge potential benefits. Robots do not require breaks, nor do they take time off for sickness, holidays or pandemics. Robots do precisely as they are told, which likely means greater productivity coupled with lower fuel consumption. And, let’s be honest, robots are cool; and a truly autonomous demolition robot would be cooler still.


But while boffins and engineers push the boundaries of what machines can accomplish without human intervention, they would do well to heed the Three Laws of Robotics set out by science fiction author Isaac Asimov almost eighty years ago.

First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
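
For the engineers among us, the laws read like a strict priority ordering. Here is a minimal, purely illustrative sketch in Python of how that precedence might be expressed; the Action type, the permitted function and all of its boolean checks are invented for this example and come from no real robotics system:

```python
# Toy sketch of Asimov's Three Laws as a strict priority ordering.
# Purely illustrative: the Action type and its flags are invented for
# this example, not taken from any real autonomy stack.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool = False            # would executing this injure a human?
    endangers_by_inaction: bool = False  # would *skipping* it let a human come to harm?
    ordered_by_human: bool = False       # was this action commanded by a person?
    endangers_robot: bool = False        # would it destroy the robot itself?

def permitted(action: Action) -> bool:
    # First Law dominates everything: never harm a human...
    if action.harms_human:
        return False
    # ...and never allow harm through inaction.
    if action.endangers_by_inaction:
        return True
    # Second Law: obey human orders (already known not to breach the First Law).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, but only once the first two laws are satisfied.
    return not action.endangers_robot

# An ordered action that would harm a human is refused, because
# the First Law outranks the Second.
print(permitted(Action("run the red light", harms_human=True, ordered_by_human=True)))  # False
```

Real autonomy systems are, of course, nothing like this simple; deciding what counts as "harm" in the first place is precisely the hard part.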

Until those laws are fully embraced and incorporated, maybe autonomous machines and cars should remain confined to large mines or works of fiction.

Mark Anthony is the founder of DemolitionNews.com.

Mark is a journalist, author, podcaster and daily live-streamer specialising in the field of demolition and construction.