Should autonomous vehicles be programmed to break the law?

Should autonomous cars be programmed to break the law, and if so, under what circumstances and to what extent?

As autonomous vehicles pull out onto the roads in greater numbers, issues of safety and morality will inevitably be raised. Extreme ethical scenarios are being considered, like deciding whether a car will swerve to avoid hitting a pedestrian if it means killing its own occupants, but not all of these questions are a matter of life and death. Stanford researchers are looking into how to ethically program cars to break minor laws if required.

A survey on driverless car safety conducted by MIT paints a grim picture: people generally agree that autonomous cars should be designed to minimize casualties, but most wouldn't buy a car that might sacrifice them for the greater good. Thankfully, these scenarios, while important to plan for, should be rare occurrences.

The project at Stanford is considering more minor ethical issues, which may have less severe consequences but will crop up more often. For example, if an autonomous car approaches an obstacle that takes up half a lane, and there's a double line in the middle of the road, what should the vehicle do? A human driver might not think twice about momentarily breaking the law and passing over the lines to get past – assuming there's no oncoming traffic, obviously – but is it right for autonomous cars to be programmed to plan ahead of time to break the law? And if so, under what circumstances, and to what extent?

Programming cars to follow the law to the letter might seem like the sensible option at first glance, but treating the rules as an inviolable constraint has its own costs.

"We can treat that as a very harsh, strict constraint, and the vehicle will have to come to a complete stop in order to not hit the obstacle," explains Sarah Thornton, a Stanford PhD candidate.

Yet human drivers routinely commit these minor infractions to maintain vehicle safety and passenger comfort, and stopping dead in the middle of the road for a wayward cardboard box might not be the safer option anyway.

Instead, autonomous vehicles may be better off balancing practicality with the need to stay on the right side of the law. The car could cross the lines but pass as close as possible to the obstacle, minimizing its law-breaking at the cost of a ride that may be, as Thornton says, "very uncomfortable for the occupant in the passenger seat." Or, if it's safe to do so, the car could veer into the opposite lane to give the obstacle a wide berth. It's these kinds of trade-offs that will need to be examined further.
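The trade-off described above can be thought of as a cost-minimization problem: instead of treating the double line as a hard constraint that forces a full stop, the rule becomes a tunable penalty weighed against comfort and road-blocking. The sketch below is purely illustrative (it is not Stanford's actual controller, and the maneuver names, cost weights, and `line_penalty` parameter are all invented for this example):

```python
# Hypothetical sketch: the double-line rule as a soft constraint.
# Candidate maneuvers and cost weights are illustrative, not calibrated.

def choose_maneuver(oncoming_traffic: bool, line_penalty: float) -> str:
    """Pick among three candidate maneuvers by total cost.

    Each candidate carries three illustrative cost components:
      - crossing:   how far the car strays over the double line
      - discomfort: how unpleasant the maneuver is for passengers
      - blocking:   whether the car ends up stopped in the road
    """
    candidates = {
        # name: (crossing, discomfort, blocking)
        "stop":         (0.0, 0.2, 1.0),  # legal, but blocks the road
        "hug_obstacle": (0.1, 0.8, 0.0),  # barely crosses, uncomfortable
        "wide_berth":   (1.0, 0.1, 0.0),  # comfortable, fully crosses
    }
    costs = {}
    for name, (crossing, discomfort, blocking) in candidates.items():
        # Crossing into an occupied oncoming lane is never acceptable.
        if oncoming_traffic and crossing > 0:
            continue
        costs[name] = line_penalty * crossing + discomfort + 2.0 * blocking
    return min(costs, key=costs.get)
```

With oncoming traffic the only admissible option is to stop; with a clear road and a low penalty on crossing, the comfortable wide berth wins; and cranking the penalty up (a "strict law" setting) pushes the car to hug the obstacle rather than stop outright. The interesting design question the researchers raise is, in effect, who gets to set that penalty.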

"The vehicles are going to be what's making the decisions now," says Selina Pan, postdoctoral scholar at Stanford. "And so we need to somehow translate social behavior, ethical behavior, into what happens once the vehicle finally takes full control."

The team demonstrates the different scenarios in the video below.

Source: Stanford University

Stanford researchers discuss the ethics of autonomous vehicles

8 comments
Eric the Red
It is legal to cross double white lines to avoid an obstacle in Australia, though of course not if there is oncoming traffic. Not sure a slow-moving car is an obstacle, but a lot of people think so.
Grumpyrelic
An interesting question. Every adult has a knife and fork to eat with. The knife can also kill people. On pain of mortal sin, the Catholic church, with a commandment, declared that no practising catholic may eat meat on Friday. They rescinded it. Did all the people who did eat meat in the past go to Hell? In the US, during the oil embargo, the speed limit dropped to 55 MPH. Imagine how much reprogramming would be required if all the cars had been autonomous then and were limited directly by law? Robots, autonomous vehicles, aircraft, ships, etc. are owned and/or built by somebody. These people are responsible. That is why we have courts of law.
Aross
Maybe the simple solution would be to have the vehicle stop but provide a user override to "bypass if safe".
piperTom
Let's say (for the sake of argument) that the purposes of traffic laws are safety and efficiency. The "program" written by the legislature for these purposes is written in Legalese, a language poorly suited to the purpose. Also, the legislators have only general information about YOUR situation when you need to make a decision. The program in the car, by contrast, is very well suited to its task and has a wealth of very specific and timely information about the circumstance. Thus the programmed car is very likely to make much superior decisions. Don't get stuck in the stone age of car safety: the car should take note that other drivers MIGHT try to obey the law; other than that, the law is of no use.
FollowTheFacts
...it's not against the law to cross over double-yellow lines...are you stupid... (double-double yellow lines are "forbidden" to cross over though...ignored by everybody, including police...)
Bob Flint
So we are finally seeing the real implications that the "autonomous vehicle" must deal with. More specifically, the legislation and current laws at all levels of authority for the safe transport of passengers and cargo.
Since there are so many conflicting laws on the books, many will need to be modified, and new ones created.
Unlikely that we will reach a consensus in the time that we think the technology will be affordable for those that are actually interested.
Case in point: the recent Tesla crash in Autopilot mode – had the speed limit been observed, the outcome would have been different.
Username
The laws are there to ensure safety while considering the lowest common denominator (idiots). This is a non-issue. The car can be programmed to choose the safest option, something humans can't be trusted to do. A lot of laws will be rendered obsolete and will need to be replaced by laws governing AI directives.
Eric the Red
From the Australian Road Rules 2014:
139 Exceptions for avoiding obstructions on a road
...
(4) A driver may drive on a dividing strip, or on or over a single continuous line, or 2 parallel continuous lines, along a side of or surrounding a painted island, to avoid an obstruction if: (a) the driver has a clear view of any approaching traffic, and (b) it is necessary and reasonable to drive on the dividing strip or painted island to avoid the obstruction, and (c) the driver can do so safely.