It looks like everyone wants to shock. There is no dilemma. An autonomous driving system should behave like a "perfect" taxi or limousine driver: it only needs to obey the driving rules. There are rules that limit speed in cities, and rules that require adapting speed to road conditions. Autonomous vehicle sensors are also far better than human vision, even before considering radar and lidar systems. Putting these facts together, the chance of a "proper" autonomous vehicle being caught in a dilemma situation is negligible, and in that case it should simply follow the law. For what it's worth, a vehicle radar system is actually capable of "seeing" vehicles and pedestrians that are "masked" by other vehicles long before a human driver would notice them, such as a person stepping toward the road from between two parked cars.
You can never eliminate 100% of the risk. There will always be some element of it, even in the best-designed scenarios. Incidentally, another reason so many people fear flying (aside from intense media attention) is that passengers don't feel in control --they are not the ones flying (driving) the plane. Commercial aviation is by far the safest way to travel, yet many people are still anxious flyers despite all of the reassuring statistics.
Watch what happens when we have our first fatal AV accident --it is likely to strike terror into the hearts of the public unless there is always a way for a passenger to take control when needed.
We human beings seem to fear loss of control much more than its consequences.
Claus. You're dreaming like Santa Claus. That's NOT the real world with real situations. The situations that arise are NOT negligible. I can't figure out for the life of me how you can't see how unreasonable your position is. I have had to take action to save lives MANY times in my nearly 50 years of driving. At least a dozen times, maybe more. And I don't drive millions of miles a year. Autonomous vehicles might easily drive millions of miles in their lifetimes, so they will see hundreds of these situations at the very least. You must have been raised in a small town with few people, because you don't know the reality of the far and wide variety of landscapes that exist --in the mountains, for example: rock slides and snow slides, trucks that lose their brakes, bikers that go too fast and cause accidents, etc., etc., etc. You're dreaming of an unreal world.
Moot point, since the "perfect" autonomous vehicle does not get into those situations. Humans, on the other hand, typically will not even think of the options, as in most cases they will endeavor to remain unharmed and blissfully unaware of the possibility of death. As for running down the pedestrians at that intersection: faced with plowing into a bunch of people versus hitting a concrete wall, most drivers would not even think about the wall; they would simply try to stop, or at least slow down enough to minimize the carnage.
Regardless of the ethics involved, we will be forced to follow the path of least legal risk by the insurance companies.
There is a huge difference between a system that will stay between the lines and slow down or stop for vehicles or people in front of it, and one that does "everything else". I've worked with technology enough to watch it fail over and over again at simple things it has had years of practice doing. I think people who are optimistic that we will have autonomous cars in a couple of years, and that those cars will be infallible, are in for some major disappointment.
There are so many sensors and redundant sensors whose data must be collected and analyzed by systems and redundant systems. There are so many factors in decision making, and so much difficult image recognition that requires a lot of computing. Managing the sheer complexity of a 100% autonomous system would require far more work and code than a modern operating system. I just don't see a code base that complex ever being free of bugs, and some of the strange scenarios I have encountered while driving would be really hard to teach an autonomous system to account for.
Brian M
I think the solution to the moral dilemma here is to follow the route of least blame, or to ask how a court would judge a human driver in the same situation.
i.e. If action is taken that is not the fault of the driver, then they can't be held responsible for the consequences of acting to preserve their own life.
A car driver or AV is not the pilot of a fighter aircraft deciding to make the ultimate sacrifice by steering the plane away from a crowded area instead of ejecting and surviving.
The scenarios posed here are a bit dubious anyway --for example, swerving into pedestrians to avoid a falling load? The AV would have to be driving too close and at too high a speed with pedestrians close at hand, so it is a scenario that should never occur. i.e. It should be able to brake, or to swerve clear, without serious harm to anyone.

Obviously, people forget how to drive once they leave the vehicle. They might do better to write on these topics while doing all their work riding in a driverless vehicle, instead of speculating about how to avoid keeping a safe distance and a properly adjusted speed wherever such potential obstacles could be registered to exist. There is no reason a computer could not plan a superior safety strategy without getting bogged down in this bizarre scenario.
The automobile is a device for individuals to privatize public space. The AV debate you cover so well suggests to me that it is time for the public to make the decisions about how the devices used on that space are programmed. Harm reduction needs to be job one.
If "vulnerable" road users are at the bottom of the hierarchy, we will never get people to walk, cycle, or even use transit (which requires lots of walking) in numbers sufficient to make our communities convivial or our climate stable.
While the ethical questions remain, I believe that if AVs really come to pass, the question of "would you buy one that...?" will be moot. It seems to me that if people just view AVs as a way to get from here to there, then what difference does the brand or whatever make? (Much like commercial flying, where passengers never choose flights based on the aircraft making the trip.)
After all, what's the main driver (pun intended) of car purchases today? Beyond purely utilitarian needs (fits a family of six, or whatever), it's the enjoyment of driving the thing --the performance, the feel of the road beneath your seat, and so on. Except for the moron class who insist on texting while driving, and they'd be the first to want an AV anyway. But once the owner is no longer the driver, what will any aspect of the vehicle really matter? We will become a society of passengers, with Uber-like rent-a-rides available to ferry us about at will, without the hassle of car ownership.
I really fear for the motorcyclists who enjoy riding. If cars become autonomous, will we still be allowed to split lanes? Or even to ride at all, since foolhardy humans will never match the precision of AVs and might not be permitted on the same roads?