
Autonomous Cars: Not Yet Ready for Prime Time

rootabaga

...or at least not ready for nighttime driving when a pedestrian is not in a crosswalk.

My condolences to the family, friends and loved ones of Elaine Herzberg. :(

https://www.nbcnews.com/tech/innova...r-car-involved-fatal-accident-arizona-n857941


Yes, it may turn out that the car wasn't "technically" at fault, in part because Ms. Herzberg was not in a crosswalk, but that's not the real question (nor should it be, since you can always count on people, kids, animals and other vehicles acting unpredictably at times).

The real question is whether an actual driver might have seen Ms. Herzberg and avoided killing her, or at least been able to redirect the car (and slow down) sufficiently that she might not have incurred such serious injuries.
 
I doubt it. I got hit by a car driven by a human while crossing a four-lane street (two lanes each direction). IMHO there is no way an autonomous vehicle can predict the stupid things humans sometimes do.
 
We don't know the circumstances of the accident in anything like enough detail to draw a conclusion. Hence I'll withhold judgement until I know more.

But knowing a number of people who have been involved in accidents involving human drivers, including a heavy goods vehicle driver who was a serial offender and a friend being killed a couple of months ago due to 2 drivers failing to pay attention (he was a cyclist who had the misfortune to be passing), I am far from convinced that human drivers, with their ego, impatience and inattention, are ready for prime time.
 
We don't know the circumstances of the accident in anything like enough detail to draw a conclusion. Hence I'll withhold judgement until I know more.

But knowing a number of people who have been involved in accidents involving human drivers, including a heavy goods vehicle driver who was a serial offender and a friend being killed a couple of months ago due to 2 drivers failing to pay attention (he was a cyclist who had the misfortune to be passing), I am far from convinced that human drivers, with their ego, impatience and inattention, are ready for prime time.

Couldn't agree more. And you'll never have the situation of a computer being over the drink-drive or drug-drive limit.
 
I happened to be reading this thread at the same time I was reading our local news. It occurred to me that, even with all of the sensationalist-style news reporting of ANY autonomous vehicle being in any kind of accident, the view of the implications becomes skewed. There are a minimum of 3-4 accidents reported in the daily news here, and those only include the ones with any shock value. Given the propensity for humans to make mistakes for a multitude of reasons, I would be very curious to see someone crunch the numbers to find out the ratio of successful driving episodes for humans versus autonomous vehicles. I doubt such a study would actually occur, due to a high fear factor: we might all find ourselves no longer able to legally drive unless we can do so with the same success rate as the AVs.
Anyone remember the Woody Allen movie "Sleeper"? A bit of a futuristic tale, and the vehicles were all autonomous except the old Beetle he found in a cave somewhere, which he couldn't drive for crap!
 
Hmm, well video has been released by the police (which surprises me). I'd still be cautious about judging based on a video on a small screen, which won't be the same as a live view. But with that caveat my impressions are:

* The victim is really only visible in the last second before the collision, when she stepped in front of the headlights. Now this is the thing I'm really cautious of, since live vision will be different from the video, but if that represented real visibility I doubt many human drivers would have avoided the collision - which doesn't change the fact that the car did not respond at all.

Now I don't know what sensors the car has. An active sensor like radar would not have a problem with the dark, for example. Certainly viewing this my thought was that adding infra-red cameras might be a good idea. And why it didn't react at all remains an unanswered question. But it should be possible to learn from this.

* Obviously I don't want to add to the distress of the victim's family and friends, but she really did just walk across the road in front of the vehicle as if it wasn't there. She didn't just step off the sidewalk, she was coming from the far side of the road at a steady pace. I really don't know what was going on there.

* Perhaps predictably, the driver was not paying full attention to what was going on (a pattern we've seen with accidents involving Teslas on Autopilot). She appears to have been looking at something in her lap and just glancing ahead every now and then to check. The video doesn't show her hands.

I am pretty sure Uber will try to put as much of the blame as possible on her, and given that her job was to be there to react to hazards the automatics didn't pick up, there would appear to be a case. But psychologically I can understand it: you've probably had many days of having to do nothing while the car just gets on with it, it's late and the road is quiet, so it wouldn't be hard for attention to wander. And I wonder how much video of inattentive safety drivers Uber had seen but not acted upon before there was an actual accident.

Actually the job of safety driver strikes me as taking the fundamental problem of driving - that the consequences of inattention are terrible, but the task itself isn't enough to occupy you fully - and pushing it to the limit. So IMO it should really only be given to either an engineer, who will be actively occupied in assessing the vehicle (hence less likely to lose attention), or a professional driving instructor (used to looking out for hazards the driver hasn't reacted to), and not just any driver willing to act as a warm body behind the wheel. Because staying engaged for hours with nothing to do is going to be more difficult than it sounds, and sadly you only have to look away for a moment.
 
The victim is really only visible in the last second before the collision, when she stepped in front of the headlights. Now this is the thing I'm really cautious of, since live vision will be different from the video, but if that represented real visibility I doubt many human drivers would have avoided the collision - which doesn't change the fact that the car did not respond at all.

I would expect a car equipped with electronic sensors to be able to react in milliseconds, or at least with a far better reaction speed than a human. Some evasive action should have been triggered. I don't think even computer-driven cars will completely eliminate accidents, but in this case I think Uber have some work to do to determine why the car didn't react.
 
Is it possible the car decided it was safer to save its occupant rather than the pedestrian?

Now there's a dilemma for the computer. Does it swerve to avoid a person in the road if, in doing so, it would hit a solid obstacle, putting the driver in harm's way?
 
Now there's a dilemma for the computer. Does it swerve to avoid a person in the road if, in doing so, it would hit a solid obstacle, putting the driver in harm's way?
It is a big ethical question facing the autonomous driving community and I'm sure the courts and lawyers will go a long way to deciding it.
 
Now there's a dilemma for the computer. Does it swerve to avoid a person in the road if, in doing so, it would hit a solid obstacle, putting the driver in harm's way?

IMO, I would say to avoid the person in the road. If the car hits the person there is certain to be extensive harm to that person, but if the car instead hits a solid obstacle then, at least in theory, the usual design of the car would protect the occupant from major harm. But I am no judge or jury.
 
I was thinking about this on the way home, stuck in a traffic jam caused simply by people rubbernecking at an accident on the other side of the road (something a computer would never be interested in doing).

Anyway, I have no data to back this up, but I've got a feeling that a lot of traffic accidents are caused by people not observing the rules of the road and just doing plain stupid things. That was certainly the case for the accidents I've been involved in. You know, simple things like not observing speed limits, leaving insufficient stopping distance, doing U-turns in silly places. All these rules are in the highway code. An autonomous car would observe such rules to perfection.

I'm not saying autonomous cars would eliminate accidents, as this case proves, but I reckon they would sure cut down a lot of them.

Are they as much fun as driving yourself though? That's another question altogether.
 
It is a big ethical question facing the autonomous driving community and I'm sure the courts and lawyers will go a long way to deciding it.
Not to mention marketing. Rationally there will be circumstances in which the correct thing to do will be to endanger the passenger in order to protect a larger number of others, but a car that protects the passenger first and foremost will sell better.

Of course if you think of them as a service rather than something you own yourself then the actual owner may be more inclined to buy a "moral" car (fewer liabilities).

Though I don't think that's what happened here. Although the view is imperfect, the road looks empty enough that it seems unlikely that neither steering nor braking was an option. I can think of several possibilities, but all would be speculation.
 
...the view is imperfect, the road looks empty enough that it seems unlikely that neither steering nor braking was an option...
I just saw the video finally. Either the headlights on that thing are terrible or the camera (which I assume is part of its self-driving tech) is terrible. I had thought, from the description, that the pedestrian came from the curb, not the center of the road. If whatever road-sensing tech it has couldn't "see" something in motion in the roadway, albeit in the other lane or the center of the road, that's scary. If it could "see" her on another sensor and didn't anticipate or account for the fact that she was in motion, that's just as scary. At 40 MPH, if I had less than 1 second to recognize and react to an obstacle in the road, I would hit it too. Given where she came from, she "should" have been visible sooner than that.
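Just to put rough numbers on that (a back-of-the-envelope calculation; the 1.5 second reaction time and 0.7 g braking figures are generic textbook assumptions, nothing specific to this car):

```python
# Back-of-the-envelope stopping distance at 40 mph. The 1.5 s reaction time and
# 0.7 g braking figures are generic assumptions, not data about this vehicle.

MPH_TO_MS = 0.44704               # metres per second in one mph
G = 9.81                          # m/s^2

speed = 40 * MPH_TO_MS            # ~17.9 m/s
reaction_time = 1.5               # s, typical human perception-reaction time
decel = 0.7 * G                   # m/s^2, firm braking on dry asphalt

reaction_dist = speed * reaction_time      # ground covered before braking even starts
braking_dist = speed ** 2 / (2 * decel)    # v^2 / (2a) once the brakes are on

print(f"reacting: {reaction_dist:.0f} m, braking: {braking_dist:.0f} m, "
      f"total: {reaction_dist + braking_dist:.0f} m")
# -> roughly 27 m just reacting, 23 m braking, ~50 m in all
```

At that speed you cover almost 18 m for every second spent not reacting, which is the whole argument for a machine that reacts in milliseconds - provided it actually reacts.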
 
I'll chime in here.
There are technologies that would have enabled her to be seen easily and effectively.
Infrared vision should have been able to see her.
A radar system should have been able to see the bicycle.

I don't know how the car is/was equipped but the technology to see that woman in the dark exists.

You cannot effectively use a single sensing technology like radar, since radar won't see a human.
You cannot simply use a vision system (infrared, lidar, camera) either, since the effective density of an object can't be detected: a bag in the road and a flat boulder look the same, and you had better not run into a big flat boulder, whereas you can run over a bag.

She should not have died.
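To make that concrete, here's a very rough sketch of the kind of cross-checking I mean. Every name and threshold below is invented for illustration; a real perception stack is enormously more complicated:

```python
from dataclasses import dataclass

# Toy illustration of why a single sensor isn't enough: radar answers "is there
# real mass there?", vision answers "is something there and is it moving?".
# The decision needs both. All names and thresholds here are invented.

@dataclass
class Detection:
    distance_m: float        # how far ahead the object is
    lateral_m: float         # offset from the vehicle's path (0 = dead centre)
    moving: bool             # is it in motion?
    radar_return: bool       # solid radar return -> dense object
    camera_or_lidar: bool    # a vision sensor sees a shape of some kind

def is_hazard(d: Detection, lane_half_width_m: float = 2.0) -> bool:
    """Treat anything solid, or anything moving toward/near our path, as a hazard."""
    in_or_near_path = abs(d.lateral_m) < lane_half_width_m or d.moving
    solid = d.radar_return
    visible_and_moving = d.camera_or_lidar and d.moving
    return in_or_near_path and (solid or visible_and_moving)

# A pedestrian walking a bike across the road: visible, moving, solid.
ped = Detection(distance_m=50, lateral_m=5.0, moving=True,
                radar_return=True, camera_or_lidar=True)
print(is_hazard(ped))   # True -> start slowing well before she reaches our lane

# A plastic bag sitting in our lane: the camera sees *something*, radar says no mass.
bag = Detection(distance_m=30, lateral_m=0.0, moving=False,
                radar_return=False, camera_or_lidar=True)
print(is_hazard(bag))   # False -> safe to drive over
```

The point is just the one above: a vision sensor can't tell the bag from a flat boulder, and you wouldn't want to rely on radar alone to spot a person, so the decision has to lean on both.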
 
I'll chime in here.
There are technologies that would have enabled her to be seen easily and effectively.
Infrared vision should have been able to see her.
A radar system should have been able to see the bicycle.

I don't know how the car is/was equipped but the technology to see that woman in the dark exists.

You cannot effectively use a single sensing technology like radar, since radar won't see a human.
You cannot simply use a vision system (infrared, lidar, camera) either, since the effective density of an object can't be detected: a bag in the road and a flat boulder look the same, and you had better not run into a big flat boulder, whereas you can run over a bag.

She should not have died.
I'm not casting judgement, but I tend to agree with you. I have a cheap dashcam, and with its night vision I can see objects on unlit back roads at 40 MPH far more clearly than in that video. I can also see far more with my naked eye than shows up on the camera. There had to be some sensor that should have "seen" her sooner than she appeared on the dash cam.

When I first read that she "appeared just before impact" I thought she had stepped out from behind something off the curb, not that she "came into view" on the dash cam from the center of the road. Any driver or driving system needs to account for anything in the roadway at all times.
 
Now there's a dilemma for the computer. Does it swerve to avoid a person in the road if, in doing so, it would hit a solid obstacle, putting the driver in harm's way?

Classic trolley problem.
A switch operator observes an out-of-control trolley.
The operator can derail the trolley and cause injuries or death to those on board, or can let it continue and injure or kill people further up the tracks.

Who do you allow to die or be injured?
It's a question that autonomous technology will need to answer. Would you buy a car programmed to sacrifice the driver? There are no standards for the software programming and the decisions that must be made. As the technology gets better, you will be able to better calculate limits of traction, maximum turn angle before skidding, etc., and decide which course of action will result in minimal injuries to both the driver and the people around. But there will be an incident where the decision is less clean, and a choice will have to be made on who you allow to die.
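To show where the ethics actually lives in the software, here is a deliberately oversimplified sketch (every maneuver and number below is invented): estimating the harm of each feasible maneuver is the engineering part; the ethical part collapses into a weighting constant.

```python
# Deliberately oversimplified: the "ethics" ends up as a single weighting constant.
# All maneuvers and harm estimates below are invented for illustration.

candidates = {
    # maneuver:              (estimated harm to occupant, estimated harm to others)
    "brake straight":        (0.1, 0.8),
    "swerve into barrier":   (0.7, 0.0),
    "swerve into next lane": (0.2, 0.4),
}

OCCUPANT_WEIGHT = 1.0   # <-- this constant *is* the ethical decision
OTHERS_WEIGHT = 1.0     # equal weighting; a "protect the buyer" car would raise OCCUPANT_WEIGHT

def expected_cost(harm_occupant: float, harm_others: float) -> float:
    return OCCUPANT_WEIGHT * harm_occupant + OTHERS_WEIGHT * harm_others

best = min(candidates, key=lambda m: expected_cost(*candidates[m]))
print(best)   # with equal weights: "swerve into next lane"
```

Crank OCCUPANT_WEIGHT up to 10 and the very same code decides that braking in a straight line, pedestrian and all, is the "best" choice. That one constant is what the courts, lawyers and marketing departments will end up arguing over.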

Autonomous technology can reduce injuries and save some lives. Ultimately you can't save everyone, and the people you do save will be different from those a human driver would have saved. Does that make the decision right or wrong? I don't know; I just pose the question.

I have been doing computer design for a long time and have done design for systems that require fault tolerance and auto safety.
 
One thing I am sure of is that once autonomous cars become mainstream, the single biggest hazard will be human drivers sharing the road with them. Not just because the humans will be less predictable and more error-prone, but because a lot of them will act like assholes, knowing that the autonomous vehicle will always defer to them to avoid an accident. And they will cause accidents by pushing their luck this way.

On the plus side, once autonomous cars become widespread I can see no reason at all not to simply ban anyone who has proven themselves insufficiently responsible and mature from holding a license.
 
Years ago here in Las Vegas there was a huge issue with people jaywalking on Las Vegas Blvd, AKA the world-famous Las Vegas Strip. People were getting pegged left and right, and each time it caused a traffic nightmare. Then several people were killed, including a law enforcement officer from another town. Finally the Clark County government and the City of Las Vegas got together, put up skywalks at all the major intersections so people could cross safely, and put up barriers down the Blvd preventing people from crossing over on the street. I don't know the cost of the project, but I would say easily $30-$50 million was spent. Jaywalking fatalities or incidents on the Strip are almost unheard of these days. Of course you still have idiots who try; 99% of the time they get busted by the cops and end up with a $200 fine for jaywalking.
 
my 2 cents...

computer vs human?
a computer would be safer in the long run... and in general.
you can't predict all situations.. so there will always be a risk.
but with a computer driver.. the risks are far, far less than with a human.

if all cars on the road are computer driven, then accidents will be ALMOST eliminated.
i would predict that most of the remaining accidents will involve pedestrians or a malfunction.

but accidents will still happen. just way, way less often.

in this particular situation.. the one from the article that was posted.
i doubt a human driver would have had a different outcome.
there was NO time to react.

GUESSING..
the woman was walking on the other side of the road with her bike... moving along the road... then she quickly decided to cross.
the car's radar/sensors saw the woman.. moving along the other side of the road.. and evaluated her as NOT AN ISSUE.
but the computer had NO time to physically change the car's motion when she suddenly attempted to cross the road in front of it.
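a rough sketch of what i mean (every number here is made up).. if the prediction is just a straight line 'where will she be when we get to her'.. she can look like NOT AN ISSUE right up until she isn't:

```python
# toy straight-line check: will she be in our lane by the time we reach her?
# every figure is invented.. a real planner models uncertainty, not straight lines.

def looks_like_an_issue(ahead_m, lateral_m, lateral_speed_ms, our_speed_ms,
                        lane_half_width_m=2.0):
    t = ahead_m / our_speed_ms                       # seconds until we reach her position
    lateral_then = lateral_m - lateral_speed_ms * t  # predicted offset from our path at that moment
    return abs(lateral_then) < lane_half_width_m

# woman 40 m ahead, 6 m to the side, pushing her bike across at 1.5 m/s.. car doing ~17.9 m/s (40 mph)
print(looks_like_an_issue(40.0, 6.0, 1.5, 17.9))   # False.. predicted to miss us -> NOT AN ISSUE
# a second later she's 22 m ahead, 4.5 m to the side, and walking a little quicker..
print(looks_like_an_issue(22.0, 4.5, 2.2, 17.9))   # True.. but now there's only ~1.2 s left
```

so the computer isn't necessarily wrong at any single moment... it just has almost nothing left to work with by the time the answer changes.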
 
As has been noted, yes, the lady was jaywalking and "should have known better." But let's not get hung up on that. Pedestrians always have the right of way, even when they are not in a crosswalk. Moreover, people are going to act unpredictably, and human drivers understand that and know to be on the lookout for it. Heck, let's not focus on the lady; let's pretend it was a small child who ran after a ball into the street. Too young to know better, but a person is (hopefully) going to be considering those possibilities. Since a computer really can't (yet) think this way, the sensors and algorithms simply have to be programmed to pick up such things and react immediately.

They could start by applying the brakes to the max.
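For what it's worth, "apply the brakes to the max" is the easy part to write down; everything upstream of it is the hard part. A toy time-to-collision trigger (the threshold and the names are invented for illustration) looks something like this:

```python
# Toy emergency-braking trigger based on time-to-collision (TTC).
# The 1.5 s threshold and these names are invented for illustration only.

def time_to_collision(distance_m: float, closing_speed_ms: float) -> float:
    """Seconds until impact if neither party changes speed."""
    return float("inf") if closing_speed_ms <= 0 else distance_m / closing_speed_ms

def should_emergency_brake(distance_m: float, closing_speed_ms: float,
                           ttc_threshold_s: float = 1.5) -> bool:
    return time_to_collision(distance_m, closing_speed_ms) < ttc_threshold_s

# Pedestrian 25 m ahead, car closing at 17.9 m/s (~40 mph): TTC is about 1.4 s -> brake now.
print(should_emergency_brake(25.0, 17.9))   # True
```

The difficult questions are all about what feeds that distance figure - which is exactly the detection and prediction problem everyone above has been discussing.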

i see what you are trying to point out... but you are ignoring the major fact:
no one can predict what will happen...
shit happens...
people leave the safety of the sidewalk.. for unforeseen reasons.

only a computer can best react to unforeseen situations...
and not over-compensate or over-correct..
make the best evasive maneuver
always on high alert
see things better in all weather
see things in all directions

even with the computer's advantages...
there are still surprise situations that can't be avoided

you take a million bad driving situations.. (this happens every hour of the day)
and put a computer in vs a human...
99.9% of the time.. the computer will have a better outcome!
humans?.. this is why our insurance rates are so high!!!
 
Apparently Uber's prototypes use LIDAR, so the darkness should not have been a problem as that doesn't rely on ambient light.

So the question is whether it is a design flaw (maybe some combination of distance, position and size which the system didn't cover or have a response for), an undetected fault, or something like the old classic of a warning system being turned off while some maintenance or testing is done and not turned back on afterwards (I've known apparatus to be trashed that way, and such errors played a part in Chernobyl, so if I were investigating, this is the first thing I'd check).

There were definitely three factors here anyway: a pedestrian who seemed oblivious to the vehicle, a safety driver who wasn't sufficiently attentive, and a system which failed either to detect her or to respond if it did.
 