STM317
STM317 SuperDork
3/22/18 4:56 a.m.
pinchvalve said:

If you haven't seen the video, the Uber driver was paying attention to her cellphone and not watching the road. That said, I can't see how it would have made a difference. The victim wore dark clothes, was crossing in the middle of an active lane of traffic, nowhere near a crossing or intersection, and she was in the pitch black between pools of light created by streetlights. Nothing reflected from her or her bike either. She just appeared out of nowhere, no time to stop or swerve.

Odd that the autonomous systems didn’t see her, Cadillac’s night vision would have seen her clearly.  Maybe the Uber LIDAR can’t see at night?

The backup driver was clearly distracted by something, but Uber's test drivers reportedly keep track of certain data on laptops inside the vehicles, so it may have been that she was doing part of her job, rather than looking at her phone.  (The fact that her job might require her to be distracted behind the wheel is another issue.) The number of things that went wrong here is high, but I'm not sure that situations like this would be that uncommon in the real world. This article gives a decent recap.

-Local government failed. The victim crossed outside of a crosswalk, in an area marked with signs telling people not to cross there (which suggests crossing there is common, or there'd be no need for signs). But if you don't want people crossing there, why leave a brick path in the median that invites them to, with no wall or hedgerow to keep them off it?

-The vehicle failed. The whole point of having RADAR and LIDAR on these things is to be able to detect objects in adverse conditions before a human might be able to see them. This woman with her bike was roughly the size of a couch, moving perpendicular to the flow of traffic, and had walked across multiple lanes by the time the collision occurred. And the vehicle's system failed to see this large object and react in any way before the collision. Are we sure this tech is mature enough to be used on public roads?

-The backup driver failed. She was obviously distracted, but seeing the video it's pretty obvious that there wouldn't have been much that she could've done to prevent the collision.

-The victim failed to heed warnings and cross at the appropriate location, or look/listen up the road to see if a vehicle might be coming.

It's just sad. A proponent of autonomous tech can point to this and say "Look how many things had to go wrong for this to occur!" and be somewhat justified. A detractor of autonomous tech could point to this and say "Yeah, but how often does stuff not go according to plan in the real world?" and be somewhat justified.

oldopelguy
oldopelguy UltraDork
3/22/18 6:27 a.m.

So I don't know about the Volvo or Uber tech on this particular vehicle, but don't the sensors on most of the autonomous vehicles "see" into the infrared some?

There is probably a market for a bicycle or jogging light specifically designed to be obnoxiously detectable by these cars without being crazy annoying to people around you, kinda like the strobes the military can use to track a particular vehicle through night vision goggles. With LED costs coming down every day, a tailored frequency light that would make a jogger seem more obvious to the AI of a car probably wouldn't even be expensive.

Edit: I wonder if a person could just run a "don't hit me" app on their phone that would put their GPS information out in a way that autonomous cars could access? Basically put yourself out there the same way one car talks to another. I wonder if that car-to-car communication has a standard yet, so an Uber car can talk with a Tesla?
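For what it's worth, vehicle-to-pedestrian messaging does have a standards track: SAE J2735 defines a Personal Safety Message for roughly this purpose. Here's a hypothetical sketch of the beacon payload such an app might broadcast, written as JSON rather than the standard's actual binary encoding; the field names and coordinates are made up for illustration:

```python
import json
import time

def make_safety_message(lat, lon, speed_mps, heading_deg):
    """Build a toy 'don't hit me' beacon payload.

    Real V2X systems (e.g. the SAE J2735 Personal Safety Message)
    use a binary encoding over DSRC/C-V2X radio, not JSON, but the
    payload carries the same basic fields.
    """
    return json.dumps({
        "type": "pedestrian_beacon",
        "timestamp": time.time(),   # when the GPS fix was taken
        "lat": lat,                 # WGS84 position from phone GPS
        "lon": lon,
        "speed_mps": speed_mps,     # walking/jogging speed
        "heading_deg": heading_deg, # direction of travel
    })

# A phone app would broadcast something like this several times per
# second; an approaching AV would fuse it with its own sensor picture.
msg = make_safety_message(33.4356, -111.9430, 1.4, 90.0)
```

The hard parts aren't the payload, of course: it's the radio link, latency, and trusting position fixes from consumer phone GPS.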

Adrian_Thompson
Adrian_Thompson MegaDork
3/22/18 6:53 a.m.

OK, I have been, and continue to be, a proponent of autonomous cars. But unfortunately, after seeing that video, in my (personal) opinion the car failed. The huge advantage of these cars is that lack of visible light shouldn't be an issue. RADAR and LIDAR don't use the visible spectrum, so the car should (again, in my opinion) have seen the person. Jaywalking should not be an issue for autonomous cars. Once away from city centers, official crossings are less frequent and people can legally cross anywhere. The fact that an attentive human may not have been able to avoid the situation doesn't excuse the car. Sad, very sad.

 

Armitage
Armitage HalfDork
3/22/18 7:03 a.m.

I am in agreement here. Regardless of the pedestrian's poor decisions, the car should have seen them even if human eyes wouldn't have been able to. I don't know how the Uber hardware compares to other competing products, but perhaps they need to add additional sensors. I'm still of the opinion that when fully mature, AVs will be a boon to mankind, but it's clear we're just not there yet. All progress comes at a cost, be it things we now take for granted, like trans-Atlantic flight, or things we aspire to, like sending humans to Mars. Just because they're difficult problems to solve doesn't mean we shouldn't still try to solve them.

alfadriver
alfadriver MegaDork
3/22/18 7:09 a.m.
Adrian_Thompson said:

OK, I have been, and continue to be, a proponent of autonomous cars. But unfortunately, after seeing that video, in my (personal) opinion the car failed. The huge advantage of these cars is that lack of visible light shouldn't be an issue. RADAR and LIDAR don't use the visible spectrum, so the car should (again, in my opinion) have seen the person. Jaywalking should not be an issue for autonomous cars. Once away from city centers, official crossings are less frequent and people can legally cross anywhere. The fact that an attentive human may not have been able to avoid the situation doesn't excuse the car. Sad, very sad.

 

So how much safer does the autonomous vehicle have to be so that corporations and industries are willing to take up the responsibility of these accidents? 

Given that right now the responsibility clearly rests on the shoulders of the driver, AND that the system should have the ability to detect objects outside of human perception (which this one didn't do)?

If a company is telling the owner of the car that they don't need to pay attention, then the industry is accepting responsibility- which is a rather enormous price.

Adrian_Thompson
Adrian_Thompson MegaDork
3/22/18 7:16 a.m.
alfadriver said:

So how much safer does the autonomous vehicle have to be so that corporations and industries are willing to take up the responsibility of these accidents? 

 

I have no idea! But that's why I'm not an executive at either a tech firm or an auto OEM!!

alfadriver
alfadriver MegaDork
3/22/18 7:44 a.m.
Adrian_Thompson said:
alfadriver said:

So how much safer does the autonomous vehicle have to be so that corporations and industries are willing to take up the responsibility of these accidents? 

 

I have no idea! But that's why I'm not an executive at either a tech firm or an auto OEM!!

I don't think the boards of directors of these companies are that risk-accepting. Right now, the shareholders are going to go gaga for the startups due to the "potential" - but once one of them faces a rather harsh lawsuit, things will change fast. Potential becomes responsibility, and the sexy stocks don't look so great.

Adrian_Thompson
Adrian_Thompson MegaDork
3/22/18 7:49 a.m.
alfadriver said:

I don't think the boards of directors of these companies are that risk-accepting. Right now, the shareholders are going to go gaga for the startups due to the "potential" - but once one of them faces a rather harsh lawsuit, things will change fast. Potential becomes responsibility, and the sexy stocks don't look so great.

But then we have everyone's favorite real-life Bond villain, Elon Musk. Tesla seems to play fast and loose with liability far more than most companies.

STM317
STM317 SuperDork
3/22/18 8:10 a.m.
alfadriver said:

I don't think the boards of directors of these companies are that risk-accepting. Right now, the shareholders are going to go gaga for the startups due to the "potential" - but once one of them faces a rather harsh lawsuit, things will change fast. Potential becomes responsibility, and the sexy stocks don't look so great.

Back in 2015, Volvo's CEO stated that they'd take responsibility if one of their vehicles caused an accident while in autonomous mode. This was a Volvo, but was it their autonomous tech or Uber's? Legally speaking, an argument could be made that the vehicle didn't cause this accident, but it certainly did nothing to prevent an imminent collision either.

Also, in a very timely coincidence, a couple of weeks ago Arizona's Governor issued an Executive Order to update state law so that a corporation operating autonomous tech would be considered responsible in the event that one of their vehicles negligently killed someone. Lawyers get to have fun with that specific wording now.

GameboyRMH
GameboyRMH MegaDork
3/22/18 8:19 a.m.

After seeing the video, yeah, the car clearly failed to detect the woman in the ~2 seconds she was visible before impact.

I'd bet most human drivers would've hit her as well, though not as hard - most would've at least touched the brakes in that time. A really good driver might've been able to avoid her, or at least limit themselves to clipping the rear of her bicycle.
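As a back-of-envelope check of that "not as hard" intuition, every input here is an assumption: ~2 s of visibility inferred from the video, ~38 mph, a 1.5 s perception-reaction time, and 0.7 g of braking:

```python
def impact_speed_mph(initial_mph, visible_s, reaction_s, decel_g=0.7):
    """Rough speed at impact if the driver starts braking after
    reaction_s, and the collision happens visible_s after the
    pedestrian first becomes visible (ignores any swerving)."""
    mps = initial_mph * 0.44704          # mph -> m/s
    brake_time = max(0.0, visible_s - reaction_s)
    decel = decel_g * 9.81               # m/s^2
    v = max(0.0, mps - decel * brake_time)
    return v / 0.44704                   # back to mph

# ~2 s of visibility minus a 1.5 s reaction leaves only ~0.5 s of
# braking, which trims a 38 mph impact to roughly 30 mph.
print(round(impact_speed_mph(38, 2.0, 1.5), 1))  # -> 30.3
```

So even a fully alert human probably still hits her, just somewhat slower, which matches what most posters are saying.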

klb67
klb67 Reader
3/22/18 8:26 a.m.

I'm glad they released this video. I had ideas about what happened based on the description, but the video both confirms indisputable facts and confirms what we still don't know. I'm not going to blame the driver yet, as they may have been monitoring data on the car as they were instructed to do. Or they may have been checking Facebook. I'm sure Uber will find out.

I agree with Keith - based on the lighting, I can't see how a human driver could have reacted differently - the bike and person just appear.

I also would expect that the Uber car should have "seen" the bike and person.  I had assumed they just stepped out into traffic based on the earlier reports, but the video confirms they were out in the open and darkness was the only reason a human wouldn't see them.  I'd like to hear Uber's conclusions after the investigation is completed but I'm not expecting that to be public.  

And obviously the pedestrian either didn't pay attention, or wrongly assumed that the oncoming vehicle would see them and stop.

kb58
kb58 SuperDork
3/22/18 8:37 a.m.
STM317 said:
Are we sure this tech is mature enough to be used on public roads?

 

If only perfect systems are allowed on public streets, why do you have a driver's license?

Type Q
Type Q SuperDork
3/22/18 8:37 a.m.
Adrian_Thompson said:
alfadriver said:

I don't think the board of directors for these companies are that risk accepting.  Right now, the shareholders are going to go gaga for the start ups due to the "potential"- but once one of them face a rather harsh lawsuit, things will change fast.  Potential becomes responsibility, and the sexy stocks don't look so great.

But then we have everyone favorite real life Bond Villain, Elon Musk.  Tesla seem to play fast and loose with liability far more than most companies.

I would caution people not to judge the technology, and the intentions of all those developing it, by the attitudes and actions of Uber or Tesla. Neither is known for being particularly honorable in the way it conducts business. There is a lot more caution and concern among the lower-profile actors in this space.

Rodan
Rodan HalfDork
3/22/18 8:40 a.m.
klb67 said:

based on the lighting, I can't see how a human driver could have reacted differently - the bike and person just appear.

 

This is the only thing I wonder about... First, video cameras generally do not have the sensitivity that the human eye does in low light. With the existing lighting conditions, an alert human driver may have had more time to react than the video seems to indicate. Second, given the sharp lighting cutoff, it appears the vehicle's low beams were on. An alert human driver may have activated the high beams through this section of road, creating a larger pool of light and increasing the distance at which the pedestrian was illuminated. With the existing evidence we'll probably never know, but I'm not convinced this was completely 'unavoidable'.

The one thing that is clear is that the technology is supposed to be designed to prevent exactly this kind of collision, and it failed.

DWNSHFT
DWNSHFT Dork
3/22/18 8:42 a.m.

I'm not a big fan of quoting "experts," but this has some interesting comments.

 

Experts say self-driving Uber should've spotted pedestrian in deadly crash

kb58
kb58 SuperDork
3/22/18 8:43 a.m.

What I don't get is, if they use LIDAR and radar, what difference does darkness make? LIDAR is range-finding using laser light, and that combined with radar should have seen both her closing speed and heading.

Hmm, I wonder if her black clothing absorbed the laser light such that the sensor couldn't produce a reading? That still leaves the radar, though, and assuming redundant and overlapping systems, it should have been able to handle it, darkness or not. It will be interesting to see how the investigation goes.
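The reflectivity question can be made concrete with a toy link-budget check: a diffuse lidar return scales with the target's reflectivity and falls off with range squared. To be clear, this is an illustrative model only, not any real sensor's signal chain, and the threshold and units are invented:

```python
def lidar_detects(reflectivity, range_m, min_return=1e-9, emitted=1.0):
    """Toy lidar detection check: the returned signal scales with
    target reflectivity and falls off as 1/range^2 after a diffuse
    reflection. Arbitrary units; this only illustrates the
    reflectivity-vs-range trade-off, not a real sensor model."""
    returned = emitted * reflectivity / range_m ** 2
    return returned >= min_return

# Dark clothing (low reflectivity) does shrink detection range,
# but at typical urban distances even a ~10% diffuse target should
# still return usable signal -- darkness alone wouldn't explain it.
```

Under this toy model, `lidar_detects(0.10, 50.0)` comes back True, while only an absurdly absorptive target at long range fails, which is why low reflectivity alone seems like a weak explanation.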

camaroz1985
camaroz1985 HalfDork
3/22/18 8:52 a.m.

The autonomous tech is Uber's, bolted to a Volvo, the same way Waymo bolts theirs to a Chrysler. In a recent Adam Carolla Show podcast where Adam interviews Waymo CEO John Krafcik (oddly timed a few days before this incident; it would have been interesting to see what Adam asked him afterward), they say that they add their tech to the Chrysler, with the conversion completed at Roush, and that they are exploring partnerships with other manufacturers as well.

I did forget to account for Uber's questionable business and ethical practices in the past, and given the video evidence, their tech is most likely not ready for the road, especially not in the hands of this particular driver. LIDAR and radar both work in the dark and should have seen the pedestrian long before the headlights did.

I don't think this calls AV tech into question, but it does bring into question the vetting process for who should be allowed to test on public streets. I don't know what Arizona requires for you to be able to do this (well, I'm sure there is a nice check that has to be written, but other than that I don't know). I'm glad the transportation officials aren't jumping the gun and stopping all testing. Again, I have no evidence of this, but I would have to assume that Waymo and Cruise Automation (now owned by GM) have better tech that is working correctly. If you have any interest in this, watch the Waymo video showing the in-car displays that let passengers see what the car "sees". I don't know if Uber has any such telemetry, but that would go a long way toward proving their tech is ready for the road.

Video from Uber (WARNING, does show the crash, so if you don't want to see that, DON'T WATCH!!)

https://www.youtube.com/watch?v=sFrZLVjueXo

Video showing Waymo's tech: 360-degree video of what the car sees, plus what the occupants see.

https://www.youtube.com/watch?v=B8R148hFxPw

Yes, some of the video is a combination of real video and CG images. That is explained here.

https://medium.com/waymo/recreating-the-self-driving-experience-the-making-of-the-waymo-360-video-37a80466af49

And another that shows how the sensor data can be read and displayed.

https://www.youtube.com/watch?v=fbWeKhAPMig

There is a lot of information to read if you are truly interested in this tech. Our "journalists" would do themselves a favor by getting acquainted with it before they report their BREAKING NEWS.

STM317
STM317 SuperDork
3/22/18 8:54 a.m.
kb58 said:
STM317 said:
Are we sure this tech is mature enough to be used on public roads?

 

If only perfect systems are allowed on the street, why do you have a driver's license?

I've never said that I expect it to be perfect to be on public streets. My only expectation is that there should be a reasonable belief that it's safer than human drivers. If it's not, then it should be tested on closed courses rather than public streets until such time as it is safer than a person. At this point, there's insufficient data available to the general public on the number of miles driven autonomously to determine whether they are or are not safer than humans. The companies that will profit from these things have the data, but keep it very secure for "competitive reasons". What we do know is that human drivers die at a rate of about 1.2 per 100 million miles driven. So far, autonomous tech is directly linked to 2 deaths, and they've driven something like 100-120 million miles if the press releases that the companies send out can be believed. If they've actually driven much more than that, then their fatality rate would be better than humans', but if they haven't, then they're actually riskier than human drivers and have no business on public roads AT THIS TIME.
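Using only the figures quoted above (1.2 human deaths per 100 million miles; 2 AV-linked deaths; and an assumed ~110 million autonomous miles, the midpoint of the 100-120 million guess), the comparison works out like this:

```python
def deaths_per_100m_miles(deaths, miles):
    """Fatality rate normalized to deaths per 100 million miles."""
    return deaths / miles * 100e6

human_rate = 1.2                          # figure quoted in the post above
av_rate = deaths_per_100m_miles(2, 110e6) # 2 deaths over an ASSUMED ~110M miles

print(round(av_rate, 2))  # -> 1.82, worse than the human rate
```

On these assumptions the AV rate only drops below the human rate once the fleet has logged roughly 167 million miles with no further deaths, which is the crossover the post is gesturing at. With only two data points, though, the error bars on either conclusion are enormous.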

What now appears evident is that this thing couldn't detect something over 5 feet tall and 5 feet wide, moving directly in its path. The distracted human driver at least had time to react (as evidenced by the "Oh E36 M3" face), but the vehicle did nothing. It actually did worse than the inattentive meatsack behind the wheel. How will it react if a child darts into the street after a ball? What if a vehicle had been stalled in the road rather than a human walking a bicycle? If it can't do better than a human would in the same scenario, then why does it exist?

ztnedman1
ztnedman1 New Reader
3/22/18 8:54 a.m.

The argument about whether a human would or would not have had time to react is TOTALLY irrelevant.

 

This is one of the EXACT scenarios that autonomous cars have been hyped as being needed for. Not only did it fail... it clearly did not recognize the situation, as it did nothing to stop. Even the inattentive "monitor" had time to make the "o E36 M3" face before hitting her... yet the car couldn't react? BS - the car saw everything and just didn't see it as an issue, so it continued on its course.

 

There is no justification here for autonomous fans, period.  This is the whole point of this technology and it failed spectacularly.  This is a black eye and a reality check.

GameboyRMH
GameboyRMH MegaDork
3/22/18 9:02 a.m.
kb58 said:

What I don't get is, if they use LIDAR and radar, what difference does darkness make? LIDAR is range-finding using laser light*, and that combined with radar should have seen both her closing speed and heading. It will be interesting to hear how it all went wrong, as both of the above "should" be better than any human driver.

Yeah, the lidar and radar systems should've seen her earlier than the optical systems, assuming she wasn't obscured by the crest of the hill at that location, which was mentioned earlier. But even the fact that the optical system didn't pick her up is a major failure.

The0retical
The0retical UltraDork
3/22/18 9:20 a.m.

Wow, the video completely contradicts what was initially reported by the sheriff's office. From the initial information, I was under the assumption that the pedestrian was hit on the right side of the car, having come off the curb as the car approached. That clearly wasn't the case.

The optical sensor argument I can buy. The optical camera's ISO may have been adjusting to the street light, which created a severe dark or blind spot in the optical object identification. So I don't buy the "Uber doctored the video" conspiracy theories. The US legalizing automatic high beams would be a step toward fixing this.

(HIGHLY SPECULATIVE)

As for why the LIDAR and radar failed to identify her, it's probably a 2-of-3-agree situation, like navigational sensors on aircraft. One of them may not have been working correctly, or had a flaw in its identification algorithm, and agreed with the optical camera that nothing was there. The system would then have ignored the sensor identifying the woman who was struck. It's unlikely that the car has g sensors to realize it struck something, so the car didn't stop.

(/HIGHLY SPECULATIVE)
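The 2-of-3 idea above can be sketched as a plain majority vote over independent detections. To be clear, this is a generic voting scheme as used in redundant avionics, not Uber's actual fusion logic, which is unknown:

```python
def fused_detection(camera, lidar, radar):
    """Majority vote across three independent 'object present'
    flags: the system acts only if at least two sensors agree.
    (A generic 2-of-3 scheme, not Uber's actual fusion logic.)"""
    return sum([camera, lidar, radar]) >= 2

# The speculated failure mode: lidar flags the pedestrian, but a
# dark-adapted camera and a misbehaving radar both report "clear,"
# so the lone correct sensor is outvoted and nothing happens.
assert fused_detection(camera=False, lidar=True, radar=False) is False
assert fused_detection(camera=False, lidar=True, radar=True) is True
```

The design trade-off is exactly what makes this speculation plausible: voting suppresses false alarms (phantom braking), but it also lets two failed sensors silence one working one.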

I'm interested to see who takes responsibility for this. If they pin it on the driver, there's going to be a huge backlash from the people hired to monitor these things. If they pin it on the company, it'll be a wrist slap and the public will feign outrage until the next newsworthy tragedy.

Armitage
Armitage HalfDork
3/22/18 9:32 a.m.
camaroz1985 said:

I did forget to account for Uber's questionable business and ethical practices in the past, and given the video evidence, their tech is most likely not ready for the road, and especially not at the hands of this particular driver.  LIDAR and radar both work in the dark and should have seen the pedestrian long before the headlights.

Given their history of unethical behavior, I sure hope we don't later find that Uber doctored the video before releasing it to the public. It does seem strange how the woman just materializes out of the darkness. Doing so would justifiably hurt the AV industry more than just admitting to a technical malfunction would.

Kreb
Kreb UberDork
3/22/18 9:33 a.m.

Basically technology failed to save someone from their own stupidity and/or inattention. How is this news?

WilD
WilD Dork
3/22/18 9:49 a.m.
Boost_Crazy wrote:

The car failed. I'll repeat that to let it sink in. The car failed.

I agree with you 100%.  I watched the video and this was a complete failure of Uber's technology.  The machine should have seen the pedestrian and reacted in some way before the impact.  As you say, it does not matter if this was a failure to detect or a failure to act on the detection.  This was a failure of the technology, and I would argue that at least in Uber's case, these cars are not safe for testing in public.

This is an interesting case of engineering ethics. Subjecting the public to your testing, now with fatal results, is not a good look regardless of circumstances, and I would argue it may be unethical at face value. It's also worth noting that the robot was exceeding the legal speed limit. Shouldn't these things be programmed to obey traffic laws? Would this woman possibly be alive today if it had obeyed the maximum speed set for that section of road? Speed limits exist for reasons other than how fast a car can go, and one of those reasons is the safety of other road users. I predict this one is going in the ethics books. I do not like what I see so far...

Keith Tanner
Keith Tanner MegaDork
3/22/18 9:58 a.m.

About the speed - it appears that the speed limit on that road may have recently changed and the GIS data was out of date. The car was going 3 mph over the current limit but 7 mph under the old one.

The overspeed is not a factor in what happened here, so don't get distracted by it. The result would have been the same at 35 mph. 
