iceracer
iceracer UltimaDork
3/20/18 5:51 p.m.

Problem is, will the public ever endorse them?

Right now, opinion polls are hugely against them.

STM317
STM317 SuperDork
3/20/18 6:22 p.m.

In reply to Keith Tanner :

They come out when:

1- there's adequate data to show that they're comparable to human drivers at a minimum. Maybe the Feds already have access to all of that data but in my lowly searches, it's much more difficult to compile that data than it was for human safety per mile. If anybody else can dig up comprehensive data regarding safety per mile driven of autonomous tech I'd be very interested in seeing it.

2- there's a federal legal framework for operation, liability, etc that consistently applies to all of the manufacturers. This should also stipulate passing a series of tests that show the vehicle can function as claimed on public roads, similar to crash tests or emissions tests. An agreed upon communication protocol is going to be needed at some point too, and the sooner that's worked out, the better. For what it's worth, many of the car companies are also calling for legislation of autonomous driving tech so that they may plan accordingly and avoid spending frivolously. The current situation is a bit of a wild west free for all.

 

Striving to save lives is great. But until there's adequate data from adequate, impartial testing, how do we know that putting them on the roads isn't increasing risk to human drivers?

Kreb
Kreb UberDork
3/20/18 6:33 p.m.

You simply cannot compare the total miles logged apples to apples. For starters, most "self-driving" cars still have a human there to override as necessary. Secondly, a great many of the "self-driving" miles are being logged in some of the most car-friendly places on earth, like Arizona. Stick those same cars into San Francisco or a Boston winter day and they'd be resolutely berkeleyed.

These cars are inevitable, but I think that we need to be realistic about their current capabilities and not listen to either the hype providers or the chicken littles.

 

Keith Tanner
Keith Tanner MegaDork
3/20/18 6:47 p.m.

We can play the "how do we know?" game forever, because the answer is "we don't until we try". Lab testing will never give realistic deaths-per-mile numbers because it's an artificial environment. Even the current real-world testing of autonomous vehicles happens in carefully chosen scenarios as they build up vehicle capabilities, so it's going to be impossible to properly compare to the Full Random World. Honestly, if we trained humans as carefully as we're training autonomous cars, there would be a weaker case for needing them in the first place! Train them for a few years in a playground, then move to slow surface streets with a low speed limit and a vigilant passenger ready to jump in at any time, then start increasing speeds with the passenger still in place. Oh, and they all get together every night for a full analysis of what each of them learned that day.

Tesla probably has the most interesting data already, even if it does come from a bit of a cowboy approach. They have over 4 billion miles of instrumented drive time in all conditions, which makes for a pretty spectacular data set, plus 100 million miles with Autopilot running. Waymo should be at about 4 million by now. Yes, these are all numbers pulled from press releases, but it's an indication of the number of miles being racked up - and every single fender bender is worldwide news.
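To put those press-release mileage numbers in perspective, here's a quick back-of-envelope sketch. The baseline of 1.18 deaths per 100 million miles is an assumed figure roughly in line with recent US averages, not something quoted in the thread; the fleet mileages are the numbers above:

```python
# How many fatalities would we statistically expect at the human-driver
# rate over each fleet's claimed mileage? If the expected count is near
# zero, the mileage is too small to prove anything about relative safety.
# Baseline rate is an assumed figure (~2016 US average); fleet mileages
# are the press-release numbers quoted above.

HUMAN_DEATHS_PER_MILE = 1.18 / 100_000_000

fleet_miles = {
    "Tesla (all instrumented)": 4_000_000_000,
    "Tesla (Autopilot only)": 100_000_000,
    "Waymo": 4_000_000,
}

for name, miles in fleet_miles.items():
    expected = miles * HUMAN_DEATHS_PER_MILE
    print(f"{name}: ~{expected:.2f} expected deaths at the human rate")
```

On those assumptions, even 100 million Autopilot miles only "expects" about one fatality, so a single incident can swing the apparent rate wildly.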

Automakers test cars with unapproved emissions systems and without full crash safety on public roads all the time. They're a pretty small percentage of the overall fleet and they're instrumented out the wazoo. Sometimes they're being driven by pros, sometimes they're being driven by people whose job is simply to rack up many, many miles in the real world. There's a legal framework to allow this too.

The problem is that humans are just not wired to evaluate risk well. People see one story about someone being killed by an autonomous car that may or may not have been avoidable by a human driver and decide robot cars are too dangerous and not ready - even though an outside party who has seen the video says it couldn't have been prevented by a human. They read the articles about people being killed on the roads and decide it's all because the driver was (insert reason), and of course that would never apply to them because they're far better drivers than average. Spend 10 minutes watching traffic at an intersection and you'll see just how false that is.

Dirtydog
Dirtydog HalfDork
3/20/18 6:57 p.m.

I guess if brainiacs and their computers can launch a car into space, then we mortals are ready for driverless cars. My problem is, I can't afford the chauffeur.

T.J.
T.J. MegaDork
3/20/18 8:27 p.m.
RevRico said:

Autonomous cars still won't fix idiot pedestrians.

 

In this case, it seems an autonomous car did just that. She won't ever step out into traffic again. Sad way to go, though.

Vigo
Vigo UltimaDork
3/20/18 11:50 p.m.

 Autonomous cars have NEVER been the low hanging fruit of automotive safety. The economic framework that makes them viable is symptomatic of many other 'ills' in our society. Safety is not the primary motive here, it is the sales pitch. Safety will be one side effect of 'buying in', but ultimately they are spending so that your buying won't be a choice. Reduction in human autonomy and an increase in socioeconomic stratification are some other side effects (smaller print). Big Brother used to refer to government, but we can all agree that government-run anything is too inefficient, right? In light of modern developments, it's time to update the term Ride or Die!

Uber, you are my Rideshare or Die!  Bro...

T.J.
T.J. MegaDork
3/21/18 12:23 a.m.

I dislike the term "autonomous cars" to describe self-driving cars. The cars may be autonomous, but they remove our ability to be autonomous. They will drive us how they are programmed. When all cars are computer driven, it is easy to envision that they will all travel the same speed, so there's no need for a car that can accelerate or handle better than any other, since the lowest common denominator will control all of them. There will be no car chases because the cars will just pull over when told to by the po po. These things will take away some of our freedom. The technology is cool, though.

Boost_Crazy
Boost_Crazy HalfDork
3/21/18 12:59 a.m.

Like pretty much anything in the news, it's too early and there is too little info released to come to any sort of conclusion. They need to investigate this like any other accident and determine fault, then go from there.

But I did find one bit interesting, if reported accurately...

"The vehicle was traveling northbound ... when a female walking outside of the crosswalk crossed the road from west to east when she was struck by the Uber vehicle," police said in a statement.

That would mean that the pedestrian would have had to cross over the opposing lane of traffic first, and would not have had as much of an opportunity to surprise a driver, much less an all-seeing autonomous car, which didn't even brake. It wouldn't mean that the car is at fault, but it would call into question whether it was working as intended.

T.J.
T.J. MegaDork
3/21/18 7:04 a.m.

In reply to Boost_Crazy :

Do we know that it wasn't a one way street so there were no oncoming lanes? I don't know, but it is possible. 

pinchvalve
pinchvalve MegaDork
3/21/18 8:16 a.m.

The impact was on the right side of the vehicle, indicating that the pedestrian was almost all the way across the road before she was struck.  It looks like there was a bike lane beginning there, perhaps the vehicle mis-identified her as a cyclist traveling  in the bike lane?  But that doesn't explain all of the other systems failing.

My guess is that she walked across all the lanes and was stopped to get onto her bike and begin pedaling down the bike lane.  The Uber came around the bend, identified her as a biker traveling in the lane and decided to go on past.  As she started to pedal, she fell off the bike in front of the car.  

This is yet another grim reminder that bicycles belong in BMX parks, not on the nations highways.   

camaroz1985
camaroz1985 HalfDork
3/21/18 8:19 a.m.

I sense sarcasm from @1988RedT2 now, but as I was reading through this thread I was thinking he was graduating to tin foil hat territory.

We are going to know far more about the circumstances of this accident than we would with any other accident, so we will know the true cause. If this were a human-driven car without all the tech of this one, we would only have the driver's recollection and anything that could be learned from investigating the scene. In this case we have video, telemetry, radar, etc. Also, given the high profile of this case, you can be sure that every byte of information will be evaluated.

To think that autonomous vehicles will eliminate all traffic deaths is unrealistic.  If it is any improvement over human driven vehicles that is a step in the right direction.

Now to give conspiracy theorists another thing to worry about: with all these sensor-equipped cars roaming the streets, there may also be a decrease in other crime. The vehicles will see everything that happens, so they would see a person attacking someone on the sidewalk, running out of a store, etc. Before anyone goes down the "it's a slippery slope to no privacy" route, I would say there are two things to consider: A) if you don't commit the crime you have nothing to worry about, and B) if you are doing something on a sidewalk or city street, your assumption of privacy is already void.

rslifkin
rslifkin SuperDork
3/21/18 8:22 a.m.
pinchvalve said:

This is yet another grim reminder that bicycles belong in BMX parks, not on the nations highways.   

Um... No.  Bicycles have just as much right to the road as cars do.  The issue is the number of cyclists that do downright stupid stuff. 

I take the attitude of: don't ride on roads that are blatantly unsafe for bikes.  Stay out of the way and let cars pass when possible.  For intersections with a turn lane or other times where cars can't safely pass you or would push you into a path you don't want to take, wait until it's safe and then move into the middle of the lane (perfectly legal) and just be a car for a moment until it's safe to move over and allow passing again. 

Cyclists that try to go straight through an intersection from the right edge of the right turn lane are dangerous.  I'd much rather see them move into the straight lane when the turn lane starts (as that puts them in a position where they're going where you'd expect, instead of leaving you wondering or not seeing them until the last second because they didn't turn as expected).  

Armitage
Armitage HalfDork
3/21/18 8:25 a.m.

Some new info: https://www.sfchronicle.com/business/article/Exclusive-Tempe-police-chief-says-early-probe-12765481.php

“The driver said it was like a flash, the person walked out in front of them,” said Sylvia Moir, police chief in Tempe, Ariz., the location for the first pedestrian fatality involving a self-driving car. “His first alert to the collision was the sound of the collision.”

From viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said. The police have not released the videos.

The incident happened within perhaps 100 yards of a crosswalk, Moir said. “It is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available,” she said.

edit: appears this info isn't new after all.

STM317
STM317 SuperDork
3/21/18 8:34 a.m.

In reply to Keith Tanner :

There's a difference between testing a vehicle with unapproved emissions components or crash safety tech vs testing the controls of a 4500lb sled that can crash into things when something goes wrong. A diesel truck testing ECM calibrations isn't an immediate threat to the safety of everyone around if something with the test goes wrong. Those doing the testing are also fully aware of any risks involved, and have presumably agreed to do it. Was the woman killed in this incident fully aware of the risks? Would her behavior have been any different if she'd known it was an autonomous vehicle approaching her?

I understand (and I think most people realize) that there will be tragedies as autonomous vehicles continue to develop. They're going to encounter situations that the vehicles aren't programmed to deal with, and people will be hurt as a result. But deciding when they've developed "enough" to be turned loose, or when they can no longer be developed further without public testing, is tough to determine. Right now, the only ones determining when the tech is "good enough" for public testing are the companies with a huge financial incentive to bring the tech to market, and that gives me pause.

Uber isn't dumping millions into this to rush the tech to market for safety's sake. They're doing it to eliminate their need to pay a meatsack behind the wheel and improve their bottom line. The sooner the tech is functional, the sooner they become a viable company. Google didn't say "Enough is enough. Too many people are dying in cars and we have to do something!" They're doing it because they've pretty much gotten all of the juice that they can from the internet, and they need new markets to dominate. GM's President clearly stated their primary motivation for working on this tech when he said that they see autonomous vehicles as "the biggest business opportunity since the internet."

I guess that's where my biggest beef with the current state of things lies. They're trying to develop a product that people aren't really asking for right now. They'll be marketed for their improved safety, but the real driver is cash. These corporations are in such a tizzy to be the first to market this tech and make untold profits that I worry they might be sacrificing some due diligence in order to throw the product out there and see what happens.

kb58
kb58 SuperDork
3/21/18 8:42 a.m.

Then you won't much like the Netflix documentary "Dirty Money", as it really shows the dirty underbelly of capitalism. The things VW did, just, wow. I agree with your concerns, that the bottom line and shareholders are a very strong bias to bend, break, or just ignore what's right. Unfortunately, that same capitalism is what makes many of our cool products possible.

1988RedT2
1988RedT2 UltimaDork
3/21/18 8:57 a.m.

In reply to STM317 :

Exactly.  The motivation of the companies behind autonomous cars is clear, and I readily accept that.  What riles me is the glassy-eyed technophiles fawning over new tech like a giggly teen fondling a new i-phone, even as the bodies pile up.  We need to march on these companies with torches and pitchforks and demand accountability.  We need to pierce the corporate veil and put some of these CEO's who enjoy playing God behind bars for a long time.

kb58
kb58 SuperDork
3/21/18 9:10 a.m.
1988RedT2 said:

In reply to STM317 :

...We need to march on these companies with torches and pitchforks and demand accountability.  We need to pierce the corporate veil and put some of these CEO's who enjoy playing God behind bars for a long time.

What could these CEOs do to prove that the cars are ready for the street? You can always say that they aren't perfect. As was posted earlier, once the automated systems have fewer accidents than people, people become the greater threat.

Armitage
Armitage HalfDork
3/21/18 9:17 a.m.
1988RedT2 said:

In reply to STM317 :

Exactly.  The motivation of the companies behind autonomous cars is clear, and I readily accept that.  What riles me is the glassy-eyed technophiles fawning over new tech like a giggly teen fondling a new i-phone, even as the bodies pile up.  We need to march on these companies with torches and pitchforks and demand accountability.  We need to pierce the corporate veil and put some of these CEO's who enjoy playing God behind bars for a long time.

That's really not a fair characterization. In the two days we've been discussing this topic, human drivers in this country have killed on average 33 pedestrians and injured roughly 24 times that number. How many of those have made the national news and sparked a debate about human driver safety? Let's not forget that the ultimate goal here is to see this number decrease with self-driving vehicles.

STM317
STM317 SuperDork
3/21/18 9:21 a.m.

In reply to Armitage :

Is that the ultimate goal though?

Adrian_Thompson
Adrian_Thompson MegaDork
3/21/18 9:36 a.m.

First, I'm really sorry for everyone involved in this from the victim and family, to the person in the car and all those involved in developing it. I can't imagine how bad they all feel now, even if there was nothing anyone of them could have done about it.

I was reading every report I could find on this on Monday.  One thing that pissed me off at the time was that every news source (from the reliable to the sensational) originally had images of Uber / Ford Fusions with the headlines.  That pissed me off as the obvious assumption for readers at the time was that a Ford vehicle was involved.  The vast majority of the comments at the time were the predictable ‘robots are evil’ and ‘Charge the CEO”s with vehicular manslaughter’ or ‘you can pry my steering wheel out of my cold dead hands’ type.  I’m glad to see here at least the conversation is a little more rational.

For those fearful of driver-directed cars being banned, I very much doubt we here in the US will have to worry about that in our lifetimes. The single biggest difference between the US and other first-world developed countries (regardless of political affiliation) is the belief in personal freedom. The (overused and frequently wrong) belief that government is the root of all issues, and the fight against regulation, is so strong here that I really don't see any serious attempt to stop people driving themselves getting any real traction this century. I can totally believe major European and Asian cities banning human-directed vehicles within the next 30-50 years, but not here.

One of the biggest advantages of autonomous vehicles compared to human-directed ones is that light conditions, sun in your eyes, etc. shouldn't be an issue, as LIDAR and RADAR use frequencies outside of human vision and are not as reliant on the visible light spectrum. For those who keep bringing up snow: remember, Ford was the first to have an autonomous car driving in snow, over a year ago now.

The fact that the woman crossed/entered the road away from a crossing should also not make any difference. The computers should be locating and tracking all objects at all times. There aren't going to be lines of code that say 'Oh, this person is 10 meters from a crossing so I don't have to worry about them.' This is not a criticism of the car/system/Uber in this case, as no one here knows any details of what exactly happened. If the internal cameras show the human 'driver' was alert, looking forward with his/her hands on the wheel and ready to take over, it's going to be hard to 'blame' the car. Sometimes bad and nasty things happen. Any parent will tell you it's not even safe staying in bed. Many times we heard a thump in the night and went into a young kid's room to find one of them had rolled out of bed. Once our eldest even hit her lip as she fell off the bed and needed stitches. Nowhere is 100% safe.

People also say that a human can anticipate what a pedestrian is going to do from body language. Even so, there is still reaction time, which will always be slower than a computer's. If/when a human makes an emergency maneuver, the vast majority, including above-average drivers, will not be able to take evasive action at the vehicle's limits of adhesion. Most people will probably hit the brakes so hard they instantly go into ABS, at the same time jerking the wheel so far that they exceed the available traction and understeer, or some combination of the above. The other extreme is that they are so scared they badly underutilize the traction available. I have personally witnessed both happen. The computer-guided car should be able to take the car much closer to its limits without exceeding them, reliably. If people doubt that, look at F1 or other top-line series. They all ban ABS, traction control, etc., as it's known that computers will always hold an advantage over a human brain/feedback system when it comes to holding something right on the very edge. Think back to the criticism of the 1992 season (26 years ago, remember), when people said the computers were taking away from the human element, which led to most driver aids being banned.

Finally, once this technology is ready for prime time, it's not going to be practical for private owner-operators, even with the rapidly decreasing costs. The first fully autonomous vehicles are going to be way too expensive for private owners. Think Ferrari/Lambo money, and I'm not talking about their entry-level proletariat models either. The first customers will be fleets like Uber, Lyft, Amazon, FedEx, etc. They can use them 24/7 other than charging, cleaning, and maintenance. The payback for them will be reliability and not paying drivers. Also, they will most likely be level 4 autonomous, not level 5, and will operate in geo-fenced areas like big cities and/or freeway corridors, not outer suburbs or rural areas.
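The fleet payback logic works out even with a big price premium. A rough sketch, where the vehicle premium, driver cost, and service hours are all hypothetical illustration numbers, not sourced figures:

```python
# Entirely hypothetical fleet-economics sketch: how long does an expensive
# level-4 vehicle take to pay for itself by replacing paid drivers on
# near-24/7 duty? All three inputs below are made-up illustration numbers.

VEHICLE_PREMIUM = 250_000       # assumed extra cost over a conventional car
DRIVER_COST_PER_HOUR = 20.0     # assumed fully loaded driver cost
SERVICE_HOURS_PER_DAY = 20      # 24h minus charging/cleaning/maintenance

daily_savings = DRIVER_COST_PER_HOUR * SERVICE_HOURS_PER_DAY
payback_days = VEHICLE_PREMIUM / daily_savings
print(f"Payback in ~{payback_days:.0f} days (~{payback_days / 365:.1f} years)")
```

Even at supercar-level pricing, a fleet running a vehicle nearly around the clock recovers the premium in a couple of years on these assumptions, which a private owner driving an hour a day never could.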

1988RedT2
1988RedT2 UltimaDork
3/21/18 9:38 a.m.

If one driver kills another, we must decide if it was an accident, recklessness, negligence, or an intentional act.  It doesn't so much matter to the victim, but the rest of us should care.  If one particular driver goes around killing people with his car, we would presumably take his license, jail him, and charge him with the crime.

If an autonomous car kills a human, we will presume to call it an accident.  But is it?  If there is a statistical chance of deadly malfunction in the programming, does that qualify as criminal negligence on the part of the programmer, or premeditation?

"Yes, your Honor, I knew that within a million miles or so, my program for a self-driving car would encounter a situation in which a person was killed." 

There is no such thing as good enough.  Comparing the death rate of self-driving cars to that of human drivers is irrelevant.  And for companies to set these killer cars loose on public streets to mow people down during "beta testing" is unconscionable.

GameboyRMH
GameboyRMH MegaDork
3/21/18 9:53 a.m.
1988RedT2 said:

There is no such thing as good enough.  Comparing the death rate of self-driving cars to that of human drivers is irrelevant.  And for companies to set these killer cars loose on public streets to mow people down during "beta testing" is unconscionable.

This is madness. Yes there is such a thing as good enough. We've been through this before with airliners which run on autopilot most of the time these days and have fly-by-wire controls which have contributed to at least one crash that I know of. There are autonomous trains in operation. You could also consider elevators and escalators. Autonomous cars will never stop killing people entirely and that is acceptable.

Remember, autonomous cars have killed 2 people worldwide in total so far. Human-driven cars kill almost 3300 per day.

STM317
STM317 SuperDork
3/21/18 10:04 a.m.

In reply to GameboyRMH :

Not to belabor a point, but looking at total deaths without knowing the rate of incident doesn't really give a clear picture.
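The totals-versus-rates point is easy to illustrate with rough, assumed orders of magnitude. Worldwide human mileage and total autonomous test mileage below are guesses for illustration, not sourced figures:

```python
# Totals vs. rates: 2 autonomous deaths worldwide sounds tiny next to
# ~3,300 human-caused deaths per day, but the exposure differs by orders
# of magnitude. Both mileage figures below are rough assumed orders of
# magnitude, not sourced numbers.

human_deaths_per_year = 3_300 * 365
human_miles_per_year = 10_000_000_000_000    # ~10 trillion miles, assumed

av_deaths = 2
av_miles = 10_000_000                         # ~10 million test miles, assumed

PER = 100_000_000  # normalize to deaths per 100 million miles
human_rate = human_deaths_per_year / human_miles_per_year * PER
av_rate = av_deaths / av_miles * PER

print(f"Human drivers: ~{human_rate:.1f} deaths per 100M miles")
print(f"Autonomous:    ~{av_rate:.1f} deaths per 100M miles")
```

Under those (very rough) assumptions, the per-mile autonomous rate actually comes out higher, which is exactly why raw totals alone can't settle the argument either way.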
