GIRTHQUAKE
GIRTHQUAKE HalfDork
12/8/19 6:23 p.m.
ebonyandivory
ebonyandivory PowerDork
12/8/19 6:24 p.m.

Sometimes it's a better choice to collide with a stalled vehicle rather than plow headlong into oncoming traffic. Does a Tesla understand that?

RevRico
RevRico PowerDork
12/8/19 6:27 p.m.

If a new Subaru can track your eyes and alert you when you look away from the road, then Tesla should be able to foresee sleeping/reading/turning around and should have mitigations in place to address that, so driver assist doesn't get used as an autopilot.

Subaru does what now? Ok, never ever buy another subie that isn't an 04 STI. Got it. 
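
(The Subaru feature being described is camera-based driver monitoring. The "mitigation" idea in the quote above boils down to an escalating eyes-off-road timer; here's a minimal Python sketch of that kind of logic, with every threshold and signal name invented for illustration, not taken from any real Subaru or Tesla system:)

# Hypothetical escalation logic for a driver-monitoring camera.
# The eyes-off timings are made up; a real system would tune these carefully.
EYES_OFF_WARN_S = 3.0        # chime after this long looking away
EYES_OFF_DISENGAGE_S = 10.0  # start slowing the car after this long

def monitor_step(eyes_off_duration_s):
    """Return the action to take given how long the driver has looked away."""
    if eyes_off_duration_s >= EYES_OFF_DISENGAGE_S:
        return "disengage_assist_and_slow"
    if eyes_off_duration_s >= EYES_OFF_WARN_S:
        return "audible_warning"
    return "none"

for t in (1.0, 4.5, 12.0):
    print(t, monitor_step(t))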

1988RedT2
1988RedT2 MegaDork
12/8/19 6:49 p.m.
_ said:
No Time said:

I like technology, but I don't want to see tech launched that has known risks that haven't been mitigated. 
 

What, you mean like Windows 10, every Apple product, every video game launch, GoPro, every app released, and every website? 
Sadly it is the new normal to release trash and then fix it on the fly. The world wasn't like that twenty-five years ago. You didn't release your product unless it was perfect, because its flaws would bury your company. 

Actually, Microsoft has been releasing half-baked crap since at least '95, which is almost exactly 25 years ago, so I have to respectfully disagree.

No Time
No Time Dork
12/8/19 9:07 p.m.

In reply to RevRico :

Here's the marketing clip on it:

Subaru distracted driver alert

I'd actually prefer this nanny over the lane assist features that can actively steer the car instead of just providing an alert. 

Brett_Murphy
Brett_Murphy UltimaDork
12/8/19 10:56 p.m.

Self-driving cars are never going to get better unless they're tested in real world conditions. Those real world conditions don't have to, nor do I think they should, happen in live traffic during everyday life. 

Autopilot doesn't seem to be ready for this, either.
 

z31maniac
z31maniac MegaDork
12/9/19 8:28 a.m.
No Time said:

The problem I see is that the Autopilot is software based. Software follows a fixed set of rules and could be expected to have the same behavior every time.

 

If only it actually worked like that: software in general, not just Autopilot. smiley

RX8driver
RX8driver Reader
12/9/19 8:33 a.m.
Keith Tanner said:
RX8driver said:

Part of the problem is marketing. They say "full self driving" and people think that's what it is, instead of an advanced cruise control. The world's best computer is still between your ears.

FYI, full self driving has not been released on Teslas other than in news stories. It's kinda like Al Gore claiming he invented the internet - that was a joke in a Wired article but it somehow became "truth".

 

They market it, on their own website, as "Full Self-Driving Capability".

No Time
No Time Dork
12/9/19 8:58 a.m.

In reply to z31maniac :

The problem is that it does: the same set of inputs produces the same output.

Any illusion of variability is due to variation in the inputs. 
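
(That's the textbook view of deterministic code: a pure function gives the same answer for the same input, and any apparent randomness comes from the inputs, e.g. noisy sensors. A toy Python sketch, with the braking rule and noise model invented purely to make the point:)

import random

def brake_decision(distance_m, speed_mps):
    """Deterministic rule: brake if time-to-contact drops below 2 seconds."""
    return distance_m / max(speed_mps, 0.1) < 2.0

# Same inputs, same output, every single time:
assert brake_decision(30.0, 20.0) == brake_decision(30.0, 20.0)

# "Variability" only shows up when the inputs vary, e.g. a noisy range sensor:
for _ in range(3):
    measured = 40.0 + random.gauss(0, 5)  # hypothetical sensor noise
    print(round(measured, 1), brake_decision(measured, 20.0))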

dculberson
dculberson MegaDork
12/9/19 9:58 a.m.

I think their insistence on cameras only is a mistake. Musk has said "anyone relying on lidar is doomed," but I think to have a truly safer vehicle you need more than just the visual spectrum. Cameras are too easily overwhelmed and provide inconsistent data, no matter their quality. At $9,200 for the option, they sure can afford to put a few more sensors on there.
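
(The redundancy argument can be reduced to a trivial cross-check: if two independent range estimates disagree badly, trust neither and act conservatively. The function and thresholds below are a made-up illustration, not how Tesla or anyone else actually fuses sensors:)

def fused_range(camera_m, radar_m, max_disagreement_m=5.0):
    """Cross-check two independent range estimates (hypothetical thresholds).

    If the estimates disagree by more than the threshold, return the more
    conservative (closer) value and flag low confidence.
    """
    if abs(camera_m - radar_m) > max_disagreement_m:
        return min(camera_m, radar_m), False
    return (camera_m + radar_m) / 2.0, True

print(fused_range(80.0, 78.0))   # estimates agree -> (79.0, True)
print(fused_range(120.0, 35.0))  # washed-out camera vs. a stopped car -> (35.0, False)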

infinitenexus
infinitenexus Reader
12/9/19 11:27 a.m.

I think self-driving cars could work successfully for 5 years straight with zero accidents while human drivers crash all the time, and then once the first person dies in an autopiloted car there are people who would flip out, curse the technology, and talk about how it's the worst thing ever, even when it's statistically safer than people driving. My money is on this guy not having the autopilot on. I wonder if there's a way to check the computer and see?

z31maniac
z31maniac MegaDork
12/9/19 3:06 p.m.
infinitenexus said:

I think self-driving cars could work successfully for 5 years straight with zero accidents while human drivers crash all the time, and then once the first person dies in an autopiloted car there are people who would flip out, curse the technology, and talk about how it's the worst thing ever, even when it's statistically safer than people driving. My money is on this guy not having the autopilot on. I wonder if there's a way to check the computer and see?

You just described nuclear energy. 

eastsideTim
eastsideTim UberDork
12/9/19 3:45 p.m.
lnlogauge said:

In reply to pres589 (djronnebaum) :

Because Tesla has logged millions of miles on Autopilot, and you hear about every single time there's an accident. Even with the accidents, Autopilot is 9 times safer than the average human driver. Also, just because accidents happen doesn't mean the technology should stop. 

I'm on the side that it wasn't actually on. If Teslas weren't able to see stopped vehicles, there would be a lot more headlines.

In reply to RX8driver :

The computer in between your ears makes errors, falls asleep, and sucks at processing more than one thought at a time. Absolutely not. 

The Autopilot safety statistics are not entirely agreed upon. It doesn't help that Autopilot is only supposed to be on under a limited set of circumstances. If it is only running under ideal circumstances, it's a lot easier to claim it's safe than if it has to deal with everything human drivers deal with.

Self-driving tech will continue to improve, but Musk and its other advocates should stop exaggerating its capabilities.
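
(The selection effect is easy to show with made-up numbers: if highway miles are inherently much safer per mile than surface-street miles, a system that only runs on highways looks far better than the "average driver" even if it adds nothing. All figures below are invented to show the arithmetic, not real crash data:)

# Hypothetical crash rates per million miles (invented for illustration).
highway_rate = 0.5   # crashes per million miles on the highway
surface_rate = 4.0   # crashes per million miles on surface streets

# The "average driver" mixes both kinds of driving:
highway_share = 0.4
average_driver_rate = highway_share * highway_rate + (1 - highway_share) * surface_rate

# An assist that only operates on highways and merely matches the human highway rate:
assist_rate = highway_rate

print(f"average driver: {average_driver_rate:.1f} per million miles")       # 2.6
print(f"highway-only assist: {assist_rate:.1f} per million miles")          # 0.5
print(f"apparent safety factor: {average_driver_rate / assist_rate:.1f}x")  # 5.2x, from no real improvement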

 

Kreb
Kreb UberDork
12/9/19 5:23 p.m.

So there are roughly 35,000 to 36,000 auto fatalities per year in the US, and occasionally a Tesla's involved. Yes, the technology isn't entirely mature, and yes, Musk gave it a bad/misleading name. But until someone can prove to me that the technology increases the probability of an accident, these threads will rate a big meh from me.

Ideally the car and the driver will be a team. Most of the time the driver's going to be the better judge of appropriate behavior, and on those occasions that he or she isn't, big brother might just save his ass. Problems usually arise when the human element of the team expects big bro to do more than its share.

Keith Tanner
Keith Tanner MegaDork
12/9/19 5:29 p.m.
infinitenexus said:

I think self-driving cars could work successfully for 5 years straight with zero accidents while human drivers crash all the time, and then once the first person dies in an autopiloted car there are people who would flip out, curse the technology, and talk about how it's the worst thing ever, even when it's statistically safer than people driving. My money is on this guy not having the autopilot on. I wonder if there's a way to check the computer and see?

These things are black boxed out the wazoo. There's 360° video and full datalogging. You can bet that Tesla knows exactly what happened.

Keith Tanner
Keith Tanner MegaDork
12/9/19 5:37 p.m.
RX8driver said:
Keith Tanner said:
RX8driver said:

Part of the problem is marketing. They say "full self driving" and people think that's what it is, instead of an advanced cruise control. The world's best computer is still between your ears.

FYI, full self driving has not been released on Teslas other than in news stories. It's kinda like Al Gore claiming he invented the internet - that was a joke in a Wired article but it somehow became "truth".

 

They market it, on their own website, as "Full Self-Driving Capability".

It appears I was wrong! Thanks. 

Kreb
Kreb UberDork
12/9/19 6:02 p.m.

They're also touting the Summon feature, which has been said to suck. Wow. 

Brett_Murphy
Brett_Murphy UltimaDork
12/9/19 8:00 p.m.
infinitenexus said:

I wonder if there's a way to check the computer and see?


That computer is going to tell investigators everything, including what that guy had for lunch last Thursday.

RX8driver
RX8driver Reader
12/10/19 7:57 a.m.
Kreb said:

So there are roughly 35,000 to 36,000 auto fatalities per year in the US, and occasionally a Tesla's involved. Yes, the technology isn't entirely mature, and yes, Musk gave it a bad/misleading name. But until someone can prove to me that the technology increases the probability of an accident, these threads will rate a big meh from me.

Ideally the car and the driver will be a team. Most of the time the driver's going to be the better judge of appropriate behavior, and on those occasions that he or she isn't, big brother might just save his ass. Problems usually arise when the human element of the team expects big bro to do more than its share.

Yes, except there are many reported examples of it reacting unusually to some fairly typical, non-emergency situations, like mistaking tar lines on the road for lane markings and moving over to stay in the newly perceived lane.

Vigo
Vigo MegaDork
12/10/19 8:29 a.m.

Yes, the average person doesn't secure their dogs. But I'm talking about this person in that Tesla. 

Yeah, I'm just selfishly derailing on a tangent because the actual issue of the thread is pretty blah to me. I kinda feel the same way Kreb does:

Yes, the technology isn't entirely mature, and yes, Musk gave it a bad/misleading name. But until someone can prove to me that the technology increases the probability of an accident, these threads will rate a big meh from me.

But either way we're talking about the decisions the human made about safety, like carrying unsecured passengers and not looking at the road. My personal opinion is that I can probably tell as much or more about a person's grasp of safety by seeing them put a dog into their car for a non-critical reason as I could by the fact that they use Autopilot. A person might even legitimately think Autopilot is going to make them safer. They could be wrong, but it's a LOT more plausible than "I believe it is legit OK to carry passengers that aren't belted." That's just stupid.

 

Another funny thing about the safety of Teslas is that it might be safer to take a 100% chance of getting into a wreck in a Tesla than it is to take a 20% chance of getting into a wreck in some other cars. Teslas are absurdly safe in crashes compared to the 'fleet average' of cars their size. Assuming... you have a belt on....
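
(That comparison is just expected-value arithmetic: overall risk is the chance of a crash times the chance of a bad outcome given a crash. A toy calculation with invented numbers, not real crash statistics:)

# Invented numbers purely to illustrate the expected-value comparison.
p_crash_tesla, p_fatal_given_crash_tesla = 1.00, 0.002
p_crash_other, p_fatal_given_crash_other = 0.20, 0.015

risk_tesla = p_crash_tesla * p_fatal_given_crash_tesla   # 1.00 * 0.002 = 0.0020
risk_other = p_crash_other * p_fatal_given_crash_other   # 0.20 * 0.015 = 0.0030

print("Tesla:", risk_tesla, "other car:", risk_other)
# With these made-up numbers, the certain crash in the much safer car still
# carries the lower overall risk -- assuming, as above, you have a belt on.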

Kreb
Kreb UberDork
12/10/19 8:38 a.m.

That "9 times safer" stat interests me. I'd like to see a graph of what exactly constitutes the "average driver" and what makes up the extremes. I'm a pretty average guy and I've never been in an accident on the freeway. Tesla's 9 times better than me? (That's a joke).

infinitenexus
infinitenexus Reader
12/10/19 10:14 a.m.
Brett_Murphy said:

That computer is going to tell investigators everything, including what that guy had for lunch last Thursday.

"Lieutenant, the computer says this man had a cuban sandwich for lunch and requested no pickles"

"WHAT?  He'll rot for this one!"

Keith Tanner
Keith Tanner MegaDork
12/10/19 10:18 a.m.
Kreb said:

That "9 times safer" stat interests me. I'd like to see a graph of what exactly constitutes the "average driver" and what makes up the extremes. I'm a pretty average guy and I've never been in an accident on the freeway. Tesla's 9 times better than me? (That's a joke).

I would assume it's based on accidents per mile driven, with perhaps some hand-waving to isolate it to conditions where Autopilot would be active (freeways rather than surface streets, for example). 
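
(That's the usual framing: crashes per mile driven, then a ratio against a fleet-average baseline. A sketch of the bookkeeping with placeholder inputs, not Tesla's actual figures:)

def crashes_per_million_miles(crashes, miles):
    """Simple per-mile crash rate, expressed per million miles."""
    return crashes / (miles / 1_000_000)

# Placeholder inputs -- not real data.
autopilot_rate = crashes_per_million_miles(crashes=1, miles=3_000_000)  # ~0.33
average_rate = crashes_per_million_miles(crashes=2, miles=500_000)      # 4.0

print(f"apparent ratio: {average_rate / autopilot_rate:.1f}x")  # how an "N times safer" figure gets formed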
