While it may not come to driving down the shoulder with a triangle on the back, eventually some roads will be restricted to autonomous vehicles only. It'll take a long time, if only because of the slow turnover of the national fleet: the average age of a car in the US is 11.5 years.
Fun fact: in 1977, the average age of cars in the US was 5.5 years.
In reply to Joe Gearin:
Joe
Like it or not, GRM is part of a useful lobbying segment. You should take advantage of that.
Get involved.
I personally find it hilarious that everybody thinks we'll have fully autonomous superhighways within mere years when I can be standing under a cell phone tower and lose signal. Or have my wifi router go out because it's raining. Or have the servers at work crash because of a voltage blip. Also, something like 50% of the US still doesn't even have cell signal, let alone constant 4G everywhere. And things like buildings, clouds, mist, fog, and even dirt on the lenses block signals and sensors. It's just hilarious to me!
Not all electronics are bargain-bin Chinese consumer-grade stuff...yours in particular seem to be especially bad, I've never had any of that stuff happen!
Autonomous cars don't need to use wireless network signals, not for now, maybe not ever. Lenses just need wiping mechanisms, either wipers or the rotating-cylinder system you see on F1 onboard cameras.
Lof8
HalfDork
7/1/16 12:00 p.m.
GameboyRMH wrote:
Not all electronics are bargain-bin Chinese consumer-grade stuff...yours in particular seem to be especially bad, I've never had any of that stuff happen!
Autonomous cars don't need to use wireless network signals, not for now, maybe not ever. Lenses just need wiping mechanisms, either wipers or the rotating-cylinder system you see on F1 onboard cameras.
The more mechanisms, the more opportunity for one to fail.
Lof8 wrote:
The more mechanisms, the more opportunity for one to fail.
OR- the more mechanisms, the more redundancy and information to reduce risk. Either way..
Lof8 wrote:
The more mechanisms, the more opportunity for one to fail.
But they also get more reliable over time, especially electronics. Autonomous cars really only need a few more components than manually driven cars: LIDAR and optical sensors on the outside; one more computer (modern cars already have dozens); maybe a touchscreen, on the off chance a modern car doesn't already have one; and a steering actuator, if the car doesn't have direct-drive EPS. Brakes are controlled through the ABS system.
Turns out it's not just the truck driver who said the Tesla driver was watching a movie:
http://www.reuters.com/article/us-tesla-autopilot-idUSKCN0ZH4VO
Harry Potter? What an embarrassing movie to be watching when killed...
Wall-e
MegaDork
7/1/16 1:50 p.m.
In reply to GameboyRMH:
Maybe the Teslas are so advanced that they switch off your porn and go to a safe default movie to save the family more grief.
SVreX
MegaDork
7/1/16 1:50 p.m.
So, Tesla gives owners a feature that reduces their attentiveness by design, but tells drivers to pay better attention when using it?
Wall-e
MegaDork
7/1/16 1:54 p.m.
Apexcarver wrote:
Driver was trusting a public "beta" that cautions you to have your hands on the wheel and be ready to take over at any time...
You aren't surprised someone ignored a warning are you? The only ones people pay the slightest attention to are wet paint and dry clean only.
Javelin wrote:
In reply to Joe Gearin:
Have you not seen the anime eX-Driver?
I have not. I'm an old guy----not really into cartoons unless they include Foghorn Leghorn or Wile E. Coyote.
SVreX wrote:
So, Tesla gives owners a feature that reduces their attentiveness by design, but tells drivers to pay better attention when using it?
It makes sense if you look at it not as Tesla offering a feature, but as Tesla getting a bunch of volunteer beta testers for their autonomous driving AI...they just found a bug.
Joe Gearin wrote:
Javelin wrote:
In reply to Joe Gearin:
Have you not seen the anime eX-Driver?
I have not. I'm an old guy----not really into cartoons unless they include Foghorn Leghorn or Wile E. Coyote.
As someone who's about your age - I'd recommend it. If only because the Stratos, the Elan and the Seven are drawn with love and affection.
I had a similar collision years ago. A tractor trailer turned left in front of me on a divided highway. Too close to stop, ditches on both sides of the road (the driver was leaving the roadway on purpose to enter a field- no intersection). The trailer was blocking the road. In a split second, I decided to steer for the trailer wheels rather than go under, and it saved my life. To do that I had to reduce braking pressure and position my car to the left, on a left-hand curve, in the most panic of panic stops imaginable. No ABS, so just locking them up would have meant sliding under the trailer. I had to just know what to do and do it, and I owe that to my track experience.
I think a good driver would do much better than an autonomous car in an emergency situation, and would likely have a better chance of avoiding the emergency in the first place. I think the problem is that so few people actually know how to drive a car. They know how to operate a car, not how to drive. They don't understand the physics involved, or how a car reacts to inputs at its limits. They don't look far enough ahead. It's too late to deal with where your car is right now; you should be thinking about where it is going to be. Most drivers are reactive.

That's the problem with autonomous cars: they are also reactive, just better at it than your average driver. But a good driver is proactive, and I don't see autonomous cars being able to match that. If you really wanted to make the roads safer, take the money for autonomous cars and put it toward real driver training and a stronger licensing system. I think most drivers are capable of becoming good drivers. Then there's the small percentage that simply should not be driving, and for some reason our society knows that and is okay with it.
GameboyRMH wrote:
SVreX wrote:
So, Tesla gives owners a feature that reduces their attentiveness by design, but tells drivers to pay better attention when using it?
It makes sense if you look at it not as Tesla offering a feature, but as Tesla getting a bunch of volunteer beta testers for their autonomous driving AI...they just found a bug.
At the expense of the general driving public. If it couldn't detect a semi truck, I seriously doubt it would detect me on my bike, or my wife and kid in her car. I'd say Tesla is VERY lucky that the operator was the only casualty. If this car had broadsided a bus full of people, they would have a very hard time trying to lay the blame on the operator.
I don't see why autonomous cars can't be as proactive as meat. Since they can pay attention to more than one thing at a time, they could potentially be even more proactive.
Interesting article on Jalopnik about a possible blind spot in the Tesla's sensors that makes it less likely to pick up a tractor trailer straddling the road.
http://jalopnik.com/does-teslas-semi-autonomous-driving-system-suffer-from-1782935594
Nick (picaso) Comstock wrote:
GameboyRMH wrote:
It makes sense if you look at it not as Tesla offering a feature, but as Tesla getting a bunch of volunteer beta testers for their autonomous driving AI...they just found a bug.
At the expense of the general driving public. If it couldn't detect a semi truck I seriously doubt it would detect me on my bike, or my wife and kid in her car.
It would detect all of those things because they don't have 4ft+ tall gaps between the vehicle and the ground and would thus fall into the range of the nose radar, unlike the middle of a semi trailer.
The Tesla's radar saw the gap in front of the trailer's rear wheels as an open space, and then the optical sensor saw the trailer itself as an open space because it blended into the sky. This can only be solved with RADAR or LIDAR scanners that can scan ahead to the full height of the car, or possibly with an infrared/UV-spectrum camera for the optical sensor that can tell the sky from similarly colored objects.
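To put the failure mode in concrete terms, here's a toy sketch (my own invented illustration, not Tesla's actual fusion logic) of why the usual fix - changing how radar and camera votes are combined - doesn't help here:

```python
# Toy illustration of sensor-fusion policy. NOT any manufacturer's real
# algorithm; the two-boolean model is a deliberate oversimplification.

def obstacle_detected(radar_hit: bool, camera_hit: bool, require_both: bool) -> bool:
    """Return True if the fused system should treat the path as blocked.

    require_both=True models a conservative "AND" fusion: it suppresses
    false positives (overhead signs, bridges) but misses anything either
    sensor fails on. require_both=False models "OR" fusion: brake if
    either sensor flags something, at the cost of phantom braking.
    """
    if require_both:
        return radar_hit and camera_hit
    return radar_hit or camera_hit

# The crash scenario described above: the radar sees open space under
# the high trailer, and the camera washes the trailer out against a
# bright sky. Both sensors miss, so NO fusion policy saves you:
radar_hit, camera_hit = False, False
and_fusion = obstacle_detected(radar_hit, camera_hit, require_both=True)
or_fusion = obstacle_detected(radar_hit, camera_hit, require_both=False)
# Both come back False - the fix has to be a sensor that can actually
# see the trailer (full-height lidar/radar coverage), not a different
# voting rule between sensors that can't.
```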
Keith Tanner wrote:
Interesting article on Jalopnik about a possible blind spot in the Tesla's sensors that makes it less likely to pick up a tractor trailer straddling the road.
http://jalopnik.com/does-teslas-semi-autonomous-driving-system-suffer-from-1782935594
Huh, this suggests that the color of the trailer was unimportant, as the car does indeed behave as if its height isn't much more than that of the car's hood...that's a huge and basic design flaw, and it's hard to believe Tesla made a mistake that big.
Keith Tanner wrote:
As with every accident, hero drivers will always claim they could do better. This will not change, the 98% of drivers who are better than 98% of other drivers will always say they could have avoided the accident. The goal of autonomous cars (and driving aids) is to improve the odds overall. Sure, it'll be possible to second-guess cherry-picked accidents but overall, I'll take my chances with a well-programmed system that's always paying attention over someone who is lurching from accident to accident while drunk and dicking with their phone.
Sounds like the driver of this car liked to test the limits of his system, and has posted a video to YouTube of his car (Tessy) saving his life once before when a truck cut him off. "Autopilot" was an unfortunate name for the Tesla system, as it means different things to the general public (the car, it does everything!) and pilots (cruise control for the sky). The owners interpret it as full autonomy.
I couldn't agree more. Most who read this think they could have or would have done better than a very well programmed system. I'm not saying Tesla's system is foolproof. I have no experience with it. But I agree that a truly well designed system will do better at monitoring the roadway than most dweeb drivers on the road today. Just like ABS, airbags, pre-tensioning seatbelts, stability control, etc... I'm all for a driver aid that can help a driver be safe on the roadway. But too many people think of these aids as Jetsons-style autonomy (the Jetsons' car was autonomous, wasn't it?). I enjoy driving like the rest of you, and would never want a car that takes it away from me. But a car that can help me avoid an accident, or survive one, is OK in my book.
I can also tell you that the number of accidents I see on a weekly basis from drivers turning left in front of someone else is huge. Second only to rear end accidents. So many people don't seem to pay attention when making a left, and also people seem to put blinders on when going through an intersection on a green light.
In reply to Keith Tanner:
I don't see why autonomous cars can't be as proactive as meat. Since they can pay attention to more than one thing at a time, they could potentially be even more proactive.
Interesting article on Jalopnik about a possible blind spot in the Tesla's sensors that makes it less likely to pick up a tractor trailer straddling the road.
http://jalopnik.com/does-teslas-semi-autonomous-driving-system-suffer-from-1782935594
What you are describing is artificial intelligence, which is nowhere near what automated cars are capable of. They are just input-output: if your sensors detect this, then do this. Something has to happen first for them to react to it. A good driver monitors their surroundings for potential problems: the inattentive or aggressive driver that would raise a flag to a good driver; the car driving down the freeway on a flat tire that you know is going to blow; the kid new to riding his bike who is wobbly and too close to the street; the dog that just escaped its owner and is likely to dart into the street. There are clues everywhere that something bad is likely to happen. Until computers can recognize those clues and not just react after the fact, they will never be as good as people.
Ah, but autonomous cars can identify all those things - every other moving vehicle and pedestrian in the vicinity. Their velocities can be monitored and trajectories calculated. If one starts to move in the wrong direction, the car can react. That's really all people do. People can't monitor all of them simultaneously, though, and may go through a moment of denial ("that guy can't possibly be turning!") that the computer won't. People can also be distracted, even if it's just a matter of adjusting the volume on the stereo or seeing a billboard. Computers can't.
It's the 98% of drivers that consider themselves better than 98% of all other drivers that will have trouble accepting this. But just like ABS can modulate brakes better than any driver because it can modulate individual wheels, even the best driver can't pay full attention to everything all the time. It's not physically possible. You have to meter out your attention between what you have calculated to be the biggest threats. If one of the other things in your vicinity turns out to be a bigger threat than you expected, you're taken by surprise.
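The "monitor velocities, calculate trajectories" part is simple enough to sketch. This is a toy constant-velocity closest-approach check - my own illustration, with invented numbers, not anything from a production system:

```python
# Toy trajectory check: given another object's position and velocity
# RELATIVE to our car, when are we nearest, and how near? A real system
# would run something like this for every tracked object, every frame.

def time_to_closest_approach(p_rel, v_rel):
    """Time (s) at which two constant-velocity objects are nearest.
    p_rel, v_rel: (x, y) relative position (m) and velocity (m/s)."""
    vv = v_rel[0] ** 2 + v_rel[1] ** 2
    if vv == 0:
        return 0.0  # no relative motion: they're as close as they'll get
    t = -(p_rel[0] * v_rel[0] + p_rel[1] * v_rel[1]) / vv
    return max(t, 0.0)  # closest approach already passed -> diverging

def min_distance(p_rel, v_rel):
    """Closest distance (m) the two objects will reach."""
    t = time_to_closest_approach(p_rel, v_rel)
    dx = p_rel[0] + v_rel[0] * t
    dy = p_rel[1] + v_rel[1] * t
    return (dx * dx + dy * dy) ** 0.5

# A car 40 m ahead and 3.5 m to the left, closing at 25 m/s and drifting
# 2 m/s toward our lane - the "that guy can't possibly be turning!" case.
# The computer just runs the numbers: minimum separation comes out well
# under a meter, so it's flagged as a threat with over a second to react.
threat = min_distance((40.0, 3.5), (-25.0, -2.0))
```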
In reply to Keith Tanner:
But will it see the three year old bouncing the ball on the sidewalk and slow before he darts into the street, or will it simply react to him after he darts? I have a three year old, and his trajectory can be very hard to calculate.
Quite possibly. That's what Google's been doing for the last few years, training their cars in urban environments. It's quite possible that a small human is considered a more random thing than a big human.
It's easy to play the "what if" game. People are expecting 100% infallibility out of these things. But really, in order to be an improvement to overall safety, they only need to be an improvement over the average driver at any given time. There's a good chance they'll go far beyond that before being deployed.
In reply to Keith Tanner:
But really, in order to be an improvement to overall safety, they only need to be an improvement over the average driver at any given time.
Well, that's a pretty low bar.
In all honesty, I've got very mixed emotions on the subject. I love driving. But I drive 40,000 miles per year, and I see a lot of stupidity on the road. In my opinion, the better cars get, the worse the drivers get. Remember when insurance companies discovered that ABS resulted in lots of drivers avoiding small accidents and steering into bigger ones? In my world, Takata airbags are a step forward- hit something and a claymore goes off on your steering wheel. Still want to tailgate?