Advocate of self-driving car killed because he ignored warnings to take control
#1
Senior Member
Thread Starter
Advocate of self-driving car killed because he ignored warnings to take control
#2
Full Member
Join Date: Sep 2012
Posts: 398
An earlier investigation by the National Highway Traffic Safety Administration found that the crash was not the result of any defect in Tesla's autopilot feature, which can keep a car in a lane and brake to avoid traffic and other obstacles.
Not defective? The autopilot feature is supposed to brake to avoid collision. If the Tesla failed to brake as a very large semi truck turned directly in its path, then the autopilot feature failed to do its job; hence, it was defective.
#3
Senior Member
Join Date: Jun 2002
Location: Montreal, Quebec
Posts: 5,925
This was not a self-driving car. This was a car with driving aids that let the owner have the car help drive for very limited amounts of time. The "driver" chose to stretch those limits beyond what the car was designed to do. What this has to do with the car's ability to be safe around cyclists is hard to understand; it is a very poor example of anything to do with self-driving cars, which this wasn't.
#4
Me duelen las nalgas
Join Date: Aug 2015
Location: Texas
Posts: 13,513
Bikes: Centurion Ironman, Trek 5900, Univega Via Carisma, Globe Carmel
Not a self driving car. Closer to using cruise control inappropriately, such as in heavy traffic.
#5
Senior Member
Thread Starter
An earlier investigation by the National Highway Traffic Safety Administration found that the crash was not the result of any defect in Tesla's autopilot feature, which can keep a car in a lane and brake to avoid traffic and other obstacles.
Not defective? The autopilot feature is supposed to brake to avoid collision. If the Tesla failed to brake as a very large semi truck turned directly in its path, then the autopilot feature failed to do its job; hence, it was defective.
#6
genec
Join Date: Sep 2004
Location: West Coast
Posts: 27,079
Bikes: custom built, sannino, beachbike, giant trance x2
An autopilot in an airplane is one thing, but this is an example of why I don't like self-driving cars. He relied too much on it. It can be improperly programmed, just like an airplane's. Sure, the purpose of self-driving is to reduce the errors made by humans, but as has been debated for some time in the airline industry, there can be too much reliance on the autopilot.
When the fire alarm goes off in your home, do you just stand there and let the flames sweep over you? That is essentially what he did. He ignored the manual, he ignored the warnings and he paid the price.
BTW, the autopilot in an airplane (non military) does the same thing as the autopilot in a boat... it keeps you on the same course and heading... it doesn't look for obstacles, it doesn't avoid collisions, it doesn't see, it doesn't even know the terrain... Yes, it will warn you if radar detects ground or an object... but YOU still have to take command. It really is not an autoPILOT. It is a steering wheel holder... that's about it.
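To make the "steering wheel holder" point concrete, here is a minimal sketch of the heading-hold loop a marine or basic aviation autopilot runs. Everything here (function names, the gain, the rudder limit) is illustrative, not taken from any real autopilot:

```python
# Illustrative sketch (not any real autopilot): a bare-bones heading hold.
# It nudges the rudder toward a target heading and does nothing else --
# no obstacle detection, no terrain awareness, exactly as described above.

def heading_error(target, current):
    """Smallest signed angle in degrees from current heading to target."""
    return (target - current + 180) % 360 - 180

def heading_hold(target, current, gain=0.5):
    """Proportional rudder command, clamped to +/-30 degrees."""
    cmd = gain * heading_error(target, current)
    return max(-30.0, min(30.0, cmd))

# The controller happily steers toward 090 even if land lies dead ahead.
print(heading_hold(90.0, 70.0))  # 10.0 (small correction to starboard)
```

Note what's missing: there is no input for obstacles, terrain, or traffic at all. The controller will hold its course straight into a headland, which is the whole point.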
Until recently using a marine GPS could plow you right into land... the darn things displayed land, but had no "awareness" of it. You could pick a destination and the GPS would route you right into land. Only recently have such marine systems been designed to actually determine if there is land in the way of a boat... and at that, such systems tend to warn, not actually steer.
So like self driving cars, real "autopilots" do not yet exist.
Bottom line... keep your hands on the wheel and your eyes on the road.
#7
Senior Member
Join Date: Nov 2014
Location: Eugene, Oregon, USA
Posts: 27,547
An earlier investigation by the National Highway Traffic Safety Administration found that the crash was not the result of any defect in Tesla's autopilot feature, which can keep a car in a lane and brake to avoid traffic and other obstacles.
Not defective? The autopilot feature is supposed to brake to avoid collision. If the Tesla failed to brake as a very large semi truck turned directly in its path, then the autopilot feature failed to do its job; hence, it was defective.
In this case, the car hit the trailer of a truck that pulled across its lanes of traffic at an intersection.
To be safe in traffic, one has to have a bit of predictive ability. It is not good enough to say that one's lane is clear at the moment. Rather, one needs to predict that it will remain clear when one arrives at a spot in the future, i.e., watch for cross traffic and for vehicles that look like they're going to pull in front of a person.
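That "predict it will remain clear" idea can be sketched as a toy constant-velocity check. The function, numbers, and the 3-second occupancy window below are all assumptions for illustration, not any real system's logic:

```python
# Toy constant-velocity prediction (illustrative only): is the conflict
# point still clear when *we* arrive there, not just clear right now?

def arrival_time(distance_m, speed_ms):
    return distance_m / speed_ms

def conflict_predicted(own_dist_m, own_speed_ms,
                       cross_dist_m, cross_speed_ms,
                       occupancy_s=3.0):
    """True if the crossing vehicle occupies the conflict point during
    our arrival time (assume it needs occupancy_s to clear the spot)."""
    t_own = arrival_time(own_dist_m, own_speed_ms)
    t_cross = arrival_time(cross_dist_m, cross_speed_ms)
    return t_cross <= t_own <= t_cross + occupancy_s

# The lane is clear *now*, but a truck 20 m from the conflict point at
# 5 m/s will be blocking it when we arrive 4 s from now at 25 m/s:
print(conflict_predicted(100, 25, 20, 5))  # True
```

A check that only asks "is the lane clear right now?" would return the wrong answer in exactly this case.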
Bicycles swerving due to obstacles?
But, it is also a sign of why "self driving" will take more time to mature.
#8
Full Member
Join Date: Sep 2012
Posts: 398
That's what I said. I stated in my previous post "a very large semi truck turned directly in its path."
Which was what happened. The truck was making a left turn at an intersection in the path of the Tesla.
Tesla has stated "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."
I will give Tesla credit for admitting their autopilot feature failed to do its job because it could not tell the difference between a white truck and the daytime sky. To me that is a defect. I'm not sure why the NHTSA would say it wasn't a defect when Tesla themselves have essentially admitted it.
#9
Senior Member
Join Date: Jun 2008
Location: Vancouver, BC
Posts: 9,201
I will give Tesla credit for admitting their autopilot feature failed to do its job because it could not tell the difference between a white truck and the daytime sky. To me that is a defect. I'm not sure why the NHTSA would say it wasn't a defect when Tesla themselves have essentially admitted it.
#10
genec
Join Date: Sep 2004
Location: West Coast
Posts: 27,079
Bikes: custom built, sannino, beachbike, giant trance x2
That's what I said. I stated in my previous post "a very large semi truck turned directly in its path."
Which was what happened. The truck was making a left turn at an intersection in the path of the Tesla.
Tesla has stated "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."
I will give Tesla credit for admitting their autopilot feature failed to do its job because it could not tell the difference between a white truck and the daytime sky. To me that is a defect. I'm not sure why the NHTSA would say it wasn't a defect when Tesla themselves have essentially admitted it.
That the human also failed to notice and PREDICT the large truck crossing is indicative of how complicated this stuff can really be. I believe the findings also indicated that the Tesla was driving over the posted speed for the area, and that the truck did not actually have clearance to cross, even at the posted speed... and that the truck driver was relying a bit on the kindness of strangers. All in all, a recipe for disaster. But hey, let's blame the car, eh?
#11
Full Member
Join Date: Sep 2012
Posts: 398
Many cars have automatic braking, but are not self-driving. Being self-driving has nothing to do with it.
When the automatic braking feature fails to work, that is the driver's fault? Automatic braking is supposed to kick in during emergency situations, which by definition doesn't give enough time for a human driver to react.
Tesla admitted it didn't work as it should. If they can't even get the car to brake on its own, what hope is there for a 100% self-driving car?
#12
genec
Join Date: Sep 2004
Location: West Coast
Posts: 27,079
Bikes: custom built, sannino, beachbike, giant trance x2
Then it shouldn't be called autopilot, should it?
Many cars have automatic braking, but are not self-driving. Being self-driving has nothing to do with it.
When the automatic braking feature fails to work, that is the driver's fault? Automatic braking is supposed to kick in during emergency situations, which by definition doesn't give enough time for a human driver to react.
Tesla admitted it didn't work as it should. If they can't even get the car to brake on its own, what hope is there for a 100% self-driving car?
Autobraking as you describe it is not available... AEB (Automatic Emergency Braking) is... and if it fails, then you are right back where you were with a human driver... you are gonna crash. Don't drive like you depend on this feature.
Bottom line, a self driving car only has to be a bit better than a human, but will NEVER be 100% perfect... but then, neither are humans.
#13
Full Member
Join Date: Sep 2012
Posts: 398
No, it should not be called autopilot... and Tesla has been criticized for that name.
Autobraking as you describe it is not available... AEB (Automatic Emergency Braking) is... and if it fails, then you are right back where you were with a human driver... you are gonna crash. Don't drive like you depend on this feature.
What the hell is the point of a self-driving car, or even just emergency auto braking, if you can't depend on it? It's absurd.
I don't care for self-driving cars and would never own one even if it were free. The problem is these self-driving or partially self-driving cars aren't just putting their owners' lives at risk; they are putting everyone on the road at risk.
I never consented to be a guinea pig for this harebrained idea.
#14
Senior Member
Join Date: Nov 2014
Location: Eugene, Oregon, USA
Posts: 27,547
That's what I said. I stated in my previous post "a very large semi truck turned directly in its path."
Which was what happened. The truck was making a left turn at an intersection in the path of the Tesla.
Tesla has stated "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."
If the trailer was being backed up, then it would likely have been an industrial neighborhood, and speeds still should have been reduced.
I believe the findings also indicated that the Tesla was driving over the posted speed for the area, and that the truck did not actually have clearance to cross, even at the posted speed... and that the truck driver was relying a bit on the kindness of strangers. All in all, a recipe for disaster. But hey, let's blame the car, eh?
On my bike, inevitably if I hit a yellow light upon entering a 4 or 5 lane road, it will be red by the time I get to the other side, and the opposing traffic will get a green. I have one left turn that I hit occasionally. If I take off as soon as the light turns green, it will turn yellow before I get a quarter way through the intersection, and red, with cross traffic getting green before I completely clear the intersection (with at least an extra second or so before traffic should arrive across to my position).
I don't want a Tesla to slam into me because it had a green light and its lane was clear moments before.
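The light-timing squeeze described above is simple arithmetic. The road width, cyclist speed, and signal phase below are assumed numbers, just to show the shape of the problem:

```python
# Assumed figures: a 4-5 lane road is roughly 18 m curb to curb, and a
# cyclist starting from a stop might average about 4 m/s across it.
road_width_m = 18.0
avg_speed_ms = 4.0
remaining_phase_s = 3.5  # assumed time before cross traffic gets green

crossing_time_s = road_width_m / avg_speed_ms
print(crossing_time_s)                      # 4.5 s to clear the road
print(crossing_time_s > remaining_phase_s)  # True: the cyclist is still
# in the intersection when the cross street's light turns green -- the
# case where a car trusting only "my light is green and my lane was
# clear a moment ago" becomes dangerous.
```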
#15
genec
Join Date: Sep 2004
Location: West Coast
Posts: 27,079
Bikes: custom built, sannino, beachbike, giant trance x2
Is that what they're going to say when a fully self-driving car drives you off the cliff? You shouldn't depend on it?
What the hell is the point of a self-driving car, or even just emergency auto braking, if you can't depend on it? It's absurd.
I don't care for self-driving cars and would never own one even if it were free. The problem is these self-driving or partially self-driving cars aren't just putting their owners' lives at risk; they are putting everyone on the road at risk.
I never consented to be a guinea pig for this harebrained idea.
You are making suppositions about technology that isn't even here yet. Remember airbags... their early adoption killed a few people, yet they save hundreds of lives today. Do you drive like that airbag is gonna save your life? Do you depend on it?
#16
genec
Join Date: Sep 2004
Location: West Coast
Posts: 27,079
Bikes: custom built, sannino, beachbike, giant trance x2
If the truck was moving, then that broad side of the trailer should have been preceded by the side of the cab. Also invisible? How much delay between cab and trailer and impact?
If the trailer was being backed up, then it would likely have been an industrial neighborhood, and speeds still should have been reduced.
A large part of driving is don't run over whatever is in front of you.
On my bike, inevitably if I hit a yellow light upon entering a 4 or 5 lane road, it will be red by the time I get to the other side, and the opposing traffic will get a green. I have one left turn that I hit occasionally. If I take off as soon as the light turns green, it will turn yellow before I get a quarter way through the intersection, and red, with cross traffic getting green before I completely clear the intersection (with at least an extra second or so before traffic should arrive across to my position).
I don't want a Tesla to slam into me because it had a green light and its lane was clear moments before.
#17
Full Member
Join Date: Sep 2012
Posts: 398
I commend your logic... you went from a situation in which EMERGENCY systems fail, to fully damning FUTURE self driving technology. By your level of thinking we should have stopped flying planes the first time one crashed, we should have stopped driving cars the first time there was a collision (which BTW was back in 1896). We should have never gone to the moon after the failure of Apollo 1, and Apollo 13.
You are making suppositions about technology that isn't even here yet. Remember airbags... their early adoption killed a few people, yet they save hundreds of lives today. Do you drive like that airbag is gonna save your life? Do you depend on it?
1. An airbag isn't nearly as complex as a self-driving car.
2. Even if the airbag doesn't deploy for some reason, your seatbelt will probably save you.
3. Airbags still kill people today. Ever heard of the recent massive Takata airbag recalls?
What hope is there for driverless cars when they can't even get an airbag, which is far simpler than a driverless car, to work right?
#18
genec
Join Date: Sep 2004
Location: West Coast
Posts: 27,079
Bikes: custom built, sannino, beachbike, giant trance x2
1. An airbag isn't nearly as complex as a self-driving car.
2. Even if the airbag doesn't deploy for some reason, your seatbelt will probably save you.
3. Airbags still kill people today. Ever heard of the recent massive Takata airbag recalls?
What hope is there for driverless cars when they can't even get an airbag, which is far simpler than a driverless car, to work right?
Also, if seatbelts were so effective, why mandate airbags?
Yeah, the failures you mentioned in #3 were from a manufacturer taking shortcuts... they have just filed for bankruptcy in Japan: Air bag maker Takata bankruptcy filing expected in Japan, US - StarTribune.com
Go ahead and hate future technology... you won't be alone... throughout history more has been promised than delivered, we still don't have flying cars... and some folks just fear technological change. Ever hear of Luddites?
What the Luddites Really Fought Against | History | Smithsonian
Read it and commiserate... you want technology that works, and works right.
Of course bear in mind that the driver of that Tesla... he pushed technology... until it failed... and he paid the price.
#19
Been Around Awhile
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,971
Bikes: Vaterland and Ragazzi
Seat belts are very effective when buckled, useless when not; rule makers decided that too many people did not buckle up regardless of the law, hence mandatory air bags, which require no driver or passenger compliance to be effective.
#20
Senior Member
An autopilot in an airplane is one thing, but this is an example of why I don't like self-driving cars. He relied too much on it. It can be improperly programmed, just like an airplane's. Sure, the purpose of self-driving is to reduce the errors made by humans, but as has been debated for some time in the airline industry, there can be too much reliance on the autopilot.
#21
Full Member
Join Date: Sep 2012
Posts: 398
But driverless technology, on the other hand, is on a whole new level: AI with human-like intelligence and decision-making. That's why people don't trust it and maybe never will.
#22
Senior Member
Join Date: Jun 2008
Location: Vancouver, BC
Posts: 9,201
Given your level of risk aversion it's probably best you stay off the roads altogether. Too many bad drivers out there who might crash into you.
#23
Full Member
Join Date: Sep 2012
Posts: 398
btw do you normally read the instruction manual before you buy the car? LOL that's a new one.
Driverless cars are scarier than human drivers.
Last edited by northernlights; 06-25-17 at 07:27 PM.
#24
Senior Member
Join Date: Jun 2008
Location: Vancouver, BC
Posts: 9,201
Tesla is the one who is hung up on the misleading name of their so-called autopilot feature, refusing to change it despite widespread criticism. In the real world not many people read the instruction manual for anything. But I see you have no problem with deceptive advertising.
btw do you normally read the instruction manual before you buy the car? LOL that's a new one.
Driverless cars are scarier than human drivers.
#25
C*pt*i* Obvious