Advocate of self-driving car killed because he ignored warnings to take control

Old 06-25-17, 06:04 AM
  #1  
Senior Member
Thread Starter
 
Join Date: Nov 2015
Location: Washington Grove, Maryland
Posts: 1,466

Bikes: 2003 (24)20-Speed Specialized Allez'

Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 396 Post(s)
Liked 6 Times in 6 Posts
Advocate of self-driving car killed because he ignored warnings to take control

https://money.cnn.com/2017/06/20/tech...ebar_expansion
Chris0516 is offline  
Old 06-25-17, 11:53 AM
  #2  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Quote:
An earlier investigation by the National Highway Traffic Safety Administration found that the crash was not the result of any defect in Tesla's autopilot feature, which can keep a car in a lane and brake to avoid traffic and other obstacles.
Not defective? The autopilot feature is supposed to brake to avoid collision.

If the Tesla failed to brake as a very large semi truck turned directly in its path, then the autopilot feature failed to do its job hence it was defective.
northernlights is offline  
Old 06-25-17, 12:37 PM
  #3  
Senior Member
 
alcjphil's Avatar
 
Join Date: Jun 2002
Location: Montreal, Quebec
Posts: 5,925
Mentioned: 16 Post(s)
Tagged: 0 Thread(s)
Quoted: 1818 Post(s)
Liked 1,693 Times in 974 Posts
This was not a self-driving car. This was a car with driving aids that allow the owner to have the car help drive for very limited amounts of time. The "driver" chose to stretch those limits beyond what the car was designed to do. What this has to do with the car's ability to be safe around cyclists is hard to understand, and it is a very poor example of anything to do with self-driving cars, which this wasn't.
alcjphil is offline  
Old 06-25-17, 12:48 PM
  #4  
Me duelen las nalgas
 
canklecat's Avatar
 
Join Date: Aug 2015
Location: Texas
Posts: 13,513

Bikes: Centurion Ironman, Trek 5900, Univega Via Carisma, Globe Carmel

Mentioned: 199 Post(s)
Tagged: 0 Thread(s)
Quoted: 4559 Post(s)
Liked 2,802 Times in 1,800 Posts
Not a self-driving car. Closer to using cruise control inappropriately, such as in heavy traffic.
canklecat is offline  
Old 06-25-17, 01:35 PM
  #5  
Senior Member
Thread Starter
 
Join Date: Nov 2015
Location: Washington Grove, Maryland
Posts: 1,466

Bikes: 2003 (24)20-Speed Specialized Allez'

Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 396 Post(s)
Liked 6 Times in 6 Posts
Originally Posted by northernlights
An earlier investigation by the National Highway Traffic Safety Administration found that the crash was not the result of any defect in Tesla's autopilot feature, which can keep a car in a lane and brake to avoid traffic and other obstacles.
Not defective? The autopilot feature is supposed to brake to avoid collision.

If the Tesla failed to brake as a very large semi truck turned directly in its path, then the autopilot feature failed to do its job hence it was defective.
An autopilot in an airplane is one thing. But this is an example of why I don't like self-driving cars. He relied too much on it. It can be improperly programmed, just like an airplane's. Sure, the purpose of self-driving is to reduce the errors made by humans. But, as has been debated for some time in the airline industry, there can be too much reliance on the autopilot.
Chris0516 is offline  
Old 06-25-17, 01:58 PM
  #6  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by Chris0516
An autopilot in an airplane is one thing. But this is an example of why I don't like self-driving cars. He relied too much on it. It can be improperly programmed, just like an airplane's. Sure, the purpose of self-driving is to reduce the errors made by humans. But, as has been debated for some time in the airline industry, there can be too much reliance on the autopilot.
Except for one thing... there are no self-driving cars available to the public yet. He was relying on a technology that has all sorts of caveats and warnings... the damn system told him to get back to driving and he ignored it.

When the fire alarm goes off in your home, do you just stand there and let the flames sweep over you? That is essentially what he did. He ignored the manual, he ignored the warnings, and he paid the price.

BTW, the autopilot in an airplane (non-military) does the same thing as the autopilot in a boat... it keeps you on the same course and heading... it doesn't look for obstacles, it doesn't avoid collisions, it doesn't see, it doesn't even know the terrain... Yes, it will warn you if radar detects ground or an object... but YOU still have to take command. It really is not an autoPILOT. It is a steering wheel holder... that's about it.
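
To make that concrete, here is a toy sketch of the entire "intelligence" of a heading-hold autopilot (a simplified illustration; the gain and numbers are made up):

Code:
# Minimal heading-hold loop: correct drift from a set heading, nothing more.
# No obstacle detection, no terrain awareness, no collision avoidance.
def heading_hold_step(current_deg, target_deg, gain=0.5):
    error = (target_deg - current_deg + 180) % 360 - 180  # wrap to [-180, 180)
    return gain * error                                   # positive = steer right

# A boat set to hold 090 (due east) has drifted to 095:
print(heading_hold_step(95.0, 90.0))  # -2.5 -> small left correction; any land ahead is ignored

That is the whole job: hold the number. Everything else is still on you.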

Until recently, using a marine GPS could plow you right into land... the darn things displayed land, but had no "awareness" of it. You could pick a destination and the GPS would route you right into land. Only recently have such marine systems been designed to actually determine if there is land in the way of a boat... and at that, such systems tend to warn, not actually steer.

So, like self-driving cars, real "autopilots" do not yet exist.

Bottom line... keep your hands on the wheel and your eyes on the road.
genec is offline  
Old 06-25-17, 02:00 PM
  #7  
Senior Member
 
CliffordK's Avatar
 
Join Date: Nov 2014
Location: Eugene, Oregon, USA
Posts: 27,547
Mentioned: 217 Post(s)
Tagged: 0 Thread(s)
Quoted: 18372 Post(s)
Liked 4,507 Times in 3,350 Posts
Originally Posted by northernlights
An earlier investigation by the National Highway Traffic Safety Administration found that the crash was not the result of any defect in Tesla's autopilot feature, which can keep a car in a lane and brake to avoid traffic and other obstacles.
Not defective? The autopilot feature is supposed to brake to avoid collision.

If the Tesla failed to brake as a very large semi truck turned directly in its path, then the autopilot feature failed to do its job hence it was defective.
You cropped out a little from the description.

In this case, the car hit the trailer of a truck that pulled across its lanes of traffic at an intersection.
In a sense, that actually sounds worse.

To be safe in traffic, one has to have a bit of a predictive ability. It is not good enough to say that one's lane is clear at the moment. Rather, one needs to predict that it will remain clear when one arrives at a spot in the future. I.E. watch for cross traffic, and vehicles that look like they're going to pull in front of a person.
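
As a toy illustration of the difference (every number below is assumed, purely for illustration), "clear now" and "clear when I arrive" are two different tests:

Code:
# Compare when I reach the conflict point vs. when a crossing vehicle clears it.
def lane_clear_on_arrival(my_speed, my_dist, crosser_speed, crosser_dist, crosser_len, margin=2.0):
    t_me = my_dist / my_speed                               # seconds until I arrive
    t_clear = (crosser_dist + crosser_len) / crosser_speed  # seconds until the crosser has fully cleared
    return t_me > t_clear + margin                          # safe only with a time buffer

# Car at 29 m/s (~65 mph), 120 m from the crossing; an 18 m tractor-trailer
# crossing at 4 m/s, its nose 2 m from my lane. The lane is "clear" right now...
print(lane_clear_on_arrival(29, 120, 4, 2, 18))  # False: the trailer is still in the lane when I arrive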

What about bicycles swerving due to obstacles?

But it is also a sign of why "self driving" will take more time to mature.
CliffordK is offline  
Old 06-25-17, 02:21 PM
  #8  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by CliffordK
You cropped out a little from the description.
That's what I said. I stated in my previous post "a very large semi truck turned directly in its path."
Which was what happened. The truck was making a left turn at an intersection in the path of the Tesla.

Tesla has stated "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

I will give Tesla credit for admitting their autopilot feature failed to do its job because it could not tell the difference between a white truck and the daytime sky. To me that is a defect. I'm not sure why the NHTSA would say it wasn't a defect when Tesla themselves have essentially admitted it.
northernlights is offline  
Old 06-25-17, 02:27 PM
  #9  
Senior Member
 
Join Date: Jun 2008
Location: Vancouver, BC
Posts: 9,201
Mentioned: 11 Post(s)
Tagged: 0 Thread(s)
Quoted: 1186 Post(s)
Liked 289 Times in 177 Posts
Originally Posted by northernlights
I will give Tesla credit for admitting their autopilot feature failed to do its job because it could not tell the difference between a white truck and the daytime sky. To me that is a defect. I'm not sure why the NHTSA would say it wasn't a defect when Tesla themselves have essentially admitted it.
It's not a defect because the Tesla is not a self-driving car. They don't sell it or promote it as a self-driving car. It requires an alert driver to be available in the driver's seat at all times. That's why the NHTSA concluded it wasn't a defect.
gregf83 is offline  
Old 06-25-17, 02:29 PM
  #10  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by northernlights
That's what I said. I stated in my previous post "a very large semi truck turned directly in its path."
Which was what happened. The truck was making a left turn at an intersection in the path of the Tesla.

Tesla has stated "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

I will give Tesla credit for admitting their autopilot feature failed to do its job because it could not tell the difference between a white truck and the daytime sky. To me that is a defect. I'm not sure why the NHTSA would say it wasn't a defect when Tesla themselves have essentially admitted it.
It wasn't a defect because the feature was not designed to be able to see and PREDICT that well... the technology wasn't in place... and the driver was warned.

That the human also failed to notice and PREDICT the large truck crossing is indicative of how complicated this stuff can really be. I believe the findings also indicated that the Tesla was driving over the posted speed for the area, and that the truck did not actually have clearance to cross, even at the posted speed... and that the truck driver was relying a bit on the kindness of strangers. All in all, a recipe for disaster. But hey, let's blame the car, eh?
genec is offline  
Old 06-25-17, 02:36 PM
  #11  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by gregf83
It's not a defect because the Tesla is not a self-driving car. They don't sell it or promote it as a self-driving car. It requires an alert driver to be available in the driver's seat at all times. That's why the NHTSA concluded it wasn't a defect.
Then it shouldn't be called autopilot, should it?

Many cars have automatic braking, but are not self-driving. Being self-driving has nothing to do with it.
When the automatic braking feature fails to work, that is the driver's fault? Automatic braking is supposed to kick in during emergency situations, which by definition doesn't give enough time for a human driver to react.

Tesla admitted it didn't work as it should. If they can't even get the car to brake on its own, what hope is there for a 100% self-driving car?
northernlights is offline  
Old 06-25-17, 03:21 PM
  #12  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by northernlights
Then it shouldn't be called autopilot, should it?

Many cars have automatic braking, but are not self-driving. Being self-driving has nothing to do with it.
When the automatic braking feature fails to work, that is the driver's fault? Automatic braking is supposed to kick in during emergency situations, which by definition doesn't give enough time for a human driver to react.

Tesla admitted it didn't work as it should. If they can't even get the car to brake on its own, what hope is there for a 100% self-driving car?
No, it should not be called autopilot... and Tesla has been criticized for that name.

Autobraking as you describe it is not available... AEB (Automatic Emergency Braking) is... and if it fails, then you are right back where you were with a human driver... you are gonna crash. Don't drive like you depend on this feature.
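
For a sense of why AEB only acts at the last instant, here is a rough sketch of the idea (my own simplification with assumed numbers, not any manufacturer's actual logic):

Code:
# Brake hard when time-to-collision (TTC) drops below a threshold that is
# already too short for a typical human reaction.
def aeb_should_brake(gap_m, closing_speed_ms, ttc_threshold_s=1.4):
    if closing_speed_ms <= 0:        # gap is opening: no collision course
        return False
    ttc = gap_m / closing_speed_ms   # seconds to impact at the current closing speed
    return ttc < ttc_threshold_s     # too late for the human; the machine acts

print(aeb_should_brake(gap_m=25, closing_speed_ms=20))  # True: 1.25 s to impact

And if the sensors never flag the obstacle in the first place, that test never fires... which is exactly why you don't drive as if it will.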

Bottom line, a self-driving car only has to be a bit better than a human, but will NEVER be 100% perfect... but then, neither are humans.
genec is offline  
Old 06-25-17, 03:33 PM
  #13  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by genec
No, it should not be called autopilot... and Tesla has been criticized for that name.

Autobraking as you describe it is not available... AEB (Automatic Emergency Braking) is... and if it fails, then you are right back where you were with a human driver... you are gonna crash. Don't drive like you depend on this feature.
Is that what they're going to say when a fully self-driving car drives you off a cliff? You shouldn't depend on it?
What the hell is the point of a self-driving car, or even just emergency auto-braking, if you can't depend on it? It's absurd.

I don't care for self-driving cars and would never own one even if it were free. The problem is that these self-driving or partially self-driving cars aren't just putting their owners' lives at risk; they are putting everyone on the road at risk.

I never consented to be a guinea pig for this harebrained idea.
northernlights is offline  
Old 06-25-17, 03:46 PM
  #14  
Senior Member
 
CliffordK's Avatar
 
Join Date: Nov 2014
Location: Eugene, Oregon, USA
Posts: 27,547
Mentioned: 217 Post(s)
Tagged: 0 Thread(s)
Quoted: 18372 Post(s)
Liked 4,507 Times in 3,350 Posts
Originally Posted by northernlights
That's what I said. I stated in my previous post "a very large semi truck turned directly in its path."
Which was what happened. The truck was making a left turn at an intersection in the path of the Tesla.

Tesla has stated "neither autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."
If the truck was moving, then that broad side of the trailer should have been preceded by the side of the cab. Also invisible? How much delay between cab and trailer and impact?

If the trailer was being backed up, then it would likely have been an industrial neighborhood, and speeds still should have been reduced.

Originally Posted by genec
I believe the findings also indicated that the Tesla was driving over the posted speed for the area, and that the truck did not actually have clearance to cross, even at the posted speed... and that the truck driver was relying a bit on the kindness of strangers. All in all, a recipe for disaster. But hey, let's blame the car, eh?
A large part of driving is: don't run over whatever is in front of you.

On my bike, if I hit a yellow light upon entering a four- or five-lane road, it will inevitably be red by the time I get to the other side, and the opposing traffic will get a green. There is one left turn I hit occasionally: if I take off as soon as the light turns green, it will turn yellow before I get a quarter of the way through the intersection, and red, with cross traffic getting a green, before I completely clear the intersection (with at least an extra second or so before traffic should arrive across to my position).
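
Back-of-the-envelope, with every number assumed rather than measured, the timing works out something like this:

Code:
# Can a cyclist starting on the green clear a 5-lane crossing before cross
# traffic gets its green? (All values are illustrative assumptions.)
lanes = 5
lane_width_m = 3.5
crossing_m = lanes * lane_width_m     # ~17.5 m to clear
avg_bike_speed_ms = 4.0               # ~14 km/h average from a standing start
green_plus_yellow_s = 4.0             # a short signal window (assumed)

time_to_clear_s = crossing_m / avg_bike_speed_ms
print(time_to_clear_s)                        # 4.375 s
print(time_to_clear_s > green_plus_yellow_s)  # True: still in the intersection on the red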

I don't want a Tesla to slam into me because it had a green light and its lane was clear moments before.
CliffordK is offline  
Old 06-25-17, 03:49 PM
  #15  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by northernlights
Is that what they're going to say when a fully self-driving car drives you off a cliff? You shouldn't depend on it?
What the hell is the point of a self-driving car, or even just emergency auto-braking, if you can't depend on it? It's absurd.

I don't care for self-driving cars and would never own one even if it were free. The problem is that these self-driving or partially self-driving cars aren't just putting their owners' lives at risk; they are putting everyone on the road at risk.

I never consented to be a guinea pig for this harebrained idea.
I commend your logic... you went from a situation in which EMERGENCY systems fail, to fully damning FUTURE self-driving technology. By your level of thinking, we should have stopped flying planes the first time one crashed, and we should have stopped driving cars the first time there was a collision (which, BTW, was back in 1896). We should never have gone to the moon after the failures of Apollo 1 and Apollo 13.

You are making suppositions about technology that isn't even there yet. Remember airbags... their early adoption killed a few people... yet they save hundreds of lives today. Do you drive like that airbag is gonna save your life? Do you depend on it?
genec is offline  
Old 06-25-17, 03:50 PM
  #16  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by CliffordK
If the truck was moving, then that broad side of the trailer should have been preceded by the side of the cab. Also invisible? How much delay between cab and trailer and impact?

If the trailer was being backed up, then it would likely have been an industrial neighborhood, and speeds still should have been reduced.



A large part of driving is: don't run over whatever is in front of you.

On my bike, if I hit a yellow light upon entering a four- or five-lane road, it will inevitably be red by the time I get to the other side, and the opposing traffic will get a green. There is one left turn I hit occasionally: if I take off as soon as the light turns green, it will turn yellow before I get a quarter of the way through the intersection, and red, with cross traffic getting a green, before I completely clear the intersection (with at least an extra second or so before traffic should arrive across to my position).

I don't want a Tesla to slam into me because it had a green light and its lane was clear moments before.
Of course not... but it is OK for a human driver to do that, right?
genec is offline  
Old 06-25-17, 04:37 PM
  #17  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by genec
I commend your logic... you went from a situation in which EMERGENCY systems fail, to fully damning FUTURE self-driving technology. By your level of thinking, we should have stopped flying planes the first time one crashed, and we should have stopped driving cars the first time there was a collision (which, BTW, was back in 1896). We should never have gone to the moon after the failures of Apollo 1 and Apollo 13.

You are making suppositions about technology that isn't even there yet. Remember airbags... their early adoption killed a few people... yet they save hundreds of lives today. Do you drive like that airbag is gonna save your life? Do you depend on it?
1. An airbag isn't nearly as complex as a self-driving car.

2. Even if the airbag doesn't deploy for some reason, your seatbelt will probably save you.

3. Airbags still kill people today. Ever heard of the recent massive Takata airbag recalls?
What hope is there for driverless cars when they can't even get an airbag to work right, which is far simpler than a driverless car?
northernlights is offline  
Old 06-25-17, 05:07 PM
  #18  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 13658 Post(s)
Liked 4,532 Times in 3,158 Posts
Originally Posted by northernlights
1. An airbag isn't nearly as complex as a self-driving car.

2. Even if the airbag doesn't deploy for some reason, your seatbelt will probably save you.

3. Airbags still kill people today. Ever heard of the recent massive Takata airbag recalls?
What hope is there for driverless cars when they can't even get an airbag to work right, which is far simpler than a driverless car?
I noticed you avoided the question... Do you drive like that airbag is gonna save your life? Do you depend on it?

Also, if seat belts were so effective, why mandate airbags?

Yeah, the failures you mentioned in #3 were from a manufacturer taking shortcuts... they are filing for bankruptcy in Japan right now... Air bag maker Takata bankruptcy filing expected in Japan, US - StarTribune.com

Go ahead and hate future technology... you won't be alone... throughout history more has been promised than delivered, we still don't have flying cars... and some folks just fear technological change. Ever hear of Luddites?
What the Luddites Really Fought Against | History | Smithsonian

Read it and commiserate... you want technology that works, and works right.

Of course bear in mind that the driver of that Tesla... he pushed technology... until it failed... and he paid the price.
genec is offline  
Old 06-25-17, 05:20 PM
  #19  
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 29,971

Bikes: Vaterland and Ragazzi

Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 12 Post(s)
Liked 1,534 Times in 1,044 Posts
Originally Posted by genec
Also, if seat belts were so effective, why mandate airbags?
Seat belts are very effective when buckled, useless when not; rule makers decided that too many people did not buckle up regardless of the law, hence mandatory air bags, which require no driver or passenger compliance to be effective.
I-Like-To-Bike is offline  
Old 06-25-17, 05:27 PM
  #20  
Senior Member
 
Join Date: Jul 2013
Location: Toronto
Posts: 3,501

Bikes: Sekine 1979 ten speed racer

Mentioned: 15 Post(s)
Tagged: 0 Thread(s)
Quoted: 1481 Post(s)
Liked 639 Times in 437 Posts
Originally Posted by Chris0516
An autopilot in an airplane is one thing. But this is an example of why I don't like self-driving cars. He relied too much on it. It can be improperly programmed, just like an airplane's. Sure, the purpose of self-driving is to reduce the errors made by humans. But, as has been debated for some time in the airline industry, there can be too much reliance on the autopilot.
It all boils down to human error: the human still thinks he has better judgment than everybody else. So why is bad driving an acceptable norm?
Daniel4 is offline  
Old 06-25-17, 05:45 PM
  #21  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by genec
No, it should not be called autopilot... and Tesla has been criticized for that name.
Yet they stubbornly refuse to change the name. This is a company that is not being honest with its customers, and not a company I would place a whole lot of trust in, when even the name of their product is misleading and they refuse to correct themselves.

Originally Posted by genec
I noticed you avoided the question... Do you drive like that airbag is gonna save your life? Do you depend on it?

The thing about all the other technologies that came before driverless cars (the airplane, airbags, the Apollo spacecraft, etc.) is that they are basically "dumb." They are all great technologies, and I never had any problem with them, because they don't require much in the way of artificial intelligence, if any, to function.

But driverless technology, on the other hand, is on a whole new level of AI, with human-like intelligence and decision-making. That's why people don't trust it and maybe never will.
northernlights is offline  
Old 06-25-17, 07:05 PM
  #22  
Senior Member
 
Join Date: Jun 2008
Location: Vancouver, BC
Posts: 9,201
Mentioned: 11 Post(s)
Tagged: 0 Thread(s)
Quoted: 1186 Post(s)
Liked 289 Times in 177 Posts
Originally Posted by northernlights
Yet they stubbornly refuse to change the name. This is a company that is not being honest with its customers, and not a company I would place a whole lot of trust in, when even the name of their product is misleading and they refuse to correct themselves.
You seem pretty hung up on the name. The vast majority of Tesla owners (although perhaps not 100%) are intelligent enough to read the manual provided, which explains the technology of the car and how it is to be used. It performs as expected.

Given your level of risk aversion, it's probably best you stay off the roads altogether. Too many bad drivers out there who might crash into you.
gregf83 is offline  
Old 06-25-17, 07:14 PM
  #23  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 214 Post(s)
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by gregf83
You seem pretty hung up on the name. The vast majority of Tesla owners (although perhaps not 100%) are intelligent enough to read the manual provided, which explains the technology of the car and how it is to be used. It performs as expected.
Tesla is the one who is hung up on the misleading name of their so-called autopilot feature, refusing to change it despite widespread criticism. In the real world, not many people read the instruction manual for anything. But I see you have no problem with deceptive advertising.

BTW, do you normally read the instruction manual before you buy the car? LOL, that's a new one.


Originally Posted by gregf83
Given your level of risk aversion, it's probably best you stay off the roads altogether. Too many bad drivers out there who might crash into you.
Driverless cars are scarier than human drivers.

Last edited by northernlights; 06-25-17 at 07:27 PM.
northernlights is offline  
Old 06-25-17, 08:13 PM
  #24  
Senior Member
 
Join Date: Jun 2008
Location: Vancouver, BC
Posts: 9,201
Mentioned: 11 Post(s)
Tagged: 0 Thread(s)
Quoted: 1186 Post(s)
Liked 289 Times in 177 Posts
Originally Posted by northernlights
Tesla is the one who is hung up on the misleading name of their so-called autopilot feature, refusing to change it despite widespread criticism. In the real world, not many people read the instruction manual for anything. But I see you have no problem with deceptive advertising.

BTW, do you normally read the instruction manual before you buy the car? LOL, that's a new one.

Driverless cars are scarier than human drivers.
There are over 30,000 traffic fatalities per year in the US alone. The bar for driverless cars is very low, and I'm confident the accident rate will be lower once we get rid of all the idiots who insist on driving their cars.
gregf83 is offline  
Old 06-25-17, 09:57 PM
  #25  
C*pt*i* Obvious
 
SHBR's Avatar
 
Join Date: Dec 2013
Location: Shanghai
Posts: 1,337
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 596 Post(s)
Liked 53 Times in 44 Posts
Originally Posted by CliffordK
To be safe in traffic, one has to have a bit of a predictive ability.
This requires a very advanced form of A.I.

If this exists, humanity doesn't have a chance.
SHBR is offline  

