
Would a self driving car world make it safe for cyclists?



Old 05-22-17, 09:36 AM
  #76  
Senior Member
 
Join Date: Nov 2008
Location: Bay Area, Calif.
Posts: 7,239
Likes: 0
Liked 7 Times in 6 Posts
I have confidence in the technology for two primary reasons:
1) The Google car fleet already has a better safety record than would be expected for human drivers over the same number of miles.
2) Most importantly, the technology will continue to improve. Every time there's any kind of collision or even a close call it is logged by the software and examined to see what went wrong and how things can be changed to prevent a recurrence. Over time that will result in steady improvements in both the software and hardware. Unlike human drivers, autonomous cars will not continue to make the same mistakes repeatedly. Instead the fleet of vehicles will learn from any mistakes made by any of the vehicles and not repeat them in the future.
prathmann is offline  
Old 05-22-17, 12:48 PM
  #77  
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 30,035

Bikes: Vaterland and Ragazzi

Liked 1,598 Times in 1,079 Posts
Originally Posted by Daniel4
That's the human factor. There are no accidents. Please read PoorBob's post.
I read it and it provides no clue how the risk or chance of an incident will be reduced by a major percentage, given that the technology does not yet exist to assure that all driverless cars will be able to transport people and react reliably and safely in real time to the millions of different combinations of destinations, variable road conditions, traffic density, vehicle conditions and weather, as well as unpredictable events (potholes, tree branches, animals, pedestrians, children) that they will be exposed to in varying and unpredictable sequence once they leave the tightly controlled, laboratory-like environment that produces such dreamy safety predictions; predictions based mostly on wishful thinking.
I-Like-To-Bike is offline  
Old 05-22-17, 07:54 PM
  #78  
Senior Member
 
Join Date: Mar 2012
Location: Tallahassee, FL
Posts: 4,816
Likes: 0
Liked 1,027 Times in 576 Posts
Originally Posted by I-Like-To-Bike
"Function as desired" is a tall order if a principal parameter of "desire" includes significant reduction in injury/collision risk with no loss of personal mobility as being promoted by the salesmen/hypsters for the technology.
The technology is good enough now that if you replaced the entire US fleet with autonomous cars tomorrow, there would be a reduction in collisions. In a decade, it will be much better. The difficulty will lie in cost and culture.
jon c. is offline  
Old 05-23-17, 06:03 AM
  #79  
Junior Member
 
Join Date: May 2017
Location: PGH
Posts: 91
Liked 9 Times in 7 Posts
I-Like-To-Bike,

Nothing said in a forum can be taken as fact; we are all basically regurgitating opinion. There are some great online articles on the subject, and reading them will probably give you a better understanding of what technology goes into these vehicles and how they are being tested and deployed.

City streets in Pittsburgh are far from a nice lab environment.


While listening to a podcast on this tech, they talked about an Uber car encountering a bicycle at an intersection. The cyclist was in a track stand at the red light yet was still moving slightly. The Uber car refused to make a turn in front of the cyclist because it could not be certain whether the moving object (the cyclist) would move into its path of travel. That alone is a safer act than the majority of human drivers would manage.

Last edited by PoorBob; 05-23-17 at 06:06 AM.
PoorBob is offline  
Old 05-23-17, 07:50 AM
  #80  
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 30,035

Bikes: Vaterland and Ragazzi

Liked 1,598 Times in 1,079 Posts
Originally Posted by jon c.
The technology is good enough now that if you replaced the entire US fleet with autonomous cars tomorrow, there would be a reduction in collisions. In a decade, it will be much better. The difficulty will lie in cost and culture.
What makes you think so, besides relying on the hype and promises from the promoters of said technology? Perhaps if the "technology" consistently reacts to any event not within the expected parameters by bringing all traffic to a crawl or dead stop "just to be safe", some A&S safety prognostications may come true.
I-Like-To-Bike is offline  
Old 05-23-17, 08:00 AM
  #81  
Been Around Awhile
 
I-Like-To-Bike's Avatar
 
Join Date: Oct 2004
Location: Burlington Iowa
Posts: 30,035

Bikes: Vaterland and Ragazzi

Liked 1,598 Times in 1,079 Posts
Originally Posted by PoorBob
I-Like-To-Bike,

Nothing said in a forum can be taken as fact; we are all basically regurgitating opinion. There are some great online articles on the subject, and reading them will probably give you a better understanding of what technology goes into these vehicles and how they are being tested and deployed.

City streets in Pittsburgh are far from a nice lab environment.


While listening to a podcast on this tech, they talked about an Uber car encountering a bicycle at an intersection. The cyclist was in a track stand at the red light yet was still moving slightly. The Uber car refused to make a turn in front of the cyclist because it could not be certain whether the moving object (the cyclist) would move into its path of travel. That alone is a safer act than the majority of human drivers would manage.
Poor Bob,
Perhaps the safer-act reaction to every encounter that is not within the programming parameters (i.e. any uncertainty) will be to hit the brakes or refuse to move. Sounds like chaos will reign in a transportation system based on such programming limitations.

What is your take on the latest news about Uber and the Pittsburgh experiment?
"Pittsburgh Welcomed Uber’s Driverless Car Experiment. Not Anymore."
https://www.nytimes.com/2017/05/21/t...xperiment.html
I-Like-To-Bike is offline  
Old 05-23-17, 08:10 AM
  #82  
Senior Member
 
Join Date: Sep 2013
Location: Massachusetts
Posts: 4,530
Liked 663 Times in 443 Posts
It was a Google self-driving car, at a four-way stop sign, two years ago.

Anyhow, I had hoped to see this in Amsterdam, but alas:


-mr. bill
mr_bill is offline  
Old 05-23-17, 07:39 PM
  #83  
Senior Member
 
Join Date: Mar 2012
Location: Tallahassee, FL
Posts: 4,816
Likes: 0
Liked 1,027 Times in 576 Posts
Originally Posted by I-Like-To-Bike
What makes you think so, besides relying on the hype and promises from the promoters of said technology?
I've seen enough of the science underlying the hype and promises to know how good it is now, and I have pretty good confidence in how good it's going to be. As I said earlier, this is a broad and diverse field of research; it isn't just a small group of promoters. The capabilities are amazing and very real, and I have no doubt it's the future. It won't come along as quickly as some of the optimistic promoters might have us believe, but it is inevitable.
jon c. is offline  
Old 05-23-17, 07:57 PM
  #84  
Randomhead
 
Join Date: Aug 2008
Location: Happy Valley, Pennsylvania
Posts: 24,513
Liked 3,807 Times in 2,595 Posts
Originally Posted by prathmann
I have confidence in the technology for two primary reasons:
1) The Google car fleet already has a better safety record than would be expected for human drivers over the same number of miles.
Google has pretty much said it's not going to happen any time soon. They are the most responsible players in this field, and they have the most experience. I don't see robotics tech advancing nearly as fast as hubris. If we proceed without due caution, driverless cars will start killing cyclists, and cyclists will be banned. Jaywalking laws on steroids.
unterhausen is offline  
Old 05-24-17, 12:54 PM
  #85  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Likes: 0
Liked 0 Times in 0 Posts
I think self-driving cars will go the way of flying cars.

Flying cars exist but the likelihood of seeing them on a mass commercial scale is not very high.
northernlights is offline  
Old 05-24-17, 01:08 PM
  #86  
Banned
 
Join Date: Jun 2010
Location: NW,Oregon Coast
Posts: 43,598

Bikes: 8

Liked 1,360 Times in 866 Posts
Cars now have detectors in them, but they cost more than my parents spent on buying a house.

(OK, we are talking 70+ years ago...)
fietsbob is offline  
Old 05-24-17, 01:31 PM
  #87  
Senior Member
 
KD5NRH's Avatar
 
Join Date: Jul 2010
Location: Stephenville TX
Posts: 3,697

Bikes: 2010 Trek 7100

Likes: 0
Liked 3 Times in 3 Posts
Originally Posted by northernlights
I think self-driving cars will go the way of flying cars.
Maybe, but not for the same reasons; flying cars aren't that hard to do, but if driving tests bore even a passing resemblance to the level of testing for a pilot's license, there would be a lot more bicycles on the road from the 90+% of the public who would never pass.
KD5NRH is offline  
Old 05-24-17, 04:52 PM
  #88  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by mtb_addict
I was thinking: how can an autonomous car read a street sign?

If an autonomous car cannot pass the vision test at the DMV...

What if there's construction or a traffic light malfunction, and cops put out signs to reduce speed and stop at the intersection? How can an autonomous car understand a sign?
I wonder if they are able to read a speed limit sign? Or a street sweeping sign? Or a no parking tow-away sign?

Can they recognize a fire hydrant or red painted curb, so they know where not to park? What about a dip in the road? Can they recognize a road construction zone or accident scene?

There are so many questions. Seems self-driving cars will require very human-like intelligence just to do what we human drivers take for granted.
northernlights is offline  
Old 05-24-17, 07:16 PM
  #89  
Junior Member
 
Join Date: May 2017
Location: PGH
Posts: 91
Liked 9 Times in 7 Posts
Try not to think of the computer as acting like a human driver. The process is different; they are not replacing the human with an android driver. The car has multiple systems to tell it where to go. These vehicles do not rely solely on optical information for stop signs, pedestrians, etc.

Also, don't leave out the government's part. Each year a few new safety items are required in production vehicles, and over the years they will keep pushing us further away from the wheel. You see this in new-car TV ads: braking assist, lane control, etc.

The autonomous car world isn't going to happen overnight.
PoorBob is offline  
Old 05-24-17, 07:20 PM
  #90  
Junior Member
 
Join Date: May 2017
Location: PGH
Posts: 91
Liked 9 Times in 7 Posts
If you are looking for some more info on these systems, check out Tesla:

"All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver."

So there you have it: the tech is already in the car today, in 2017... That last part? Well, OK, the manufacturer is always going to say something like that...

Last edited by PoorBob; 05-24-17 at 07:49 PM.
PoorBob is offline  
Old 05-24-17, 08:58 PM
  #91  
Senior Member
 
Join Date: Jul 2013
Location: Toronto
Posts: 3,501

Bikes: Sekine 1979 ten speed racer

Liked 639 Times in 437 Posts
Originally Posted by northernlights
I wonder if they are able to read a speed limit sign? Or a street sweeping sign? Or a no parking tow-away sign?

Can they recognize a fire hydrant or red painted curb, so they know where not to park? What about a dip in the road? Can they recognize a road construction zone or accident scene?

There are so many questions. Seems self-driving cars will require very human-like intelligence just to do what we human drivers take for granted.
How will it be any worse than today, when human-driven cars can't recognize any of these either?

Programming these decisions will remove the human factor of judgement, in which every driver in a collision thought his judgement was better than everybody else's, including the law's.
Daniel4 is offline  
Old 05-24-17, 09:24 PM
  #92  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by Daniel4

Programming these decisions will remove the human factor of judgement, in which every driver in a collision thought his judgement was better than everybody else's, including the law's.
That's the theory. But there have been numerous accidents and mishaps with self-driving cars as well, so they are not perfect. I have seen videos of self-driving cars running red lights and hitting other cars and objects on the road. Whether they can be improved and adopted on a large scale to be safer than human drivers at reasonable cost remains to be seen.

There are tremendous challenges to overcome. Self-driving technology will make cars much more complex than they already are, many more things on the car that can break down. What are the costs of maintenance and repairs to fix such complex systems when they break down? I don't even want to think about it.
northernlights is offline  
Old 05-25-17, 10:44 AM
  #93  
Senior Member
 
KD5NRH's Avatar
 
Join Date: Jul 2010
Location: Stephenville TX
Posts: 3,697

Bikes: 2010 Trek 7100

Likes: 0
Liked 3 Times in 3 Posts
Originally Posted by PoorBob
"All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver."

So there you have it: the tech is already in the car today, in 2017...
Big difference between having the hardware and having the software to make it work right.
KD5NRH is offline  
Old 05-25-17, 11:30 AM
  #94  
Senior Member
 
Join Date: Jul 2013
Location: Toronto
Posts: 3,501

Bikes: Sekine 1979 ten speed racer

Liked 639 Times in 437 Posts
Originally Posted by northernlights
That's the theory. But there have been numerous accidents and mishaps with self-driving cars as well, so they are not perfect. I have seen videos of self-driving cars running red lights and hitting other cars and objects on the road. Whether they can be improved and adopted on a large scale to be safer than human drivers at reasonable cost remains to be seen.

There are tremendous challenges to overcome. Self-driving technology will make cars much more complex than they already are, many more things on the car that can break down. What are the costs of maintenance and repairs to fix such complex systems when they break down? I don't even want to think about it.
I have seen videos with a room full of mini autonomous-robots all navigating around each other and not a collision to be had.

Even the numerous accidents and mishaps with self-driving cars that you cite occurred during the test phase of a new technology. NASA had many failures and explosions before they got it right. And even so, every mission is still dangerous.

You have heard that the most dangerous leg of a flight is the drive home. And yes, we are all aware of air disasters, but people still fly.

So when self-driving cars meet the high standards that everyone on this thread demands, how do you think they would compare against the daily collisions and fatalities that human-driven vehicles cause today? And, I may add, human-driven vehicles are not new technology anymore.

How many drivers today still follow or remember all five principles of driving?

Last edited by Daniel4; 05-25-17 at 11:38 AM.
Daniel4 is offline  
Old 05-25-17, 11:31 AM
  #95  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Liked 4,532 Times in 3,158 Posts
Originally Posted by northernlights
That's the theory. But there have been numerous accidents and mishaps with self-driving cars as well, so they are not perfect. I have seen videos of self-driving cars running red lights and hitting other cars and objects on the road. Whether they can be improved and adopted on a large scale to be safer than human drivers at reasonable cost remains to be seen.

There are tremendous challenges to overcome. Self-driving technology will make cars much more complex than they already are, many more things on the car that can break down. What are the costs of maintenance and repairs to fix such complex systems when they break down? I don't even want to think about it.
I believe you'll find that the "numerous accidents" with "self-driving cars" happened either when humans were actually driving, or when the vehicle itself was not actually a true self-driving vehicle... as in the case of the Tesla that hit the truck: that was not a full self-driving mode, and the instruction manual said it was only a driver-assist mode.

Most of the collisions have been either due to humans driving the cars in non self driving mode or the cars were hit by drivers of other vehicles who themselves were in violation of the law.

The actual collision record of self driving cars, in self driving mode is quite good... vastly better than human drivers with that many miles.

"We just got rear-ended again yesterday while stopped at a stoplight in Mountain View. That's two incidents just in the last week where a driver rear-ended us while we were completely stopped at a light! So that brings the tally to 13 minor fender-benders in more than 1.8 million miles of autonomous and manual driving — and still, not once was the self-driving car the cause of the accident.”
Google's Self-Driving Cars Are Ridiculously Safe | Big Think

But how safe is it for people to ride in autonomous vehicles? The qualitative answer is “pretty darn safe.” Waymo has logged over two million miles on U.S. streets and has only had fault in one accident, making its cars by far the lowest at-fault rate of any driver class on the road— about 10 times lower than our safest demographic of human drivers (60–69 year-olds) and 40 times lower than new drivers, not to mention the obvious benefits gained from eliminating drunk drivers.
How Safe Are Self-Driving Cars? | HuffPost

However, Waymo’s vehicles have a knack for getting hit by human drivers. When we look at total accidents (at fault and not), the Waymo accident rate is higher than the accident rate of most experienced drivers (Figure 1). Most of these accidents are fender-benders caused by humans, with no fatalities or serious injuries. The leading theory is that Waymo’s vehicles adhere to the letter of traffic law, leading them to brake for things they are legally supposed to brake for (e.g., pedestrians approaching crosswalks). Since human drivers are not used to this lawful behavior, it leads to a higher rate of rear-end collisions (where the human driver is at-fault).
Hence the problem with self-driving cars seems to be the human-driven vehicles that surround them...

And as time goes on, the self driving cars become smarter and smarter about various driving situations... but each new generation of human drivers starts out as vastly inexperienced and just plain dumb about driving situations.

"Can self-driving cars ever really be safe?"
The short answer is "No." Self-driving cars can never really be safe. They will be safer! So much safer that it's worth a few minutes to understand why.

First and foremost, according to the National Highway Traffic Safety Administration (NHTSA), 90% of all traffic accidents can be blamed on human error. Next, according to the AAA Foundation for Traffic Safety, nearly 80% of drivers expressed significant anger, aggression, or road rage behind the wheel at least once in the past year. Alcohol-impaired driving fatalities accounted for 29% of the total vehicle traffic fatalities in 2015. And, finally, of the roughly 35,000 annual traffic fatalities, approximately 10% (3,477 lives in 2015) are caused by distracted driving.
Remove human error from driving, and you will not only save a significant number of lives, you will also dramatically reduce the number of serious injuries associated with traffic accidents.
Can Self-Driving Cars Ever Really Be Safe? | DigitalNext - AdAge
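As a rough back-of-envelope check on the figures quoted above (my own sketch, not anything from the article), the NHTSA human-error share can be combined with the annual fatality count to see how many lives are at stake under different assumptions:

```python
# Back-of-envelope sketch using the figures quoted above:
# ~35,000 annual U.S. traffic fatalities, and NHTSA's estimate that
# ~90% of crashes involve human error. The elimination rates below
# are illustrative assumptions, not predictions.

ANNUAL_FATALITIES = 35_000   # approximate 2015 figure quoted above
HUMAN_ERROR_SHARE = 0.90     # NHTSA estimate quoted above

def fatalities_avoided(elimination_rate: float) -> float:
    """Fatalities avoided if a given fraction of human-error crashes
    were eliminated by automation (simplifying assumption: fatalities
    are distributed like crashes)."""
    return ANNUAL_FATALITIES * HUMAN_ERROR_SHARE * elimination_rate

for rate in (0.5, 0.9, 1.0):
    print(f"eliminating {rate:.0%} of human-error crashes: "
          f"~{fatalities_avoided(rate):,.0f} lives/year")
```

Even at a modest 50% elimination rate, the arithmetic lands in the tens of thousands of lives per year, which is the article's point.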

So, bottom line: yes, Google self-driving cars have been in a few collisions... statistically higher numbers than human drivers for the mileage... but overwhelmingly due to being hit by human drivers who were NOT obeying the laws. Tesla cars have killed two drivers... but none of the Tesla cars are true self-driving cars.

So if you believe self-driving cars are not safer than human-driven cars, please cite data and reports indicating this. And no, self-driving cars are not perfect, and never will be... but then humans are far, far from perfect too.
genec is offline  
Old 05-25-17, 11:34 AM
  #96  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Liked 4,532 Times in 3,158 Posts
Originally Posted by Daniel4
I have seen videos with a room full of mini-robots all navigating around each other and not a collision to be had.

Even the numerous accidents and mishaps with self-driving cars that you cite occurred during the test phase of a new technology. NASA had many failures and explosions before they got it right. And even so, every mission is still dangerous.

So when self-driving cars meet the high standards that everyone on this thread demands, how do you think they would compare against the daily collisions and fatalities that human-driven vehicles cause today? And, I may add, human-driven vehicles are not new technology anymore.

How many drivers today still follow or remember all five principles of driving?
Hell, how many drivers today still use the "2 second rule..."
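For anyone rusty on it, the "2 second rule" just says your following gap should be at least two seconds of travel time at your current speed. A quick sketch of the distances that implies (the speeds are arbitrary examples):

```python
# Illustrative sketch of the "2 second rule": the following distance
# it implies at a few common speeds. 1 mph = 0.44704 m/s exactly.

MPH_TO_MS = 0.44704

def following_distance_m(speed_mph: float, gap_s: float = 2.0) -> float:
    """Distance in meters covered in `gap_s` seconds at `speed_mph`."""
    return speed_mph * MPH_TO_MS * gap_s

for mph in (25, 45, 65):
    print(f"{mph} mph -> {following_distance_m(mph):.0f} m gap")
```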
genec is offline  
Old 05-25-17, 03:30 PM
  #97  
Full Member
 
northernlights's Avatar
 
Join Date: Sep 2012
Posts: 398
Likes: 0
Liked 0 Times in 0 Posts
Originally Posted by genec
I believe you'll find that the "numerous accidents" with "self-driving cars" happened either when humans were actually driving, or when the vehicle itself was not actually a true self-driving vehicle... as in the case of the Tesla that hit the truck: that was not a full self-driving mode, and the instruction manual said it was only a driver-assist mode.
The Google cars have a backup human driver ready to take control of the car when something goes wrong. That is not a true self-driving vehicle. How many self-driving car accidents were averted when the human driver took over? And does Google record and count these incidents in its stats? If not, then the stats would be quite misleading.

If driverless cars require humans to constantly babysit them in this manner, they are not any safer than human drivers and are not really driverless.
northernlights is offline  
Old 05-25-17, 04:04 PM
  #98  
genec
 
genec's Avatar
 
Join Date: Sep 2004
Location: West Coast
Posts: 27,079

Bikes: custom built, sannino, beachbike, giant trance x2

Liked 4,532 Times in 3,158 Posts
Originally Posted by northernlights
The Google cars have a backup human driver ready to take control of the car when something goes wrong. That is not a true self-driving vehicle. How many self-driving car accidents were averted when the human driver took over? And does Google record and count these incidents in its stats? If not, then the stats would be quite misleading.

If driverless cars require humans to constantly babysit them in this manner, they are not any safer than human drivers and are not really driverless.
The driverless cars that are on the road today are first and foremost prototypes... You cannot buy or lease one yourself... So yes, there is a tech who rides along... The technology is NOT there today to let someone just jump in and take off.

Second, in every state where self-driving cars are legal, there is a law that requires someone be able to grab the wheel at any time. So no, you cannot actually have a self-driving car take you somewhere on its own.

When the automobile was first introduced to the public, it was required that someone walk in front of it and warn the public that a "horseless carriage" was approaching... we have pretty much gotten past that law at this point, and I suspect that in the next 10 years or so, the laws requiring a driverless car to have a babysitter will also be overturned.

Meanwhile the cars (with a tech aboard) ARE driving around in several test cities... and they do not cause collisions... (I believe one actual self-driving car did collide with a bus in a dispute over lane use.) Meanwhile, human-controlled vehicles kill about 35,000-40,000 people a year.
genec is offline  
Old 05-25-17, 04:57 PM
  #99  
Senior Member
 
Join Date: Jul 2005
Posts: 6,053
Liked 771 Times in 539 Posts
The area in which self-driving cars are not ready for prime time concerns the fuzzy logic of interpretation of ambiguities that human consciousness remains FAR superior at converting into a logical sequence of actions. The mechanics of piloting a driverless vehicle, maneuvering it around obstacles, reacting appropriately to developments are SOLVED. The Tesla car was fully self-driving. They claim it wasn't to protect their assets. Had they used a patented LIDAR array in the sensor suite, the vehicle would have been well able to discern the white truck in the low contrast situation. They have learned a valuable lesson. Cheap out at your peril! LIDAR will be mandatory on all driverless vehicles going forward, and they are now better than 85% of human drivers, and can only improve from there.

But they are terrible at interacting with the homeless guy at the end of the highway overpass and getting meaningful directions when what is in the navquest database fails to match up with reality. There is no earthly way a human driver can intercede in an accident situation in any meaningful way. Accidents happen in microseconds. The best human reaction times are on the order of 0.5 seconds (500 milliseconds), an eternity to an AI, even a primitive one. Laughable to think a human has any advantage where speed of execution of an evasive action is called for. What hubris.
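To put the reaction-time point in concrete terms, here is a small sketch of the distance a car covers before braking even begins. The 0.5 s human figure is from the paragraph above; the 50 ms machine latency is purely an assumed number for illustration, not a measured spec of any vehicle:

```python
# Distance traveled during the reaction delay, before any braking.
# Human ~0.5 s (figure from the post); machine latency of 0.05 s is
# an assumption for illustration only.

MPH_TO_MS = 0.44704  # 1 mph in m/s

def reaction_distance_m(speed_mph: float, reaction_s: float) -> float:
    """Meters traveled at constant speed during the reaction delay."""
    return speed_mph * MPH_TO_MS * reaction_s

speed = 40  # mph, an arbitrary example speed
human = reaction_distance_m(speed, 0.5)     # ~0.5 s, per the post
machine = reaction_distance_m(speed, 0.05)  # assumed 50 ms latency
print(f"at {speed} mph: human ~{human:.1f} m, machine ~{machine:.1f} m")
```

Under those assumptions the human travels roughly ten times farther before reacting, which is the gap the post is describing.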

Navy pilots are forbidden to put their $5M aircraft down on carrier decks themselves. As the fighter comes in on final approach, the pilot must show both hands in the air to the forward observer in the control tower. They are disciplined if they fail to comply. If the US Navy trusts autonomous systems to land multi-million-dollar fighter aircraft in rolling seas on tiny aircraft carrier landing strips, I think we can stop agonizing about the potential for autonomous cars to deal with stop-and-go traffic!
Leisesturm is offline  
Old 05-25-17, 05:32 PM
  #100  
Senior Member
 
Join Date: Sep 2013
Location: Massachusetts
Posts: 4,530
Liked 663 Times in 443 Posts
Oopsies.... Perhaps Marketeering 101 is needed.

-mr. bill
mr_bill is offline  

