> I think you're overlooking the cost of installing all that tech in remote areas. Or are we all going to be confined to cities and flatland?

When did I say it would be instant?
> When did I say it would be instant?

When did I say that you said it would be instant?
> You don't even need stop signs anymore, the cars can negotiate among themselves how to get past each other or wait for a blockage, etc. A lot of other things beyond what I've pointed out, but our level of tech is way beyond the capabilities needed to make that kind of system work. But it's got to be a closed system.

...until the system is hacked and there are simultaneous mass casualties nationwide...
> The biggest problem I see is decision making... the autonomous car is going to hit either the oncoming car approaching at a high closing rate, or the kid crossing the street, with no other option... which does the computer pick?

That's not how a computer works. It has no morals. The autonomous car will attempt to avoid hitting everything.
> That's not how a computer works. It has no morals. The autonomous car will attempt to avoid hitting everything.

The autonomous car has no morals, but makes choices based on weighted risk / harm outcome rankings that are decided upon when creating the algorithms.
If you're ever in Nashville, look me up. We'll go for an autonomous drive.
> The autonomous car has no morals, but makes choices based on weighted risk / harm outcome rankings that are decided upon when creating the algorithms.

Sounds like pre-programmed decision trees rather than actual intelligence.
Those risk rankings are generated by real people: engineers, programmers, and lawyers.
One example of predetermined risk weighting is shown in the image below. The algorithm decides how much separation to keep between the vehicle (and its owner) and the heavy truck, versus how much added risk of harm to accept for the cyclist.
Any extra buffer the vehicle is allocated to mitigate uncertainty in the truck's path increases the injury risk to the cyclist, and vice versa.
The "computer" will face cases where a collision cannot be avoided, due to rapid movement by either party or by a third one, and by default it will be making those pre-weighted choices.
I find this to be fascinating, as people in another room far, far away are pre-determining the final destiny of others by simply adjusting a risk weight factor.
View attachment 127186
The modern version of the old “Trolley Problem”.
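The weighted-choice idea described above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not any manufacturer's actual planner: the maneuver names, probabilities, and harm weights are all invented for the example. The point is only that someone chooses the weights up front, and the "decision" falls out of arithmetic.

```python
# Hypothetical sketch of "pre-weighted choices": the planner scores each
# feasible maneuver by summing, over every party it might affect,
# (probability of collision) x (harm weight). All names and numbers below
# are illustrative, not taken from any real system.

from dataclasses import dataclass

@dataclass
class Outcome:
    party: str          # who bears the risk ("occupant", "cyclist", ...)
    p_collision: float  # estimated probability of a collision, 0..1
    harm_weight: float  # severity weighting chosen by the designers

def expected_harm(outcomes):
    """Total weighted risk score for one candidate maneuver."""
    return sum(o.p_collision * o.harm_weight for o in outcomes)

def choose_maneuver(candidates):
    """Pick the (name, outcomes) candidate with the lowest weighted harm."""
    return min(candidates, key=lambda kv: expected_harm(kv[1]))[0]

# Two candidate paths through the truck/cyclist scene in the image:
candidates = [
    ("hug_truck",   [Outcome("occupant", 0.10, 0.8), Outcome("cyclist", 0.01, 1.0)]),
    ("hug_cyclist", [Outcome("occupant", 0.01, 0.8), Outcome("cyclist", 0.08, 1.0)]),
]
print(choose_maneuver(candidates))  # whichever path scores lower total harm
```

Notice that nudging a single harm weight can flip which party the car moves toward, which is exactly the "people in another room adjusting a risk weight factor" point.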
> Those risk rankings are generated by real people: engineers, programmers, and lawyers.

What about when the programming is machine learning, not hand-coded, like Tesla's FSD v12?
> Is there enough data yet to show that the safety record of autonomous vehicles equals or exceeds that of manual drivers in all of the locations where vehicles are used?

Is there any evidence that they're worse? Human drivers are really bad.
> Is there any evidence that they're worse? Human drivers are really bad.

Tolerance is lower for a machine killing a human.
> Tolerance is lower for a machine killing a human.

Apparently. Human drivers kill tens of thousands of people a year, and everyone considers that normal. But until Tesla solves unsolvable philosophical riddles, nobody thinks we're ready for self-driving cars.
> What about when the programming is machine learning, not hand-coded, like Tesla's FSD v12?

It's still there, but in a different way.
> Many of those needed to be sorted into "good" and "bad" behavior to set boundaries, as well as many "never do this" regarding traffic laws, etc.

Sure, but none of those videos showed how to handle the hypothetical binary ethical decisions like whether to hit the kid or the car.
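The "it's still there, but in a different way" point can be sketched as a learned policy wrapped in hand-written boundaries: the model ranks actions, and a rule layer vetoes anything on a "never do this" list. This is purely an illustrative guess at the kind of architecture being described, not Tesla's actual design; every name and rule here is invented.

```python
# Hypothetical sketch: a learned policy proposes scored actions, and
# hand-written hard rules ("never do this") veto forbidden ones.
# All action names, scores, and rules are invented for illustration.

HARD_RULES = {
    "run_red_light": False,       # never allowed, regardless of score
    "cross_double_yellow": False,
}

def legal(action):
    """Hand-coded boundary: reject actions the rules forbid outright."""
    return HARD_RULES.get(action, True)

def pick_action(scored_actions):
    """scored_actions: list of (action, learned_score), higher is better.
    Take the best-scoring action that passes every hard rule."""
    for action, _score in sorted(scored_actions, key=lambda a: -a[1]):
        if legal(action):
            return action
    return "stop"  # safe fallback if everything is vetoed

# Even if the learned model scores the illegal action highest,
# the rule layer overrides it:
print(pick_action([("run_red_light", 0.9), ("wait_at_light", 0.7)]))
# → wait_at_light
```

The human judgment hasn't disappeared; it has moved from writing decision trees to labeling training data and writing the veto list.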
> Is there any evidence that they're worse? Human drivers are really bad.

I don't know. That's why I'm asking if anyone has seen comparative data.
> I was in San Francisco this week and one broke down in front of the house…lit itself up with flashers and its own pedestrian disco ball on the roof…blocked traffic for an hour and a half until a technician showed up.

I wonder how long it would have blocked traffic if that had happened on a snowy road on the way to a ski area.
> Do they know to stop, if there is a fender bender, to share insurance info? How would that sharing take place?

On a screen? Audibly? Via calling the 800 number on the outside of the vehicle?
> There are lots of accidents where the driver drove poorly, and goes to jail. Who goes to jail when one is involved in, e.g., vehicular manslaughter?

They can't drive drunk and they can't consciously ignore an unreasonable risk of causing someone's death, so it's unclear how one could commit manslaughter. Perhaps you could argue some negligence or recklessness by the company that made them, though.
> I like discussions like this because they're often good thought experiments. But identifying difficult corner cases doesn't mean an idea is unviable even if 100% of the problems can't be solved.

I just get grumpy when people talk about banning human drivers from the roads without thinking it through.
> I just get grumpy when people talk about banning human drivers from the roads without thinking it through.

It's entirely possible that there will be some roads in the future that are off limits to human drivers, like HOV lanes are restricted now. Autonomous vehicles that are talking to each other can drive much more efficiently than human-operated vehicles in certain circumstances. But it won't be everywhere. And not all restaurants will be Taco Bell.
> And not all restaurants will be Taco Bell.

I thought they all become Carl's Juniors..
> I thought they all become Carl's Juniors..

That certainly seems to be more the path we're on.
> In a few years driverless cars will be old news and you can go across town in a pilotless drone ...

Not
Ehang 216-S
Certification info
> Do they know to stop, if there is a fender bender, to share insurance info? How would that sharing take place? There are lots of accidents where the driver drove poorly, and goes to jail. Who goes to jail when one is involved in, e.g., vehicular manslaughter?

I think this is the best question in the thread. It addresses an issue that's been around for a long time: the software industry is almost immune from meaningful regulation or legislation. Well, I'll correct that. We DO have regulations protecting software companies, but not many the other way around.
> I like discussions like this because they're often good thought experiments. But identifying difficult corner cases doesn't mean an idea is unviable even if 100% of the problems can't be solved.

But it is ok to think about, and talk about, potential problems (even unlikely ones) and their solutions, yes?
Waymo knows where stop signs are and, from what I have observed, comes to a full stop at each one, unlike human drivers. They also know the speed limit. I can see how some unusual situations would confuse them, but stop signs, stop lights, and speed limits are predictable. I've seen them around here for 4-5 years; first in the testing phase with safety drivers; then alone; and now carrying paying passengers. I haven't had the occasion to take one, but I have the app and might someday. I am actually more cautious of them as a pedestrian and bicyclist than I think I would be as a passenger.
But things like emergency vehicles, pedestrians, objects in the road, blocked lanes, etc., aren't.
View attachment 127114
Thanks, that's a lot more than I knew when I got up this morning..!!
We pick on Boeing for not being able to build an airplane that doesn't have parts flying off of it, yet every major software vendor I'm aware of is in a continual cycle of releasing products with serious flaws. The problem is so pervasive that regulations are put in place in every industry that deals with confidential data requiring that software be kept "current". Meaning that entire industries are built around profiting from the continued release of bad products.
> But it is ok to think about, and talk about, potential problems (even unlikely ones) and their solutions, yes?

As I said in your quote. It does get comical, though, when the peanut gallery assumes the designers and engineers who have been working with these things on actual roads for years haven't thought of obvious things like ice or emergency vehicles.
> Welcome to the club.

I think many of us who live west of the Continental Divide know that roads that driverless cars would have trouble with are less of a "corner case" than some would have us believe.
> As I said in your quote. It does get comical, though, when the peanut gallery assumes the designers and engineers who have been working with these things on actual roads for years haven't thought of obvious things like ice or emergency vehicles.

I don't doubt that they have thought of those things. However, I would like to know how far along they are in testing driverless cars on roads and in conditions that are not uncommon from the Rockies west.
> I think many of us who live west of the Continental Divide know that roads that driverless cars would have trouble with are less of a "corner case" than some would have us believe.

Watch some videos from the https://www.youtube.com/@WholeMars YouTube channel. He shows driving around California in his Model S with FSD v12.
> I think many of us who live west of the Continental Divide know that roads that driverless cars would have trouble with are less of a "corner case" than some would have us believe.

I think driving around the congested parts of San Francisco would be more difficult than driving on a mountain road (I have done both). Many more unpredictable events in the City: double-parked vehicles and clueless or aggressive pedestrians, bicyclists, motorcyclists (lane splitting is legal here), and drivers, to name a few. Waymo seems to have done OK. Not perfect, but OK. Human drivers are far from perfect.
> So, if a driverless car runs a stop sign, who gets the ticket??

In Turkey (while I was stationed there), if you hire a cab and it gets into an accident, the passenger pays for the damages and compensates the victim for any loss or injury. If you kill someone, the passenger goes to jail.
> In Turkey (while I was stationed there), if you hire a cab and it gets into an accident, the passenger pays for the damages and compensates the victim for any loss or injury. If you kill someone, the passenger goes to jail.

I was there in the 80s and was surprised at the four-door Chevy cabs from the 50s and 60s that were there:
The reason: The cab wouldn't be there if you hadn't hired it, so it's your fault.
I was a witness to this while I was there.