Put a breathalyzer in the self-driving car!
> I think the best we can come up with is a sober driver.
This might be wishful thinking on my part, but my hope is that, quite the opposite, the regulations will require a competent and sober driver to be in a position to take over the controls at any time.
Amen. Here in Vermont, I've figured an average of about a 5-second delay between one car setting off and the next, which results in typically only 5 or 6 cars making it through a light in heavy traffic because of the way the lights are timed (rough numbers sketched below).
> I see some pluses to autonomous cars. Let's say we're sitting at a stop light in a line of 10-20 cars and the light turns green. Today, the first car goes, then the second begins to move, then the third and so on. This generally makes for large gaps in the line of traffic. With all of the vehicles being autonomous, each car could begin moving sooner and accelerate at the same rate as the car in front of it, keeping the traffic moving much better.
> I do figure some of you already try to do that, but most folks don't work that way.
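For what it's worth, here's a rough back-of-the-envelope sketch of that effect. Every number in it is an assumption (a 30-second green, a 20-car queue, roughly 5 seconds between successive human starts versus about 1 second for coordinated autonomous cars), so treat it as an illustration rather than a measurement:

```python
# Toy model: how many queued cars clear one green light under two start patterns.
# All constants are assumptions for illustration, not measured values.

GREEN_SECONDS = 30.0     # assumed length of the green phase
QUEUE_LENGTH = 20        # cars waiting at the light
CLEAR_TIME = 2.0         # assumed seconds for a moving car to cross the stop line

def cars_through(start_gap):
    """Count cars that start moving and clear the intersection before the light changes."""
    cleared = 0
    for position in range(QUEUE_LENGTH):
        start_time = position * start_gap          # when this car begins to move
        if start_time + CLEAR_TIME <= GREEN_SECONDS:
            cleared += 1
    return cleared

print("Human-style starts (5 s gaps):", cars_through(5.0))   # about 6 cars
print("Coordinated starts (1 s gaps):", cars_through(1.0))   # the whole 20-car queue
```

Even with those made-up numbers, the gap between roughly 6 cars per cycle and the whole queue per cycle is the point being made above.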
“You have the car”. “I have the car”. “You have the car”.
> This might be wishful thinking on my part, but my hope is that, quite the opposite, the regulations will require a competent and sober driver to be in a position to take over the controls at any time.
Also known as ‘coercion’.
> I suspect a lot of the pressure to switch will be economic.
> For example, pay $108/yr for liability insurance for an autonomous vehicle, vs $978 for the privilege of driving oneself.
That may be, but I don't think the technology is anywhere near mature enough at this point to completely automate driving in a way that is even reasonably safe, let alone one that will satisfy the public's desire to be absolutely 100% safe (except when they themselves are at the controls, of course), and I don't see that happening even during my lifetime. Others have already pointed out the potential problems, from systems failures to hacking to who is held responsible for the way a machine resolves what to us humans is a moral dilemma. At least I hope this technology isn't rushed into the mainstream before it's truly ready.
> That's not at all the direction that the technology is taking at this point, and I think it is also an unrealistic expectation. Maybe it'll change, but I doubt it.
More like A/P disconnect button.
> “You have the car”. “I have the car”. “You have the car”.
Would a computer be able to fly to the Azores when it runs out of gas?
> More like A/P disconnect button.
In some ways, the computer MIGHT make a better decision.
> There are too many variables for a machine to assess in the time it would have to do so.
I put a car in a ditch about 20 years ago to avoid killing two kids on a sled. I did a quick assessment of the situation and decided, in a gloriously analog way, that the snow that had been plowed into the ditch would absorb most of the force; so although I might total the car, I'd almost certainly walk away from it. How does one program a computer to make those kinds of decisions that don't boil down to ones and zeroes?
Rich
Or how about the Volkswagen-sized tumbleweeds that come out of nowhere and blow across the many rural roads around here? How does a driverless car account for something that you normally just plow into, exploding it into a pile of stems? A driverless car would have your head through the windshield every 2 minutes trying to decide if what's coming is friend or foe. I won't even get into all the mud, grit, and grime that would foul up all the sensors on one of those cars after a good rain/snow storm.
> How does the computer account for driving past brush that might hit the side of the truck?
The computer won't be programmed with that knowledge. It'll learn by itself about that scenario. Just like you did.
Methinks the self-driving technology is best left to the tractors and combines that have made so many farmers lazier and able to text and watch TV while plowing or harvesting their fields.
I notice they're not playing in Denver. Most of them are doing their testing in Phoenix.
> I think snow will give them fits, as well.
Not to mention, New England potholes...
> I think snow will give them fits, as well.
Rich
But I want to be able to still drive my vehicle and not be forced to let the technology do it for me. If the execution is like traction control (I can turn it off) then I'm mostly ok with it.
> Today, the first car goes, then the second begins to move, then the third and so on.
Suppose that's the case. How long until the first lawsuit where it's pointed out that the defendant *deliberately* disengaged the autonomous features, knowing full well that the accident rate with him behind the wheel was 10x that of when he let the car drive itself? Would that be reckless endangerment?
Autonomous cars are pretty exciting tech, but I find the potential implications troubling.
It's not one size fits all. Just because an AI scheme is trustworthy in one application does not mean it is even viable for another. Similarly, just because one scheme validated to one standard has issues does not mean all schemes have similar issues.
> The point of my post is that maybe we should wait for AI to prove itself in non-safety-critical applications before we trust people's lives to it.
I was privy to a presentation by IIHS (Insurance Institute for Highway Safety) not too long ago on the subject of autonomous vehicles. I was prepared to be bored stiff for 3 hours but instead was captivated. Some of the things mentioned:
1. It was a challenge to get the government on board with driverless vehicles. Now they're the biggest proponent.
2. It does/will save lives, and that's already happened thousands of times. Everyone is keenly aware that there will be accidents that may have been prevented if a human was behind the wheel, but those are greatly outweighed by the number of accidents prevented by a person not being in control. The computer can predict and react far faster than a person (a quick worked example follows below).
3. As someone else said, there will be a time when we're older and less capable. This is an ideal solution.
4. Right now my wife, daughter and I each have our own vehicles. With autonomous vehicles we would really only need two. When one of us is not currently using a vehicle and another wants it we could just summon it.
5. This is a problem for the insurance, auto and collision repair industries that many companies aren't preparing for. And the changes are coming much faster than most are predicting. Right now insurers are insuring a lot of vehicles, collecting a lot of $ in premiums, and paying for a lot of repairs. In the very near future we'll be building, insuring and repairing WAY fewer vehicles. Currently 70% of the premium dollars collected by P&C insurers come from auto policies. When that number drops substantially it's going to have major effects.
I'm all for it and I'm sure there will be a time I'll own one. But don't get me wrong. I LIKE driving. And I like driving fast. I want the ability to occasionally zip through traffic, push the envelope on yellow lights, and do other stupid things behind the wheel.
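On item 2, the "reacts faster" claim is easy to put rough numbers on. The reaction times below are illustrative assumptions (about 1.5 s for an attentive human, about 0.1 s for an automated system), not figures from the IIHS presentation:

```python
# Distance covered during the reaction delay, before braking even begins.
# Reaction times here are illustrative assumptions, not measured or quoted figures.

SPEED_MPH = 60
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_fps = SPEED_MPH * FEET_PER_MILE / SECONDS_PER_HOUR   # 88 ft/s at 60 mph

for driver, reaction_seconds in [("attentive human", 1.5), ("automated system", 0.1)]:
    distance_ft = speed_fps * reaction_seconds
    print(f"{driver}: {reaction_seconds} s of reaction time = {distance_ft:.0f} ft traveled")
```

At 60 mph that works out to roughly 130 feet of "dead" distance for the human versus under 10 feet for the machine, which is several car lengths of margin on every event.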
With no one on the crossing road...
> I think you mean: The left turn arrow turns green. The first car finally looks up from their cellphone and goes, the second car stares at the back of the first car until they are all the way through the intersection and half a block down the next street, then slowly creeps through the intersection as the light turns yellow. I finally make it to the front with a red light and am stuck along with 5 other cars for another cycle.
I hope they learn fast. For the past seven years (that I know of -- it could be longer), Google Maps has been routing traffic over this lovely "road" as one leg of the best route to my home from points southeast:
Garmin and TomTom used to, as well; but they had the good sense to make some sort of notation in their basemaps to avoid this "road" after enough people complained about it. So did Open Street Maps. All it took was those two pictures, and they made the change. Google, not so much. As recently as this past weekend, Google was still routing traffic over it.
Rich
1. If the government is "for" something, that tells me to give it the hairy eyeball. Note: it's not about "safety" any more than speed limits are.
Having been employed by local gov't during a cutover to electronic voting, I can tell you the average village idiot could defeat the security measures as implemented and monitored. Gov't at just about every level really isn't competent to handle important data safely, with a very few exceptions. There just aren't any real consequences to screwing the pooch. I think OPM or VA fired somebody or other, but basically, no consequences beyond those scapegoats.
> That's actually how I felt when the push was on to replace the old "lever-type" voting machines with electronic voting; and my feelings are even stronger today, when FedGov is pushing to replace the new electronic voting machines with even newer electronic voting machines in the wake of the Russian scanning of voting machines in 21 states in the 2016 elections.
The old voting machines were non-hackable because they had no network connections. Many didn't even require electricity except for the overhead lights. They created a paper trail to make recounts easier and help ensure integrity. On the extremely rare occasions when they malfunctioned, the votes already cast were not affected. I saw no reason to replace them, and the government's push to do so made me suspicious. Why replace a tried-and-true machine that could not be hacked with an untested one that could?
Now we know that the machines in 21 states were "scanned" by the Russians; and it's possible (although not proven) that votes could have been changed. That would have been impossible with the old machines. But we can't switch back to them because not only have they been retired, but they've been thrown overboard in the Atlantic to create artificial reefs.
Why? What was the hurry to get rid of them other than to force municipalities to use the new, network-capable (and hackable) machines?
To me, common sense says that if you believe that voting machines were hacked over a network, the easiest solution is to get them off the network. There's absolutely no reason why they need to be networked. We managed to vote for well over 200 years without that technology. So why does FedGov insist on it today and want to impose it on all the states? It makes no sense. I can't think of a non-nefarious reason to prefer a hackable system over a non-hackable one.
So when I hear that the government prefers self-driving cars, that just adds to all the other reasons I don't like the idea. I can think of a lot more bad reasons why the government would like them than good ones.
Rich
Let’s say a trip from Seattle to San Francisco. Fill up one of your cars with luggage and send it down the night before. The next day, the second car drops you off at the airport and returns home afterward. You fly down with the Cirrus and meet the car with the luggage at the airport there and use it. A week later you reverse the process.
I think self-driving will save lives and improve traffic. It'll just save a lot of the wrong lives. Hopefully, they'll still require the human to have a driver's license, and hold the "PIC" accountable for failing to intervene when the AI effs up.
I'll be really surprised if the accident rate for self-driving cars is anything like ten times better than human-driven ones during my lifetime.
> Suppose that's the case. How long until the first lawsuit where it's pointed out that the defendant *deliberately* disengaged the autonomous features, knowing full well that the accident rate with him behind the wheel was 10x that of when he let the car drive itself? Would that be reckless endangerment?
> Autonomous cars are pretty exciting tech, but I find the potential implications troubling.
Where I drive, most people don't leave gaps that big when a line of cars starts up.
> I see some pluses to autonomous cars. Let's say we're sitting at a stop light in a line of 10-20 cars and the light turns green. Today, the first car goes, then the second begins to move, then the third and so on. This generally makes for large gaps in the line of traffic. With all of the vehicles being autonomous, each car could begin moving sooner and accelerate at the same rate as the car in front of it, keeping the traffic moving much better.
> I do figure some of you already try to do that, but most folks don't work that way.
I find myself wondering how long it will take for self-driving cars to learn the necessary skills. Will we see vehicles with "student self-driving car" signs on them, and will there be a "driving test" to determine when the car has learned enough to drive safely on its own?
> The computer won't be programmed with that knowledge. It'll learn by itself about that scenario. Just like you did.
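That "driving test" idea maps pretty directly onto how learned systems are usually evaluated: hold back scenarios the system never trained on and only sign it off once it handles enough of them. Here's a deliberately toy sketch of that gate. The scenario data, the single "hardness" feature, and the 95% passing score are all made up for illustration; a real self-driving stack is vastly more complicated than a one-number threshold:

```python
import random

# Toy "scenarios": one made-up feature (hardness, 0..1) and a made-up ground truth
# for whether the obstacle is actually dangerous (think tumbleweed vs. deer).
random.seed(0)

def make_scenario():
    hardness = random.random()
    dangerous = hardness > 0.6          # invented ground-truth rule
    return hardness, dangerous

training_runs = [make_scenario() for _ in range(200)]   # what the car "experiences"
driving_test = [make_scenario() for _ in range(100)]    # scenarios it has never seen

def accuracy(threshold, scenarios):
    """Fraction of scenarios where 'brake if hardness > threshold' matches ground truth."""
    return sum((h > threshold) == danger for h, danger in scenarios) / len(scenarios)

# "Learning": pick the brake threshold that best fits the training experience.
learned_threshold = max((t / 100 for t in range(101)),
                        key=lambda t: accuracy(t, training_runs))

score = accuracy(learned_threshold, driving_test)
print(f"learned threshold: {learned_threshold:.2f}, driving-test score: {score:.0%}")
print("licensed" if score >= 0.95 else "back to driving school")
```

In this analogy, the "student self-driving car" phase is just collecting more training runs, and the sign comes off once the held-out score clears whatever bar a regulator might set.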
So you'd be okay with autonomous hearses?