I think that expression is more often applied to Grand Juries.
I always thought it meant the DA can get the Grand Jury to do what the DA wants, to wit, return an indictment against a ham sandwich.
Cheers
They must have backed up a dump truck full of cash and said, "Sign here."
Or making its own. A big help to learning, but also a big vulnerability to malicious intrusion. Imagine your entire network learning some very, very bad habits....
More family members of a woman killed by an Uber self-driving vehicle have hired legal counsel,
All hoping to become independently wealthy....cha CHING..!!!
Perhaps but it sounds like it’s really just a split family thing. Divorce and two households and separate lawyers. One lawyer waited until the other was done.
I can only imagine how the safety driver in this accident car is feeling now.
Myself, if one of my immediate family members were killed I would probably not be satisfied until I see a bill signed into law banning driverless vehicles from public roads.
The emotions are certainly understandable, but isn't this the kind of attitude that makes general aviation increasingly expensive, and under attack?
It's pretty much an attitude that would stifle all new development. The car, the plane, the boat, the train, the bicycle, electricity, natural gas in the home, <insert most any modern technology> have all killed people. Of course, so did rocks and sticks before that, so I guess we're all effed...
Think of captchas. Like the deal when you log on to some site and it asks you to click on every street sign in an image, or every storefront, or something like that. Captchas like this work because they are very easy for humans and very difficult for computers. Driving down a busy street is like one big captcha.
A Tesla hit a barricade the other day. Driver had engaged autopilot and took his hands off the wheel.
https://www.digitaltrends.com/cars/tesla-autopilot-fatal-crash-warnings-ignored/
In this accident, initial information suggests that the vehicle failed to detect the obstacle (person and bicycle). That doesn't seem like a "learning" problem to me but an issue with the sensor system. A computer can't learn to avoid something that it cannot "see".
There's another Tesla in the news today at that same accident location:
https://jalopnik.com/video-appears-to-show-tesla-autopilot-veering-toward-di-1825016336
Apparently there are lane markers that under certain sun angles fool the guidance system.
I think they should start using a different term, then, for their "driver assistance system". In my airplane, I can comfortably take my hands off the controls for long stretches of time and let the autopilot fly the plane. Tesla is calling their system an autopilot, but it really can't be trusted (no surprise there) to do the job of a real autopilot.
If they really want to have the driver paying attention at all times, they could have attention-detection systems or hands-on-wheel sensors to enforce that. This is just their mealy-mouthed way of trying to evade responsibility for putting a system in a car that is sold as being able to self-drive and is not up to the job.
Everywhere in the vehicle it's just called AutoSteer. AutoSteer (Beta) actually. AutoSteer (Beta) + TACC to be exact.
I've never seen the vehicle refer to the system as AutoPilot. That's a marketing term.
It DOES have hands-on-wheel sensors. In fact, I have to grip the wheel tighter on AutoSteer than I grip it if I'm just steering myself. If you don't do that it disengages. If it disengages 3 times, you have to park the vehicle before you can engage it again.
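The three-strikes-then-park behavior described above amounts to a small state machine. A minimal sketch of what that logic might look like (assumed behavior based on this description, not Tesla's actual code; all names are hypothetical):

```python
# Hypothetical sketch of the AutoSteer lockout behavior described above:
# repeated hands-off disengagements eventually lock the feature out
# until the car is parked.

class AutoSteerLockout:
    MAX_STRIKES = 3  # disengagements allowed before lockout

    def __init__(self):
        self.strikes = 0
        self.engaged = False
        self.locked_out = False

    def engage(self) -> bool:
        """Try to turn AutoSteer on; refused while locked out."""
        if self.locked_out:
            return False
        self.engaged = True
        return True

    def hands_off_timeout(self):
        """Called when wheel torque isn't detected in time: disengage
        and count a strike; three strikes triggers the lockout."""
        if self.engaged:
            self.engaged = False
            self.strikes += 1
            if self.strikes >= self.MAX_STRIKES:
                self.locked_out = True

    def park(self):
        """Shifting into Park clears the strikes and the lockout."""
        self.strikes = 0
        self.locked_out = False
```

Under this sketch, the fourth engage attempt after three hands-off timeouts fails until `park()` is called, matching the behavior described.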
But there are many autopilot systems in GA planes that will happily fly the aircraft into the side of a mountain without human intervention.
Well, in their quote, they called it Autopilot. And they call it Autopilot on their web site. Their website also says the car is capable of self driving.
Not disputing what you say about having to have hands on wheel, but I wonder why the deaths in that case? Clearly something is wrong with the system and/or with the training of the drivers.
The logs show that the driver removed his hands from the wheel for 6 seconds. (He received warnings.) He was likely looking down at his phone and ignored the flashing screens.
The Tesla yesterday was in the same lane. The vehicle followed the lane markings which began aiming it directly at the same barrier. This driver took over. There were no warnings about lane departure.
The accident driver may have been getting warnings about driving hands-free, but might not have gotten any warning of an impending collision.
Huh. I know Tesla drivers who claim their cars basically drive them to work. Not true?
That's like alien technology compared to what I fly, but it looks like you can punch in a GPS route and that autopilot will follow it, even flying an approach. If that's what the Tesla does, that's fairly autonomous.

As true as it would be to claim: "My GFC 500 basically flew me from LA to Vegas this morning".
If you want something done right, trust it to a 2500 lb, 60 mph Roomba.

A drivered car managed to sideswipe me yesterday. The driver did a poor job of it and only managed some nearly invisible scratches on some decorative plastic bits. I didn't have time to dodge the lane change swerve by the other car so I didn't materially contribute to the poor quality of the 'crash'.
Question: would a driverless car have done a better job of hitting my vehicle and actually removed paint and/or bent metal?
According to two anonymous sources who talked to Efrati, Uber's sensors did, in fact, detect Herzberg as she crossed the street with her bicycle. Unfortunately, the software classified her as a "false positive" and decided it didn't need to stop for her.
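A "false positive" filter like the one reported there is typically a set of confidence and tracking thresholds sitting between perception and planning. A minimal illustrative sketch (purely hypothetical logic and names, not Uber's actual software) of how an over-aggressive filter can suppress a real obstacle:

```python
# Illustrative sketch: a detection filter that only brakes for objects
# the classifier is confident about and has tracked for several frames.
# A real pedestrian whose label keeps flip-flopping (bicycle? unknown?
# person?) can end up below both thresholds and be discarded as noise.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str           # classifier's current best guess
    confidence: float    # 0.0 - 1.0
    frames_tracked: int  # consecutive frames with a stable track

def should_brake(d: Detection,
                 min_confidence: float = 0.8,
                 min_frames: int = 5) -> bool:
    """Brake only for detections the filter considers 'real'."""
    return d.confidence >= min_confidence and d.frames_tracked >= min_frames

# An unstable track with low confidence never clears the thresholds:
pedestrian = Detection(label="unknown", confidence=0.6, frames_tracked=2)
print(should_brake(pedestrian))  # False: a true obstacle treated as a false positive
```

The thresholds exist for a reason (a car that slams the brakes for every plastic bag is undrivable), but the sketch shows why tuning them too aggressively converts a detection failure into a planning decision not to stop.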
When this thread started, I was pretty optimistic about driverless car tech. Now that more facts have emerged, I think we're a long way off. When the Uber car drives right into a moving human-size object, and the Tesla will happily slam you into a concrete barrier at 60+ mph (faded lane markings? Deal with it... humans do), that tells me that we haven't even solved the easy problems. And that's not even getting to the difficult edge cases.
Maybe so, but earlier in the thread, it was pointed out that the fatality rate per mile for self-driving cars has a long way to go before it gets better than for human drivers.

All the way at the beginning of this thread someone asked "is it seen as different when a human driver takes a life or an automated driver?" (paraphrased). I think yes....
Me neither! I don't have a lot of faith in self-driving cars....
...the stats you mention are a little unreliable. It's only when they rack up a decent number of miles that the comparison is going to mean anything.
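The small-sample point can be made concrete with rough numbers. Using illustrative approximations (US human drivers are often quoted at roughly 1.2 fatalities per 100 million vehicle miles; autonomous test fleets have only a few million miles and one fatality, so these are ballpark figures, not official statistics), a single event gives a per-mile rate with an enormous uncertainty band:

```python
# Rough sketch with illustrative, approximate numbers (not official
# figures): compare fatality rates per 100 million miles and show how
# wide the uncertainty is when only one event has been observed.

def rate_per_100m_miles(fatalities: int, miles: float) -> float:
    return fatalities / miles * 100e6

human = rate_per_100m_miles(37_000, 3.2e12)  # approx. annual US totals
autonomous = rate_per_100m_miles(1, 3e6)     # one death, a few million miles

# Treating fatalities as Poisson, observing k = 1 event gives an exact
# 95% interval of about 0.025 to 5.57 expected events, so the true rate
# could plausibly sit anywhere in a ~200x range.
low_k, high_k = 0.0253, 5.572
print(f"human:      ~{human:.2f} per 100M miles")
print(f"autonomous: ~{autonomous:.1f} per 100M miles "
      f"(95% CI ~{autonomous * low_k:.1f} to ~{autonomous * high_k:.1f})")
```

The point estimate looks terrible for the autonomous fleet, but the interval is so wide that the honest conclusion is the one above: there simply aren't enough miles yet for the comparison to mean much.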
How long before the first fatality? I read somewhere about the advent of the automobile. The very FIRST recorded automobile accident happened pretty early on. If you used those stats, they'd probably be worse than self-driving cars are now.