Posted on 10/13/2019 7:14:49 AM PDT by rickmichaels

I actually believe it does take a village to raise a child. But it sure as hell doesn't take my village for you to test your self-driving car.
No, I'm still not over Uber testing its self-driving technology in Tempe, Arizona, in March of 2018. With the so-called safety driver behind the wheel as the car ferried itself around the city, a woman crossing the street with, not on, her bicycle was killed.
The vehicle never even slowed. Uber had disabled Volvo's automatic emergency braking system in order to reduce the potential for erratic vehicle behavior, according to The Verge. The safety driver was watching The Voice on her phone, and Uber had previously cut the number of safety drivers in its experiments from two to one.
Hey, Uber? How do I hate thee? Let me count the ways.
I was recently with Honda at their testing facility in Ohio, and part of their Honda Sensing safety systems is a pedestrian detection and collision mitigating braking system that stood that car on its hood to avoid hitting a pop-up fake pedestrian. My notebook flew off the seat.
Most manufacturers have a version of this now, and there is no excuse for a company like Uber to disable something like that, especially when they're testing the accuracy of their mapping skills and hiring people who would rather be home on their couches voting for their favourite singer in some contest.
I've long been irked by Tesla installing something called Autopilot in its vehicles. The fine print always says the driver must remain able to take control of the car at all times, blah blah blah, but you tell me what something called Autopilot means to average people.
You may think we love running video after video of morons literally asleep at the wheel. We don't. People die. One driver, the fourth killed in similar circumstances, had Autopilot engaged and died when a large truck crossed his Tesla's path. Good enough is not enough.
It is deeply irresponsible not to take into account how your new product will be received and used by the general public. I remember (and I'm dating myself) when we had to pull lawn darts from the shelves of the Consumers Distributing I worked at because a handful of people had managed to impale themselves with the metal shafts. In hindsight, they really were a little violent, to be honest, but at least everybody in the yard knew they were flinging them around.
Dumping self-driving cars on the public roadways to test your tech and your theories? That's just a lawn dart that comes out of nowhere.
Watch this week's videos of people remotely telling their Teslas to meet them across a parking lot, only to have the cars filmed looking like they're in a bumper-car midway game. And that's not the Autopilot feature, just an owner using an application called Smart Summon. If I'm in a parking lot and a Tesla, sans driver, is headed toward me and shows no signs of stopping, prepare for a showdown, both physical and legal.
Maybe it's the media's fault for the screaming headlines about autonomous cars, and the wave of the future being here right now. It's not. Autonomous features, now that's a discussion worth having. Lane departure correction, front collision avoidance, parking assist, trailer back-up: these types of systems are about safety.
Will they one day form the basis for autonomous cars? Sure they will. But having the technology does not mean having the right to inflict it on the unwilling and the unknowing; other drivers on our roads are not your guinea pigs to perfect your technological skill set.
In some parts of the world, new drivers have a prominent sign or symbol on their car to warn those around them that a newbie is behind the wheel, and to keep some distance and cut them some slack. If you want to toss a vehicle onto the roads in my city to test your autonomous technology, it better be lit up like a Christmas tree so I can decide if I want to take my chances stepping off a curb near it, or entering a lane of traffic.
You don't have to take my Luddite word for it. Read this Washington Post article about Silicon Valley residents with expertise in this very area not wanting these cars in their community. The proliferation of Waymo self-driving cars has them nervous, even as they understand the only way to advance the technology is to try it in the real world. They just don't necessarily want it in their world. Neither do I.
Years and years ago, at a facility in Germany used for testing just such safety features, a rep from Mercedes-Benz told me the cars were performing very well. Until, at an intersection, one of the vehicles stopped, allowing a pedestrian to cross as per traffic rules. The pedestrian waved the car onward.
The car was confused and didn't know what to do.
You can factor in all the code you like, but let me know when you've cracked the human one.
We were promised flying cars, and got this BS instead.
I was thinking the exact same thing. The only way driverless cars are going to work is on a completely closed course with sensor wires embedded in the road. It ain’t gonna happen.
These vehicles will have a role. Long range trucking for sure. Fill the truck, program the destination and go. No more sleep deprived (or worse) drivers. It’s a natural. Not sure how they handle unexpected police guided detours but they will figure it out. Sorry drivers but nothing is forever.
This was a TEST CAR. The Uber driver, Ms. Rafaela Vasquez, was supposed to be watching the road. Instead, she was on her phone at the moment of impact.
She should have been charged with involuntary vehicular homicide.
You just made his point...a test driver was in the car - but doing something else. Do you really think it will be any different when “regular” people are in the car...they will always be doing something else...because the car is on “auto pilot”...
What could possibly go wrong?
I agree with you.
The most dangerous vehicles on the road are driven by humans. It is a terrifying reality that most of us have suppressed in our minds just so we can get to where we need to go.
I would LOVE it if every car and truck were self-driving, tomorrow. The tech isn't there, yet, but I long for the day it is.
Americans are regularly being threatened and robbed of their individual rights by political corporatists who grovel to money and Wall St instead of balancing rights of citizens. We see the loss of individual rights with risky technology and the NBA and one-way trade pacts. The relationship between globalism, big business, the Koch Bros, individual rights and conservatism is being rebalanced.
The left does not support individual rights.
If I'm in the driver's seat and there's no wheel and no brakes and my car is heading toward a crowd or a brick wall, I want to have the ability to turn a wheel or step on the brake.
I don't like the idea of being in a projectile without any control which I believe is the ultimate goal.
I appreciate the self-driving option and computer sensors for gauging car distance and emergency braking.
For a long drive without much traffic I would appreciate the full computer mode. I wouldn't be 'sleeping at the wheel' but taking my hands off the wheel and stretching would be nice.
Just another bad idea in a long list of such by the manufacturers, but in fairness they do occasionally come up with some really good ones. The trouble with those is that they either don’t fully develop them or they are not around long enough to accomplish this.
Just wait until the first autonomous car is used to deliver a car bomb.
One big flaw with assuming that a driver will be able to take over the car in an emergency is the assumption that the driver will have decent driving skill after being no more than a passenger most of the time the car is on the road. (I've heard that some self-driving cars do not have driver controls, so there isn't even that failsafe.) Personally, I do not think that cars can be programmed to extricate themselves from any situation. Every contingency must be programmed in. In an unpredictable world, that is exceedingly difficult.
Actually, I think that a lot of the safety features programmed into newer cars are counterproductive. They are teaching people to be careless instead of attentive. What happens when the driver of the car with collision avoidance built in becomes so complacent that they no longer pay much attention to the cars around them, and the system breaks down? Will they be able to successfully sue the car company for faulty manufacture, or will the suit go nowhere because the car company argues that the system was never meant to replace human diligence?
I hate the idea of robo cars too, but an RV that can drive itself while I drink heavily in the back sounds really appealing.
As I wrote a moment ago, and as you note, automated cars are perfect for delivering car bombs.
That.
Where do I sign up? I’m retired, and want to drive across 13 states twice a year. (To vacation-home from winter-home).
When self driving cars are perfected, the old argument about when to “take away the car keys” from an elderly parent will be solved. If they need to go to the doctor, just pile them in, program the car, and send them on their way.
All cars will be self-driving within 10 years, says former GM vice chairman Bob Lutz.
There is a famous photo of a NYC street in 1905: one car and a hundred horses. A photo of the same street in 1910 shows 100 cars and one horse.
All the car companies are gearing up for self-driving electric vehicles.
The scary part is who codes the software in these cars? $9/hr H-1B shit from India, just like in the 737 MAX crashes?
Get this shit off our roads until they have competent Americans writing the control code for the vehicle and the sensors.
Indian software quality is as bad as Communist Chinese manufacturing quality. Look at the hot mess that is now Microsoft Windows.