Posted on 03/29/2018 2:22:57 PM PDT by Wolfie
Self-driving car passenger slapped with ticket in San Francisco, police say
A ticket was issued to a person traveling in a self-driving car in San Francisco on Monday, police told Fox News. The vehicle allegedly did not stop for a person in the crosswalk. However, Cruise, the company operating the car, maintained that the vehicle was in compliance with California state law, according to KPIX.
A motorcycle officer issued the ticket after seeing the car fail to stop for a woman going through a crosswalk in the South of Market area, San Francisco Police Department spokeswoman Officer Giselle Linnane told Fox News on Wednesday. The car cut the pedestrian off, she said.
The ticketing officer believed that the car was in self-driving mode; however, the person inside was cited for failing to yield to a pedestrian, Linnane said. That individual, whether they were driving or not, is still responsible for the vehicle, she added.
(Excerpt) Read more at foxnews.com ...
Elevator’s not worthy.
“Common sense tells you this is a very bad idea. People are going to get hurt.”
People are going to get hurt regardless.
Well said. As proof, look who wasn’t bright enough to comprehend the parallel.
The “Justice 4 Reyna” sycophant.
A machine cannot be guilty of a crime. Only a person can have points put on his license.
Current law does not contemplate "self driving" cars. If you get into one, and thus set it into motion, then YOU are the person who will be held responsible for what happens. You might have a contract with the company to be reimbursed for any fines resulting from the operation of the car, but YOU are going to be held legally responsible for what it does.
Something to never forget
Here’s a more interesting issue: these self-driving cars will also be used to deliver packages, or food.
So the car arrives at a take-out place, food gets put in, it drives towards house of the guy who placed the order, and runs over somebody on the way. Who gets sued? Most likely the trial lawyers will sue everyone involved.
These things have no logic. Apparently running over people counts as following the law.
It's whatever the officer observed at the scene, and from his observation, there was a violation.
Judge Dredd, is that you? “I *am* the Law”
Death to self driving cars. You aren’t in control, but responsible for everything? No thanks.
And it’s worthless if you can’t drink at the bar and have it take you home.
They need to ban these cars. As long as a pedestrian steps off the curb into a crosswalk, a vehicle must stop to let the pedestrian continue to the next curb. (Of course, if there is a light signal, the pedestrian must obey it.) Years ago, my wife got ticketed in SF for continuing through an open lane when the adjoining lane had a stopped car. Now she comes to a stop to see whether the other car has stopped for a pedestrian to cross.
If you look at pedestrian accident videos on YouTube, it seems the majority of pedestrian injuries happen when a car blows through an open lane next to another vehicle that has stopped. That's the primary reason this law exists. It doesn't matter that the pedestrian was almost 11 feet away from the vehicle; the law says the vehicle must stop for the pedestrian in the crosswalk no matter how far away the pedestrian is. Bad programming in this self-driving car; if they can't reprogram it to follow the law, then it should not be allowed on public streets.
After the LEO was done writing the ticket and was ready to walk away, the self-driving car said, “Thank you Officer. Have a nice day.”
Bad news: Robot cars do not stop for pedestrians. This is the second time one has nearly run somebody over.
Good news: Only women pedestrians seem to be at risk. They must be using a gender-identifier software function.
I don’t know if it makes a difference, but was the passenger in the driver’s seat?
Was this one of those “one foot” stings?
There was a town here in NJ that had people stand at the crosswalk but on the sidewalk and wait for cars to approach. The pedestrian would put one foot in the crosswalk and if the driver of the car didn’t lock it up, they got a ticket.
“I don’t know if it makes a difference, but was the passenger in the driver’s seat?”
The person ticketed in this case was not a client but the designated human safety driver in the vehicle.
https://www.freep.com/story/money/cars/2018/03/29/self-driving-cars-ticket/469486002/
No, it's the software company, the car company, and the city that allowed it to be on the road.
It would be extremely expensive to equip self-driving cars to reliably avoid running over pedestrians.
Drivers are responsible for what their equipment does to others. Corporations selling the equipment will make sure of that to whatever extents possible.
Good question - obviously the attendant wasn't paying attention, which defeats the purpose of having a human with override ability..... Uber will probably pay out a lot for disabling the car's normal safety features, though...
“It would be extremely expensive to equip self-driving cars to reliably avoid running over pedestrians.”
Yes, because technically that requires triple-redundant, dissimilar, fail-operational systems from the sensors through the entire processing chain.
Doing it RIGHT would cost WAY more than these companies are willing to spend. So they cut corners and make something that kinda sorta works. They need to be held liable for every death to the maximum extent of the law.
I do avionics. When we mess up tens of millions of dollars change hands. As a result? Flying (in the US) is extremely safe.
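For readers unfamiliar with the avionics jargon above: the core of a triple-redundant, fail-operational design is majority (mid-value) voting across three independent channels, so one failed sensor cannot corrupt the output. Here's a minimal illustrative sketch of that voting idea; the function name, readings, and tolerance are made up for the example and come from no real avionics or automotive system.

```python
# Sketch of triple-modular-redundancy (TMR) mid-value voting, the
# fail-operational pattern described in the post above. All names and
# numbers here are illustrative assumptions, not real avionics code.

def tmr_vote(a, b, c, tolerance=0.5):
    """Return the mid-value of three redundant readings, plus the
    indices of any channels that disagree beyond `tolerance`."""
    mid = sorted([a, b, c])[1]  # mid-value select: immune to one faulty channel
    faults = [i for i, r in enumerate((a, b, c)) if abs(r - mid) > tolerance]
    return mid, faults

# One channel (index 2) has failed high; the vote still yields a sane value.
value, faulty = tmr_vote(10.1, 10.2, 99.0)
print(value, faulty)  # 10.2 [2]
```

The point the poster is making: the vote only works if the three channels fail independently, which is why the hardware (and ideally dissimilar software) must be replicated end to end - and that replication is what makes doing it right expensive.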