
Ya can't fix stupid. There have already been a number of accidents, some of them fatal, attributed to this Autopilot system. As far as I know, all of them have been operator error. I could be wrong, though.

Short video at the site that someone took of this idiot.

1 posted on 04/29/2018 4:09:18 PM PDT by BBell


Reminds me of this:

Winnebago winner?

"This year's runaway First Place Stella Award winner was Mrs. Merv Grazinski , of Oklahoma City, who purchased a new 32-foot Winnebago motor home. On her first trip home, from an OU football game, having driven on to the freeway, she set the cruise control at 70 mph and calmly left the driver's seat to go to the back of the Winnebago to make herself a sandwich.

"Not surprisingly, the motor home left the freeway, crashed and overturned. Also not surprisingly, Mrs. Grazinski sued Winnebago for not putting in the owner's manual that she couldn't actually leave the driver's seat while the cruise control was set.

"The Oklahoma jury awarded her – are you sitting down? – $1,750,000 plus a new motor home. Winnebago actually changed their manuals as a result of this suit, just in case Mrs. Grazinski has any relatives who might also buy a motor home."

2 posted on 04/29/2018 4:10:17 PM PDT by BBell (calm down and eat your sandwiches)

To: BBell

He could just claim cultural necessity... if he’s a raghead...


3 posted on 04/29/2018 4:12:15 PM PDT by SuperLuminal (Where is another agitator for republicanism like Sam Adams when we need him?)

To: BBell

He could take the Hillary defense and say he didn’t “intend” to commit a crime. Naw, that only works for Hillary.


4 posted on 04/29/2018 4:14:47 PM PDT by Telepathic Intruder

To: BBell

It's starting to look like overconfidence in these systems is going to be one of the biggest initial hurdles.


6 posted on 04/29/2018 4:16:28 PM PDT by DannyTN

To: BBell
"40mph"

I guess the witness were US tourist.

Can't picture a Brit used mph vs kph.

7 posted on 04/29/2018 4:17:42 PM PDT by Deaf Smith (When a Texan takes his chances, chances will be taken, that's for sure)

To: BBell

4 fatalities:

Handan, China (January 20, 2016)
On January 20, 2016, the driver of a Tesla Model S in Handan, China, was killed when their car crashed into a stationary truck.[80] The Tesla was following a car in the far left lane of a multi-lane highway; the car in front moved to the right lane to avoid a truck stopped on the left shoulder, and the Tesla, which the driver’s father believes was in Autopilot mode, did not slow before colliding with the stopped truck, a street sweeper.[81] According to footage captured by a dashboard camera, the stationary street sweeper on the left side of the expressway partially extended into the far left lane, and the driver did not appear to respond to the unexpected obstacle.[82]

In September 2016, the media reported the driver’s family had filed a lawsuit in July against the Tesla dealer who sold the car.[83] The family’s lawyer stated the suit was intended “to let the public know that self-driving technology has some defects. We are hoping Tesla, when marketing its products, will be more cautious. Don’t just use self-driving as a selling point for young people.”[81] Tesla released a statement which said they “have no way of knowing whether or not Autopilot was engaged at the time of the crash” since the car telemetry could not be retrieved remotely due to damage caused by the crash.[81] Telemetry was recorded locally to an SD card, which was given to Tesla; the company decoded it and provided the data to a third party for independent review. Tesla added that “while the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed.”[84]

Williston, Florida (May 7, 2016)
The first known fatal accident involving a Tesla with Autopilot engaged took place in Williston, Florida, on May 7, 2016. The driver was killed in a crash with an 18-wheel tractor-trailer.

By late June 2016, the U.S. National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident, working with the Florida Highway Patrol. According to the NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled-access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck’s trailer.[85][86][87] The diagnostic log of the Tesla indicated it was traveling at 74 mph (119 km/h) when it collided with and traveled under the trailer, which was not equipped with a side underrun protection system.[88]:12 The underride collision sheared off the Tesla’s glasshouse, destroying everything above the beltline, and caused fatal injuries to the driver.[88]:6–7; 13 In the approximately nine seconds after colliding with the trailer, the Tesla traveled another 886.5 feet (270.2 m) and came to rest after colliding with two chain-link fences and a utility pole.[88]:7; 12

The NHTSA’s preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involves a population of an estimated 25,000 Model S cars.[89] On July 8, 2016, the NHTSA requested Tesla Inc. to hand over to the agency detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla’s planned updates scheduled for the next four months.[90]

According to Tesla, “neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.” The car attempted to drive full speed under the trailer, “with the bottom of the trailer impacting the windshield of the Model S.” Tesla also stated that this was its first known Autopilot-related death in over 130 million miles (208 million km) driven by its customers with Autopilot activated. According to Tesla, there is a fatality every 94 million miles (150 million km) among all types of vehicles in the U.S.[85][86][91] It is estimated that billions of miles will need to be traveled before Tesla Autopilot can claim to be safer than humans with statistical significance (fewer miles would suffice if Autopilot turns out to be more dangerous). Researchers say that Tesla and others need to release more data on the limitations and performance of automated driving systems if self-driving cars are to become safe and well enough understood for mass-market use.[92][93]
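
A rough back-of-the-envelope check shows why so many miles are needed. The sketch below is a minimal illustration using only the two figures quoted above and a simple Poisson model; the model is an assumption for illustration, not the researchers' actual methodology.

    # Back-of-the-envelope: is 1 death in ~130M Autopilot miles
    # distinguishable from the overall US rate of 1 per 94M miles?
    # Assumes fatalities follow a Poisson process (illustration only).
    from math import sqrt

    human_rate = 1 / 94e6      # fatalities per mile, all US vehicles
    autopilot_miles = 130e6    # miles driven with Autopilot, per Tesla
    observed_deaths = 1        # the Williston crash

    expected = human_rate * autopilot_miles   # deaths expected at the human rate
    stddev = sqrt(expected)                   # Poisson standard deviation

    print(f"expected {expected:.2f} +/- {stddev:.2f} deaths; observed {observed_deaths}")
    # Prints roughly "expected 1.38 +/- 1.18 deaths; observed 1": one death
    # is well within one standard deviation of the human-rate expectation,
    # and since the relative uncertainty shrinks only like 1/sqrt(miles),
    # billions of miles are needed before the comparison means much.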

The truck’s driver told the Associated Press that he could hear a Harry Potter movie playing in the crashed car, and said the Tesla was moving so quickly that “he went so fast through my trailer I didn’t see him.” The movie “was still playing when he died and snapped a telephone pole a quarter mile down the road.” According to the Florida Highway Patrol, an aftermarket portable DVD player was found in the wreckage; it is not possible to watch videos on the Model S touchscreen display.[87][94] A laptop computer was also recovered during the post-crash examination of the wreck, along with an adjustable vehicle laptop mount attached to the front passenger’s seat frame. The NHTSA concluded the laptop was probably mounted and the driver may have been distracted at the time of the crash.[88]:17–19; 21

[Photo caption: Dr. Deb Bruce, head of the investigation team, announces results to the Board on September 12, 2017.]
In July 2016, the U.S. National Transportation Safety Board (NTSB) announced it had opened a formal investigation into the fatal accident while Autopilot was engaged. The NTSB is an investigative body that only has the power to make policy recommendations. An agency spokesman said, “It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible.” The NTSB opens about 25 to 30 highway investigations annually.[95] In September 2017, the NTSB released its report, determining that “the probable cause of the Williston, Florida, crash was the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation, which resulted in the car driver’s lack of reaction to the presence of the truck. Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer.”[96]

In January 2017, the NHTSA Office of Defects Investigations (ODI) released a preliminary evaluation, finding that the driver in the crash had seven seconds to see the truck and identifying no defects in the Autopilot system; the ODI also found that the Tesla car crash rate dropped by 40 percent after Autosteer installation.[97][98] The NHTSA Special Crash Investigation team published its report in January 2018.[88] According to the report, for the drive leading up to the crash, the driver engaged Autopilot for 37 minutes and 26 seconds, and the system provided 13 “hands not detected” alerts, to which the driver responded after an average delay of 16 seconds.[88]:24 The report concluded “Regardless of the operational status of the Tesla’s ADAS technologies, the driver was still responsible for maintaining ultimate control of the vehicle. All evidence and data gathered concluded that the driver neglected to maintain complete control of the Tesla leading up to the crash.”[88]:25
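
Those alert figures reduce to simple arithmetic over event timestamps. The sketch below shows how such statistics might be recomputed from a log; the log format and the numbers in it are invented for illustration and are not the actual vehicle data.

    # Hypothetical event log: (time alert issued, time hands next detected),
    # in seconds since Autopilot engagement. All values are made up.
    alerts = [(120.0, 131.0), (410.5, 426.5), (980.0, 1001.0)]

    delays = [hands_t - alert_t for alert_t, hands_t in alerts]
    avg = sum(delays) / len(delays)
    print(f"{len(alerts)} 'hands not detected' alerts, "
          f"average response delay {avg:.1f} s")   # 3 alerts, 16.0 s here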

Culver City, California (January 22, 2018)
On January 22, 2018, a Tesla Model S crashed into a fire truck parked on the side of the I-405 freeway in Culver City, California, while traveling at a speed exceeding 50 mph (80 km/h); the driver survived.[99] The driver said he was using Autopilot, according to the Culver City Fire Department, which reported the crash over Twitter at approximately 8:30 A.M. The fire truck and a California Highway Patrol vehicle were parked in the left emergency lane and carpool lane of the southbound 405, blocking off the scene of an earlier accident, with emergency lights flashing.[100]

Autopilot may not detect stationary vehicles at highway speeds and it cannot detect some objects.[101] Other advanced driver-assistance systems have similar limitations. Raj Rajkumar, who studies autonomous driving systems at Carnegie Mellon University, believes the radars used for Autopilot are designed to detect moving objects, but are “not very good in detecting stationary objects”.[102] Both NTSB and NHTSA have dispatched teams to investigate the crash.[103] Hod Lipson, director of Columbia University’s Creative Machines Lab, faulted the diffusion of responsibility concept: “If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that’s a dangerous thing.”[104]
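
Rajkumar's point can be made concrete with a toy example. A common heuristic in highway radar pipelines is to discard returns that are stationary over ground, because at speed they look the same as roadside clutter such as signs and guardrails. The sketch below illustrates only that heuristic; the names, thresholds, and logic are invented and are not Tesla's actual pipeline.

    # Doppler radar measures speed relative to the ego car; adding the
    # ego speed back gives the target's speed over ground. A naive
    # clutter filter drops anything that is not moving over ground.
    EGO_SPEED = 29.0          # m/s, about 65 mph (assumed)
    STATIONARY_CUTOFF = 2.0   # m/s over ground (assumed)

    def is_tracked(relative_speed_mps):
        ground_speed = relative_speed_mps + EGO_SPEED
        return abs(ground_speed) > STATIONARY_CUTOFF

    for name, rel in [("car ahead", -2.0), ("parked fire truck", -29.0)]:
        print(name, "->", "tracked" if is_tracked(rel) else "dropped as clutter")
    # The parked fire truck (0 m/s over ground) is dropped along with the
    # guardrails, which is exactly the failure mode described above.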

Mountain View, California (March 23, 2018)
On March 23, 2018, a second US Autopilot fatality occurred in Mountain View, California.[105] The crash occurred just before 9:30 A.M. on southbound US 101 at the carpool lane exit for southbound Highway 85, at a concrete barrier where the left-hand offramp separates from 101. After the Model X crashed into the narrow concrete barrier, it was struck again by two following vehicles, and then it caught on fire.[106]

Both the NHTSA and NTSB are investigating the March 2018 crash.[107] In April 2018, another Model S driver demonstrated that Autopilot appeared to be confused by the road stripes at the same location. The gore ahead of the barrier is marked by diverging solid white lines (a vee shape); the Autosteer feature of the Model S appeared to mistakenly use the left-side white line instead of the right-side white line as the lane marking for the far left lane, which would have led the Model S into the same concrete barrier had the driver not taken control.[108] Ars Technica concluded “that as Autopilot gets better, drivers could become increasingly complacent and pay less and less attention to the road.”[109]
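
A toy model makes that gore-area failure mode easier to see. The sketch below reduces lane keeping to steering toward the midpoint of the detected left and right lane lines; the geometry and numbers are invented for illustration, and this is not Tesla's lane-tracking code.

    # Lane keeping reduced to its simplest form: steer toward the midpoint
    # of the detected lane lines (positions in metres from the car's
    # centerline; all numbers invented for illustration).
    def lane_center(left_x, right_x):
        return (left_x + right_x) / 2.0

    right_x = 1.8   # the lane's true right-hand line stays put
    # If the tracker latches onto the gore's left stripe as the lane's
    # left boundary, that "line" diverges as the offramp peels away:
    for left_x in (-1.8, -2.4, -3.0):
        print(f"left line at {left_x:+.1f} m -> steer toward "
              f"{lane_center(left_x, right_x):+.2f} m")
    # The steering target drifts from 0.00 m to -0.60 m: the car edges
    # left into the gore, where the attenuator and barrier sit.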

In a corporate blog post, Tesla noted the impact attenuator separating the offramp from US 101 had been previously crushed and not replaced prior to the Model X crash on March 23.[105][110] The post also stated that Autopilot was engaged at the time of the crash, and the driver’s hands had not been detected manipulating the steering wheel for six seconds before the crash. Vehicle data showed the driver had five seconds and 150 metres (490 ft) of “unobstructed view of the concrete divider, [...] but the vehicle logs show that no action was taken.”[105] The NTSB investigation had been focused on the damaged impact attenuator and the vehicle fire after the collision, but after it was reported the driver had complained about the Autopilot functionality,[111] the NTSB announced it would also investigate “all aspects of this crash including the driver’s previous concerns about the autopilot.”[112] An NTSB spokesman stated the organization “is unhappy with the release of investigative information by Tesla”.[113] Elon Musk dismissed the criticism, tweeting that NTSB was “an advisory body” and that “Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe.”[114]


8 posted on 04/29/2018 4:20:35 PM PDT by Eddie01

To: BBell
I see nothing wrong here.

He obviously identifies as a passenger and not a driver.

Probably born that way and it should be covered by governmental freebies.

12 posted on 04/29/2018 4:22:55 PM PDT by knarf (I say things that are true, I have no proof, but they're true)

To: BBell

Which is worse: a driver who turns on Autopilot and goes to sleep... or a driver who turns on cocaine, meth, weed, and heroin, drinks a few litres of whisky, and stays behind the wheel?


14 posted on 04/29/2018 4:27:27 PM PDT by faithhopecharity ("Politicians aren't born, they're excreted." -Marcus Tullius Cicero (3 BCE))

To: BBell
Bhavesh Patel, aged 39, of Alfreton Road, Nottingham, pleaded guilty to dangerous driving at St Albans Crown Court on Friday, April 20.

Why? He wasn't driving, Your Honor.

17 posted on 04/29/2018 4:32:53 PM PDT by Billthedrill

To: BBell

On the other hand, the portion posted doesn’t prohibit him from doing it again.


20 posted on 04/29/2018 4:36:21 PM PDT by lepton ("It is useless to attempt to reason a man out of a thing he was never reasoned into"--Jonathan Swift)

To: BBell

Also, Tesla states that Autopilot is NOT fully autonomous driving and that the driver needs to be ready to take control. So this guy is an idiot no matter how you look at it.


34 posted on 04/29/2018 6:25:42 PM PDT by battousai (Trump was wrong... I'm still not tired of Winning!)

To: BBell

It sounds like he doesn't want to drive anyway.


43 posted on 04/29/2018 7:39:12 PM PDT by Democrats hate too much

To: BBell

(1) He wasn't driving at the time, so how can he be banned from driving?

(2) The stated end design of these cars has no user-operated brake or steering wheel.

Incrementalism will take us there.


46 posted on 04/30/2018 5:44:50 AM PDT by a fool in paradise (Ads for Chappaquiddick warn of scenes of tobacco use. What about the hazards of drunk driving?)

To: BBell

Yeah, lucky he didn't wind up like the other fool who was decapitated in his Tesla on Autopilot when it decided it could drive under a semi crossing his path.


48 posted on 04/30/2018 5:47:42 AM PDT by HamiltonJay

To: BBell

http://nationalpost.com/news/world/disaster-on-autopilot-how-too-much-of-a-good-thing-can-lead-to-deadly-plane-crashes


49 posted on 04/30/2018 7:26:41 AM PDT by pabianice (LINE)
