Posted on 04/29/2018 4:09:18 PM PDT by BBell
Tesla owner who turned on car's autopilot then sat in passenger seat while travelling on the M1 banned from driving
A man who switched on his car's autopilot before moving to the passenger seat while travelling along a motorway has been banned from driving for 18 months.
Bhavesh Patel, aged 39, of Alfreton Road, Nottingham, pleaded guilty to dangerous driving at St Albans Crown Court on Friday, April 20.
The court heard that at 7.40pm on May 21, 2017, Patel was driving his white Tesla S 60 along the northbound carriageway of the M1, between junctions 8 and 9 near Hemel Hempstead.
While the £70,000 car was in motion, he chose to switch on the supercar's autopilot function before moving across to the passenger seat and leaving the steering wheel and foot controls completely unmanned.
A witness noticed Patel, who had owned the car for a maximum of five months at the time of the incident, sat in the passenger seat of the vehicle.
No one could be seen in the driver's seat and Patel appeared to have his hands behind his head. The witness, who was a passenger in another car, filmed Patel as the car drove past.
Witness accounts stated that traffic was heavy due to congestion and it has been estimated that the vehicle was travelling at approximately 40mph at the time.
Footage of the incident was first posted on social media before it was reported to police and a Notice of Intended Prosecution was then sent to Patel in the post.
He was later interviewed by officers at Stevenage Police Station, where he admitted that he knew what he had done was 'silly', but that the car was capable of something 'amazing' and that he was just the 'unlucky one who got caught'.
(Excerpt) Read more at telegraph.co.uk ...
Short video at site someone took of this idiot.
Winnebago winner?
"This year's runaway First Place Stella Award winner was Mrs. Merv Grazinski , of Oklahoma City, who purchased a new 32-foot Winnebago motor home. On her first trip home, from an OU football game, having driven on to the freeway, she set the cruise control at 70 mph and calmly left the driver's seat to go to the back of the Winnebago to make herself a sandwich.
"Not surprisingly, the motor home left the freeway, crashed and overturned. Also not surprisingly, Mrs. Grazinski sued Winnebago for not putting in the owner's manual that she couldn't actually leave the driver's seat while the cruise control was set.
"The Oklahoma jury awarded her are you sitting down? $1,750,000 plus a new motor home. Winnebago actually changed their manuals as a result of this suit, just in case Mrs. Grazinski has any relatives who might also buy a motor home."
He could just claim cultural necessity...If he’s a raghead...
He could take the Hillary defense and say he didn’t “intend” to commit a crime. Naw, that only works for Hillary.
I’m quite sure that story’s an urban legend.
Starting to look like overconfidence in the systems is going to be one of the biggest initial hurdles.
I guess the witnesses were US tourists.
Can't picture a Brit using mph vs kph.
4 fatalities:
Handan, China (January 20, 2016)
On January 20, 2016, the driver of a Tesla Model S in Handan, China was killed when their car crashed into a stationary truck.[80] The Tesla was following a car in the far left lane of a multi-lane highway; the car in front moved to the right lane to avoid a truck stopped on the left shoulder, and the Tesla, which the driver’s father believes was in Autopilot mode, did not slow before colliding with the stopped truck.[81] According to footage captured by a dashboard camera, the stationary street sweeper on the left side of the expressway partially extended into the far left lane, and the driver did not appear to respond to the unexpected obstacle.[82]
In September 2016, the media reported the driver's family had filed a lawsuit in July against the Tesla dealer who sold the car.[83] The family's lawyer stated the suit was intended "to let the public know that self-driving technology has some defects. We are hoping Tesla, when marketing its products, will be more cautious. Don't just use self-driving as a selling point for young people."[81] Tesla released a statement which said they "have no way of knowing whether or not Autopilot was engaged at the time of the crash" since the car telemetry could not be retrieved remotely due to damage caused by the crash.[81] Telemetry was recorded locally to an SD card and given to Tesla, who decoded it and provided that data to a third party for independent review. Tesla added that "while the third-party appraisal is not yet complete, we have no reason to believe that Autopilot on this vehicle ever functioned other than as designed."[84]
Williston, Florida (May 7, 2016)
The first known fatal accident involving a Tesla engaged in Autopilot mode took place in Williston, Florida, on May 7, 2016. The driver was killed in a crash with an 18-wheel tractor-trailer.
By late June 2016, the U.S. National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into the accident, working with the Florida Highway Patrol. According to the NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway, and the car failed to apply the brakes. The car continued to travel after passing under the truck's trailer.[85][86][87] The diagnostic log of the Tesla indicated it was traveling at a speed of 74 mi/h (119 km/h) when it collided with and traveled under the trailer, which was not equipped with a side underrun protection system.[88]:12 The underride collision sheared off the Tesla's glasshouse, destroying everything above the beltline, and caused fatal injuries to the driver.[88]:67; 13 Approximately nine seconds after colliding with the trailer, the Tesla traveled another 886.5 feet (270.2 m) and came to rest after colliding with two chain-link fences and a utility pole.[88]:7; 12
The NHTSA’s preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involves a population of an estimated 25,000 Model S cars.[89] On July 8, 2016, the NHTSA requested Tesla Inc. to hand over to the agency detailed information about the design, operation and testing of its Autopilot technology. The agency also requested details of all design changes and updates to Autopilot since its introduction, and Tesla’s planned updates scheduled for the next four months.[90]
According to Tesla, "neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied." The car attempted to drive full speed under the trailer, "with the bottom of the trailer impacting the windshield of the Model S." Tesla also stated that this was Tesla's first known Autopilot-related death in over 130 million miles (208 million km) driven by its customers while Autopilot was activated. According to Tesla there is a fatality every 94 million miles (150 million km) among all types of vehicles in the U.S.[85][86][91] It is estimated that billions of miles will need to be traveled before Tesla Autopilot can claim to be safer than humans with statistical significance (although fewer than billions of miles will be needed if Tesla Autopilot is more dangerous). Researchers say that Tesla and others need to release more data on the limitations and performance of automated driving systems if self-driving cars are to become safe and understood enough for mass market use.[92][93]
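The statistical-significance point above can be illustrated with a quick back-of-the-envelope calculation. This is a sketch, not from the article: it assumes fatalities follow a Poisson process and uses only the two figures the excerpt gives (one death in 130 million Autopilot miles vs. the US average of one per 94 million miles).

```python
import math

US_RATE = 1 / 94e6        # baseline: one fatality per 94 million miles (from the text)
AUTOPILOT_MILES = 130e6   # Autopilot miles driven at the time (from the text)
OBSERVED_FATALITIES = 1   # the Williston crash

# Null hypothesis: Autopilot is no safer than the US average, so fatalities
# follow a Poisson distribution with this expected count:
expected = US_RATE * AUTOPILOT_MILES   # about 1.38 expected deaths

def poisson_cdf(k, lam):
    """P(X <= k) for a Poisson random variable with mean lam."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

# One-sided p-value: probability of seeing this few (or fewer) fatalities
# purely by chance if Autopilot were merely average.
p_value = poisson_cdf(OBSERVED_FATALITIES, expected)
print(f"expected fatalities at the US rate: {expected:.2f}")
print(f"p-value: {p_value:.2f}")   # roughly 0.60, nowhere near significance
```

With a p-value around 0.6, one fatality in 130 million miles is entirely consistent with the average rate, which is why far more mileage is needed before any safety claim in either direction reaches significance.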
The truck’s driver told the Associated Press that he could hear a Harry Potter movie playing in the crashed car, and said the car was driving so quickly that “he went so fast through my trailer I didn’t see him.” “It was still playing when he died and snapped a telephone pole a quarter mile down the road.” According to the Florida Highway Patrol, they found in the wreckage an aftermarket portable DVD player. It is not possible to watch videos on the Model S touchscreen display.[87][94] A laptop computer was recovered during the post-crash examination of the wreck, along with an adjustable vehicle laptop mount attached to the front passenger’s seat frame. The NHTSA concluded the laptop was probably mounted and the driver may have been distracted at the time of the crash.[88]:17-19; 21
[Image caption: Dr. Deb Bruce, head of the investigation team, announces results to the Board on September 12, 2017]
In July 2016, the U.S. National Transportation Safety Board (NTSB) announced it had opened a formal investigation into the fatal accident while Autopilot was engaged. The NTSB is an investigative body that only has the power to make policy recommendations. An agency spokesman said, “It’s worth taking a look and seeing what we can learn from that event, so that as that automation is more widely introduced we can do it in the safest way possible.” The NTSB opens annually about 25 to 30 highway investigations.[95] In September 2017, the NTSB released its report, determining that “the probable cause of the Williston, Florida, crash was the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation, which resulted in the car driver’s lack of reaction to the presence of the truck. Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer.”[96]
In January 2017, the NHTSA Office of Defects Investigations (ODI) released a preliminary evaluation, finding that the driver in the crash had seven seconds to see the truck and identifying no defects in the Autopilot system; the ODI also found that the Tesla car crash rate dropped by 40 percent after Autosteer installation.[97][98] The NHTSA Special Crash Investigation team published its report in January 2018.[88] According to the report, for the drive leading up to the crash, the driver engaged Autopilot for 37 minutes and 26 seconds, and the system provided 13 “hands not detected” alerts, to which the driver responded after an average delay of 16 seconds.[88]:24 The report concluded “Regardless of the operational status of the Tesla’s ADAS technologies, the driver was still responsible for maintaining ultimate control of the vehicle. All evidence and data gathered concluded that the driver neglected to maintain complete control of the Tesla leading up to the crash.”[88]:25
Culver City, California (January 22, 2018)
On January 22, 2018, a Tesla Model S crashed into a fire truck parked on the side of the I-405 freeway in Culver City, California while traveling at a speed exceeding 50 mph (80 km/h) and the driver survived.[99] The driver said he was using Autopilot, according to the Culver City Fire Department, which reported the crash over Twitter at approximately 8:30 A.M. The fire truck and a California Highway Patrol vehicle were parked in the left emergency lane and carpool lane of the southbound 405, blocking off the scene of an earlier accident, with emergency lights flashing.[100]
Autopilot may not detect stationary vehicles at highway speeds and it cannot detect some objects.[101] Other advanced driver-assistance systems have similar limitations. Raj Rajkumar, who studies autonomous driving systems at Carnegie Mellon University, believes the radars used for Autopilot are designed to detect moving objects, but are “not very good in detecting stationary objects”.[102] Both NTSB and NHTSA have dispatched teams to investigate the crash.[103] Hod Lipson, director of Columbia University’s Creative Machines Lab, faulted the diffusion of responsibility concept: “If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that’s a dangerous thing.”[104]
Mountain View, California (March 23, 2018)
On March 23, 2018, a second US Autopilot fatality occurred in Mountain View, California.[105] The crash occurred just before 9:30 A.M. on southbound US 101 at the carpool lane exit for southbound Highway 85, at a concrete barrier where the left-hand offramp separates from 101. After the Model X crashed into the narrow concrete barrier, it was struck again by two following vehicles, and then it caught on fire.[106]
Both the NHTSA and NTSB are investigating the March 2018 crash.[107] Another driver of a Model S demonstrated that Autopilot appeared to be confused by the road stripes in April 2018. The gore ahead of the barrier is marked by diverging solid white lines (a vee-shape); the Autosteer feature of the Model S appeared to mistakenly use the left-side white line instead of the right-side white line as the lane marking for the far left lane, which would have led the Model S into the same concrete barrier had the driver not taken control.[108] Ars Technica concluded “that as Autopilot gets better, drivers could become increasingly complacent and pay less and less attention to the road.”[109]
In a corporate blog post, Tesla noted the impact attenuator separating the offramp from US 101 had been previously crushed and not replaced prior to the Model X crash on March 23.[105][110] The post also stated that Autopilot was engaged at the time of the crash, and the driver’s hands had not been detected manipulating the steering wheel for six seconds before the crash. Vehicle data showed the driver had five seconds and 150 metres (490 ft) “unobstructed view of the concrete divider, [...] but the vehicle logs show that no action was taken.”[105] The NTSB investigation had been focused on the damaged impact attenuator and the vehicle fire after the collision, but after it was reported the driver had complained about the Autopilot functionality,[111] the NTSB announced it would also investigate “all aspects of this crash including the driver’s previous concerns about the autopilot.”[112] An NTSB spokesman stated the organization “is unhappy with the release of investigative information by Tesla”.[113] Elon Musk dismissed the criticism, tweeting that NTSB was “an advisory body” and that “Tesla releases critical crash data affecting public safety immediately & always will. To do otherwise would be unsafe.”[114]
The article was written for American readers. ;-)
Patel - probably Hindoo.
Yes, but even if it is, the fact that we can easily believe a court would award $1,750,000 plus a new motor home speaks volumes about the state of our judicial system.
He obviously identifies as a passenger and not a driver.
Probably born that way and it should be covered by governmental freebies.
I’m sure that story is an urban legend but I often wonder why RVs are not involved in more accidents.
which is worse, driver who turns on AutoPilot and goes to sleep...... or driver who turns on Cocaine, Meth, Weed, and Heroin, drinks a few litres of whisky, and stays behind the wheel?
I feel like a jackass.
Story gave value of 70k in British units.
Why? He wasn't driving. Your Honor.
They do also use the metric system but like their old system. After all, they created it.
I'm a jackass I know...
On the other hand, the portion posted doesn’t prohibit him from doing it again.