Free Republic
Browse · Search
General/Chat
Topics · Post Article

To: libertylover

“The car expected ALL tractor-trailers to be painted white?”

What the author of this article should have explained is that the software mistakenly concluded that the cameras were overexposed due to sunlight bombarding the camera’s sensors. The data you’d receive in that situation looks similar to “all white”.
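A minimal sketch of what an overexposure check like that might look like. This is purely illustrative; the function name and thresholds are my own, not anything from Tesla’s system:

```python
import numpy as np

def is_overexposed(frame, saturation_level=250, max_fraction=0.6):
    """Flag a frame as overexposed when most pixels are near full white.

    frame: 2-D numpy array of 8-bit grayscale intensities (0-255).
    saturation_level and max_fraction are illustrative thresholds.
    """
    saturated = np.count_nonzero(frame >= saturation_level)
    return saturated / frame.size > max_fraction

# A frame washed out by direct sunlight reads as nearly all white:
blown_out = np.full((480, 640), 255, dtype=np.uint8)
normal = np.random.default_rng(0).integers(0, 200, (480, 640), dtype=np.uint8)
print(is_overexposed(blown_out))  # True
print(is_overexposed(normal))     # False
```

The point is that a sensor blasted by sunlight and a large plain white surface can produce indistinguishable data at this level.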

Also, if the car’s cameras were too close, the system wouldn’t be able to perceive depth ... It also wouldn’t be able to perceive depth if there wasn’t much texture on the truck (which sounds like the case here, since the sides were plain white).
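Stereo depth estimation works by matching small patches between two camera views, and that matching needs local intensity variation to lock onto. A rough sketch of a texture check (my own illustrative threshold, not from any real system):

```python
import numpy as np

def has_enough_texture(patch, min_std=5.0):
    """Stereo matching needs local intensity variation to find
    correspondences between views; a flat white surface has
    almost none. min_std is an illustrative threshold."""
    return float(np.std(patch)) >= min_std

plain_white_side = np.full((64, 64), 240, dtype=np.uint8)   # featureless trailer side
textured_surface = np.random.default_rng(1).integers(0, 255, (64, 64), dtype=np.uint8)
print(has_enough_texture(plain_white_side))  # False
print(has_enough_texture(textured_surface))  # True
```

On a patch with near-zero variance there is nothing to match against, so no reliable depth comes out.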

What bothers me is that this “blind” condition should have automatically triggered a vehicle slowdown and alerted the driver to take over (duh! Really? ... :-) ). This exact condition I described is one that should be tested to hell and back and then some ... It’s a well-known corner-case test.
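The supervisory logic being argued for here is simple in outline: whenever perception degrades, slow down and hand off, rather than keep acting on bad data. A hypothetical sketch (all names and actions are mine, for illustration only):

```python
from enum import Enum, auto

class PerceptionState(Enum):
    OK = auto()
    DEGRADED = auto()   # e.g. overexposed cameras, no usable depth
    BLIND = auto()      # no trustworthy perception at all

def fallback_action(state):
    """Hypothetical supervisory rule: any loss of perception slows
    the vehicle and alerts the driver instead of letting the
    planner continue on unreliable data."""
    if state is PerceptionState.OK:
        return "continue"
    if state is PerceptionState.DEGRADED:
        return "reduce_speed_and_alert_driver"
    return "alert_driver_and_initiate_safe_stop"

print(fallback_action(PerceptionState.DEGRADED))
```

This is exactly the kind of corner case a rigorous test suite would exercise: feed the system washed-out frames and verify the fallback fires.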

I develop image processing applications for a living. A project I started to work on in 2010 is just now being deployed. The testing it went through was quite rigorous. Even now, the product is in limited release and won’t be in general release for a couple more years. We didn’t have a 10th of the variables that Tesla had when developing automated driving.

I don’t care how brilliant their engineers are ... there is no way they’ve done enough real-world testing if the information in this article is true and a crash can occur under these conditions. Testing is *everything* when it comes to autonomous vehicles. Sadly, that’s where corners are usually cut these days.


21 posted on 06/30/2016 10:10:42 PM PDT by edh (I need a better tagline)
[ Post Reply | Private Reply | To 5 | View Replies ]


To: edh

Car went into a trailer. Head gone.


24 posted on 06/30/2016 10:36:25 PM PDT by Dr. Bogus Pachysandra (Don't touch that thing Don't let anybody touch that thing!I'm a Doctor and I won't touch that thing!)
[ Post Reply | Private Reply | To 21 | View Replies ]

To: edh

They are doing testing. In the real world as we speak...

Every second these guinea pigs are on the road, the AI is improved via “deep learning”. The machines simply observe thousands of human drivers and learn how to drive collectively.

Of course they give the caveat that the system is new and you should keep your hands on the wheel (due to rare new occurrences like this). Then when the driver grabs the wheel, all the cars learn how to handle that new situation.

I was given a preview of this type of software about a year ago and even back then it felt incredible.

IMO, the biggest issue now is people becoming complacent because the system seems so amazing in 99.99% of driving situations. Even if watching and holding the wheel, your mind relaxes and you become unprepared to take over for that 0.01% chance event. I felt that “complacent” feeling even on the first ride.

This guy might have logged thousands of miles without a single error and eventually got so complacent that he watched movies and even fell asleep.


26 posted on 06/30/2016 10:48:10 PM PDT by varyouga
[ Post Reply | Private Reply | To 21 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson