Free Republic

To: ConservativeStatement

They make this sh*t up as they go along, don’t they????

TELL me this isn’t serious...


5 posted on 03/06/2019 9:18:08 PM PST by NFHale (The Second Amendment - By Any Means Necessary.)


To: NFHale
TELL me this isn’t serious...

It is serious.

Visual object detection methods work best with light-colored objects against a darker background. They also work fairly well with dark-colored objects on a lighter background. Dark-on-dark detection is less reliable and slower, and milliseconds count here.
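A minimal sketch of the contrast idea in Python with NumPy (purely illustrative, not any actual car's vision stack; the threshold and pixel values are made up):

    # Why a dark object on dark asphalt is harder to pick out:
    # the intensity difference between object and background shrinks.
    import numpy as np

    def contrast_ratio(object_pixels, background_pixels):
        """Michelson-style contrast between object and background intensities (0..255)."""
        obj = np.mean(object_pixels)
        bg = np.mean(background_pixels)
        return abs(obj - bg) / (obj + bg + 1e-6)

    light_on_dark = contrast_ratio(np.full(100, 200.0), np.full(100, 40.0))  # ~0.67
    dark_on_dark  = contrast_ratio(np.full(100, 55.0),  np.full(100, 40.0))  # ~0.16

    DETECTION_THRESHOLD = 0.2  # made-up threshold for illustration only
    for name, c in [("light-on-dark", light_on_dark), ("dark-on-dark", dark_on_dark)]:
        verdict = "easy to detect" if c > DETECTION_THRESHOLD else "hard to detect"
        print(f"{name}: contrast={c:.2f}, {verdict}")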

That Black person, wearing a dark-colored hoodie jacket and walking across a stretch of black asphalt, may not be "seen" by the camera system of the automated car coming down the road.

Maybe he should wear a gaudy outfit to improve the odds of being seen.

Maybe the engineers will keep working on the problem and solve it.

59 posted on 03/06/2019 11:14:25 PM PST by flamberge

To: NFHale; ConservativeStatement

Reminds me of some controversy over Kodak (?) film accused of being “racist” because it didn’t have the dynamic range to process darker skin tones.


65 posted on 03/06/2019 11:24:52 PM PST by thecodont

To: NFHale

The deep neural networks that are often used in these applications to classify objects are only as good as they’ve been “trained” to be. You train them by feeding in thousands of examples along with the right answers; each time the network gets one wrong, it “adjusts itself” to come closer. Repeated thousands of times, it gets quite accurate, but pushing past about 98% becomes a challenge.
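A toy version of that train-and-adjust loop, assuming Python with PyTorch and random stand-in data (the layer sizes, labels, and epoch count are illustrative assumptions, not anything a real automotive system uses):

    import torch
    import torch.nn as nn

    # Tiny classifier: 16 input features -> 2 classes.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # "Thousands of examples with the right answer" -- here, random stand-ins.
    inputs = torch.randn(1000, 16)
    labels = torch.randint(0, 2, (1000,))

    for epoch in range(20):              # repeat the adjustment many times
        logits = model(inputs)           # the network's current guesses
        loss = loss_fn(logits, labels)   # how wrong those guesses are
        optimizer.zero_grad()
        loss.backward()                  # compute the adjustments
        optimizer.step()                 # "adjust itself" toward the right answers

    accuracy = (model(inputs).argmax(dim=1) == labels).float().mean()
    print(f"training accuracy: {accuracy.item():.1%}")  # random labels won't reach 98%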

There was a case where a Google DNN that could recognize animals in a picture incorrectly classified a black man as a gorilla (something like that), and they scrambled to add new “training” to the DNN.

So it might just require additional training. It is important for self-driving cars to classify objects in order to anticipate behavior: is the object at the side of the road a fire hydrant, a small child, or a dog?
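A rough sketch of why the label matters downstream (the class names, caution levels, and default handling here are illustrative assumptions, not a real planner):

    # Map a detected class to a prior about how it might behave.
    BEHAVIOR_PRIORS = {
        "fire_hydrant": {"can_move": False, "caution": "low"},
        "dog":          {"can_move": True,  "caution": "high"},
        "child":        {"can_move": True,  "caution": "maximum"},
    }

    def plan_response(detected_class: str) -> str:
        # Unknown or potentially moving objects get the widest safety margin.
        prior = BEHAVIOR_PRIORS.get(detected_class, {"can_move": True, "caution": "maximum"})
        return "slow down and widen margin" if prior["can_move"] else "maintain course"

    print(plan_response("child"))         # slow down and widen margin
    print(plan_response("fire_hydrant"))  # maintain course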

In general, this specific finding is being overblown; the models can probably be improved, and it may just be a statistical anomaly. That said, of course they’re going to jump up and down and scream about it, as though it were some deliberate outcome of white nationalists!


90 posted on 03/07/2019 6:05:19 AM PST by fuzzylogic (welfare state = sharing of poor moral choices among everybody)


