Free Republic
News/Activism

To: conservatism_IS_compassion

My favorite anecdote about AI:

The Army was using AI to detect camouflaged tanks. They used satellite imagery from Germany to train the neural network. The AI got to 95% effectiveness.

In Desert Storm, they fed similar imagery from Iraq, and the AI completely failed.

The AI was counting leaves in the German pictures, which was highly correlated with a hidden tank being present. In Iraq, it found too few leaves to count.
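
The failure mode in the anecdote is a spurious correlation: a feature that happens to track the label in the training data but not in the deployment data. A minimal sketch of the effect, with an entirely hypothetical one-feature "leaf count" standing in for whatever the network actually latched onto:

```python
import random

random.seed(0)

# Hypothetical toy data: each scene is summarized by one feature, leaf_count.
# In the "German" training imagery, tank scenes happen to be leafy and empty
# scenes sparse -- a spurious correlation with the label.
def german_scene(has_tank):
    leaf_count = random.gauss(80 if has_tank else 20, 5)
    return leaf_count, has_tank

train = [german_scene(i % 2 == 0) for i in range(1000)]

# "Training": a crude classifier that thresholds leaf_count at the mean.
threshold = sum(x for x, _ in train) / len(train)

def predict(leaf_count):
    return leaf_count > threshold

def accuracy(data):
    return sum(predict(x) == y for x, y in data) / len(data)

# On more German-style imagery the rule looks excellent.
german_test = [german_scene(i % 2 == 0) for i in range(1000)]
print(f"German test accuracy: {accuracy(german_test):.0%}")

# "Desert" imagery: sparse vegetation whether or not a tank is present.
# The feature no longer tracks the label, and the rule collapses to chance.
def desert_scene(has_tank):
    leaf_count = random.gauss(5, 2)
    return leaf_count, has_tank

desert_test = [desert_scene(i % 2 == 0) for i in range(1000)]
print(f"Desert test accuracy: {accuracy(desert_test):.0%}")
```

The classifier never "knew" anything about tanks; it learned the leafiness threshold that best split the training set, and that rule transferred to nothing outside it.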

I stand by my assertion that AI necessarily means not really understanding how the machine makes decisions - otherwise it wouldn’t be AI. I do not want to fly on an aircraft governed by AI - and I am in the business of approving software for use on airplanes.


69 posted on 03/19/2019 10:50:26 AM PDT by MortMan (Americans are a people increasingly separated by our connectivity.)


To: MortMan
The AI was counting leaves in the German pictures, which was highly correlated with a hidden tank being present. In Iraq, it found too few leaves to count.
The first problem was that the AI did not say in English that it was counting leaves.

And the more general problem is obvious: the system learned from a far-from-globally representative data set. In the case of an airplane, I would think that a good approximation of a globally representative data set could be constructed. An awful lot of flights are in the air as you read this, and they generate a lot of data.

And after all, simulators can generate all the data you need - and if the AI system gets bad skinny from the simulator, so do all the human pilots who train on them.

But for a system to be smart enough to be competent to fly the plane, it would need a lot of anomaly data - from broken sensors in the real world, and (simulated, presumably) data on behavior of aircraft with all manner of possible degradations/failures of the controls.

Back in the piston-engined airliner days (but post-WWII) there was a story in the Reader’s Digest about a flight which lost its tail in a collision, and dived toward the ground. I was astonished to read that the pilot was able to crash-land the plane by the expedient of firewalling the throttles to lift the nose.

Seems like the least instinctive thing you could do under the circumstances, but then - he had no other pitch control, and it (nearly) worked. Well enough that there were a lot of survivors. If an AI system had had that couth built in, and had firewalled the throttles quicker, the plane might have pulled out above ground level (AGL) instead of slightly below it.


70 posted on 03/19/2019 11:37:33 AM PDT by conservatism_IS_compassion (Socialism is cynicism directed towards society and - correspondingly - naivete towards government.)


