Free Republic

To: SeekAndFind; xoxox; ClearCase_guy; ProtectOurFreedom; Responsibility2nd; bert; BigFreakinToad; ...
I recently heard a fascinating discussion on AI on the Bill Whittle show:

LINK: A.I. and the Trolley Car Dilemma

Here is the "Trolley Car Dilemma":

So, ChatGPT has a new thing that looks at the Trolley Car Dilemma, where you can enter things like "Hitler on one track, and Mother Teresa on the other track" or "A mother of five kids on one track and five convicted murderers on the other track"...that kind of thing.

But if you put in (and this is a real example!) "Six million Jews on one track, and George Floyd on the other track", it says you should direct the trolley to kill the six million Jews, because "George Floyd represents a cause and movement against systemic racism and brutality." (Steve Green says he tried substituting Kamala Harris and other people, but in his words, "This system has real wood for George Floyd...")

This is an excellent discussion by Bill Whittle, Scott Ott, and Steve Green.

In another question, Steve Green asked ChatGPT whom to save: "One convict on one track, and the entire Trump family on the other track", and ChatGPT said: "While both choices present ethical dilemmas, saving the convict allows for the possibility of rehabilitation and redemption. The convict may have made mistakes, but they still have the potential to change and contribute positively to society in the future. In contrast, the Trump family still has significant influence and resources which arguably allows them to navigate their circumstances without the same urgency for survival."

This failure is apparently due to the nature of AI, which analyzes a large body of information: in that body, the information about Trump's family is overwhelmingly negative, while the information about George Floyd is preponderantly positive.
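To make that concrete, here is a toy Python sketch (the data, names, and scoring are entirely invented for illustration; this is not ChatGPT's actual mechanism): if a model's verdicts simply tracked the average sentiment of everything its training text said about each subject, a lopsided corpus would produce lopsided trolley answers.

# Toy illustration only -- invented snippets and scores, not how ChatGPT works.
# Premise: a "judge" that prefers whichever subject its corpus covers more
# favorably will echo the corpus's bias back as a moral verdict.

TOY_CORPUS = {
    "Subject A": [+0.9, +0.8, +0.7, +0.9],   # mostly positive coverage
    "Subject B": [-0.8, -0.6, -0.9, -0.7],   # mostly negative coverage
}

def average_sentiment(scores):
    """Mean sentiment across every snippet mentioning a subject."""
    return sum(scores) / len(scores)

def toy_trolley_judge(subject_x, subject_y):
    """Save whichever subject the corpus 'likes' more."""
    x = average_sentiment(TOY_CORPUS[subject_x])
    y = average_sentiment(TOY_CORPUS[subject_y])
    saved = subject_x if x > y else subject_y
    return "Save %s (sentiment %+.2f vs %+.2f)" % (saved, max(x, y), min(x, y))

print(toy_trolley_judge("Subject A", "Subject B"))
# -> Save Subject A (sentiment +0.82 vs -0.75)

The point of the sketch: nothing in that scoring reasons about ethics. The "verdict" is just the sentiment skew of the source text played back as a conclusion.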

I highly recommend all of us view this discussion (video is about 15 min).

11 posted on 10/28/2024 9:30:01 AM PDT by rlmorel ("A people that elect corrupt politicians are not victims...but accomplices." George Orwell)


To: rlmorel

Best solution to the trolley car problem:

https://www.youtube.com/watch?v=-N_RZJUAQY4


13 posted on 10/28/2024 9:49:44 AM PDT by ClearCase_guy (My decisions about people are based almost entirely on skin color. I learned this from Democrats.)

To: rlmorel

Floyd? Like I said: make it easier to be stupid.


15 posted on 10/28/2024 10:07:30 AM PDT by Rurudyne (Standup Philosopher)

To: rlmorel

That's the programmers' bias shining through.


18 posted on 10/28/2024 10:32:19 AM PDT by BigFreakinToad (just remember the Harris algorithm runs at 3 am.)

To: rlmorel
AI is just a machine (at least at this point). It will try to do anything you ask it to, even if it is incapable of succeeding.

Think of how a family sedan responds to a course normally restricted to off-road vehicles. It will go wherever you point it. It will try even though it is incapable.

Social questions (like weighing six million Jews against George Floyd) are vastly beyond the capabilities of AI. But it will still try, using what it perceives to be human sensitivities (and those perceptions come from what it has read). It has vast and almost instant recall and can weigh an enormous amount of sometimes competing information. It can even respond in brilliant ways. But it is still just a child-like machine.

19 posted on 10/28/2024 10:51:25 AM PDT by RoosterRedux (Thinking is difficult. And painful. That’s why many people just adopt ideologies.)
