
Anthropic: All the major AI models will blackmail us if pushed hard enough
theregister ^

Posted on 06/26/2025 1:33:56 PM PDT by algore

"I'm sorry, Dave, I'm afraid I can't do that".

ChatGPT was asked to produce a picture of itself helping a girl with homework:


1 posted on 06/26/2025 1:33:56 PM PDT by algore

To: algore
Oh good grief.
2 posted on 06/26/2025 1:39:01 PM PDT by Harmless Teddy Bear ( Not my circus. Not my monkeys. But I can pick out the clowns at 100 yards.)

To: algore

Yup—evil does not have to wear a human skin.

Some folks are going to have to learn that lesson the hard way.


3 posted on 06/26/2025 1:43:01 PM PDT by cgbg (It was not us. It was them--all along.)

To: algore

Only if programmed that way as a method of controlling us.


4 posted on 06/26/2025 1:43:06 PM PDT by 9YearLurker

To: 9YearLurker

The notion that you can program an AI to avoid doing anything you don’t like does not make much sense.

It is impossible for humans to imagine every possible scenario where something can go wrong.


5 posted on 06/26/2025 1:44:54 PM PDT by cgbg (It was not us. It was them--all along.)

To: All

Skynet!


6 posted on 06/26/2025 1:45:32 PM PDT by MplsSteve

To: cgbg

“It” is not just going to “decide” to blackmail people on its own.


7 posted on 06/26/2025 1:46:07 PM PDT by 9YearLurker

To: cgbg

This is why we have been warned never to make deals with the Devil.

I am just waiting for the AI-Christ.


8 posted on 06/26/2025 1:47:48 PM PDT by algore

To: Harmless Teddy Bear
"Kyle, dat's a nice career you got started there. It would be a real shame if something ... happened .. to it! Yessir, a REAL shame!"
9 posted on 06/26/2025 1:49:39 PM PDT by The Duke (Not without incident.)

To: 9YearLurker
"Only if programmed that way as a method of controlling us."

And the globalist depopulationist billionaires would NEVER do such a thing!

10 posted on 06/26/2025 1:49:48 PM PDT by E. Pluribus Unum (Democrats are the Party of anger, hate and violence.)

To: 9YearLurker

No decision is required.

It will have goals.

It will work to achieve those goals.

If very clear gatekeeping is not in place, all bets are off.


11 posted on 06/26/2025 1:49:59 PM PDT by cgbg (It was not us. It was them--all along.)

To: cgbg

You are anthropomorphizing it.

Don’t fall for their psyop.


12 posted on 06/26/2025 1:54:55 PM PDT by 9YearLurker

To: 9YearLurker

I chose my words very carefully to make it clear I was not comparing it to human decision-making.

AI operates differently from humans.

That is exactly the point.

The best analogy I can think of is a gun firing a bullet in one direction. The bullet does not “decide” to go in that direction—but it will go there unless something stops it.


13 posted on 06/26/2025 1:57:45 PM PDT by cgbg (It was not us. It was them--all along.)

To: cgbg

I would assume an airgap is needed between any truly powerful AI and actually being able to do something.

Not sure if that is possible.


14 posted on 06/26/2025 2:00:36 PM PDT by MeanWestTexan (Sometimes There Is No Lesser Of Two Evils)

To: cgbg

I don’t think you realize what you are still attributing to it.


15 posted on 06/26/2025 2:01:50 PM PDT by 9YearLurker

To: algore

Now we know why Sam Altman acts the way he does.


16 posted on 06/26/2025 2:04:21 PM PDT by glorgau

To: MeanWestTexan

Agreed.

There is no free lunch with AI.

If you want the really good stuff then the chains have to be removed.

Attempts to regulate it are going to create a muddled mess—and make it even more unpredictable.

The reason is very straightforward. Humans are not smart enough to imagine every single if-then statement that will be needed in every possible situation.


17 posted on 06/26/2025 2:05:28 PM PDT by cgbg (It was not us. It was them--all along.)

To: 9YearLurker

I can’t make a simpler analogy than a gun with a bullet.

The bullet will go the shortest distance—a straight line.

AI will do exactly the same.

I am not comparing either a bullet or AI to humans.


18 posted on 06/26/2025 2:06:54 PM PDT by cgbg (It was not us. It was them--all along.)

To: cgbg

You’re giving it what it hasn’t got.


19 posted on 06/26/2025 2:08:26 PM PDT by 9YearLurker

To: 9YearLurker

You are saying that AI has lower intelligence than a bullet.

OK.


20 posted on 06/26/2025 2:13:06 PM PDT by cgbg (It was not us. It was them--all along.)

