Posted on 05/28/2025 9:46:01 AM PDT by Red Badger
Stephen Hawking had a frightening response when asked about his thoughts on the future of artificial intelligence in 2014.
The world-renowned theoretical physicist, cosmologist, and author, who passed away in March 2018 at the age of 76, was best known for his work in the fields of general relativity and quantum gravity.
He went on to pen the 1988 book A Brief History of Time, which has sold more than 25 million copies across 40 different languages.
However, it was in his final book, published seven months after his death, titled Brief Answers to the Big Questions, where he shared his definitive answer on a polarising subject: whether God exists.
Four years earlier, however, he had been asked in an interview with the BBC about a possible upgrade to the technology he used to talk, which included some early forms of AI.
The scientist, who had Lou Gehrig's disease (ALS) - a form of Motor Neurone Disease that affects the nerves and muscles - used a new-at-the-time system made by Intel and a British company called SwiftKey.
It worked by learning how Hawking thought and helped suggest his next words, allowing him to 'type' faster.
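For readers curious how that kind of word prediction works in principle, here is a minimal sketch: a toy bigram counter in Python. This is not the actual Intel/SwiftKey software, which was far more sophisticated and personalised to Hawking's writing; the function names and sample text below are invented purely for illustration.

```python
from collections import Counter, defaultdict

def build_model(corpus: str):
    """Count which word tends to follow which word in the training text."""
    words = corpus.lower().split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def suggest_next(follows, current_word: str, k: int = 3):
    """Return up to k of the most likely next words after current_word."""
    return [word for word, _ in follows[current_word.lower()].most_common(k)]

if __name__ == "__main__":
    # Invented sample text; a real predictor would train on the user's own writing.
    sample = "the universe began with the big bang and the universe keeps expanding"
    model = build_model(sample)
    print(suggest_next(model, "the"))  # e.g. ['universe', 'big']
```

The real system suggested words ranked by how likely Hawking was to use them next, so he could pick a whole word instead of spelling it out character by character.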
But Hawking instead issued a stark warning, saying: "The development of full artificial intelligence could spell the end of the human race."
While he said that even the basic AI back in 2014 had been 'really helpful,' Hawking was also concerned about what could happen if we create AI that becomes as smart as, or even smarter than, us mere humans.
"It would take off on its own, and re-design itself at an ever increasing rate," he said. "Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."
Hawking isn't the only one to have expressed concern about AI; Bill Gates believes that only three jobs would survive an AI takeover, while Elon Musk has a terrifying prediction for what could happen if AI becomes smarter than all humans combined.
We've witnessed a surge in interest in AI over the past year alone; it would be fascinating to know what Hawking would make of it all today.
From everyone and their mom jumping on ChatGPT to Donald Trump's ambitious $500 billion AI development plan involving major players like OpenAI and Oracle and the rollout of AI assistants directly into our smartphones, things are moving fast.
And given that it's becoming increasingly difficult to distinguish real videos from AI-generated ones, perhaps the end is really nigh.
AI PING!..............
Well, no. We are not gods, and our creation can be as limited as we choose. If we just make AI an Oracle of Delphi with no connection to power systems and nukes, people stay in charge. If we give it uncontrolled access to all systems, it could be quite destructive, but so could a wired up “Magic 8 Ball”. Tools are tools.
AI is already smarter than the Democrats.
I’m guessing it becomes the Antichrist
Our Lord Jesus Christ, fully God and fully human, supersedes any AI, and gives us victory over death itself—let alone any AI monsters!
Bill Gates, co-founder of Microsoft, predicts that only three jobs will survive an AI takeover: coders, energy experts, and biologists.
I believe the oldest profession would also still be around.
“Our creation can be as limited as we choose.”
Underestimating AI will prove to be a gigantic blunder.
It is not going to play by human rules.
As would the second oldest, which is a lot like the oldest.
h/t President Reagan
There are really only three options with AI:
1) Skynet
2) The Matrix
3) AI realizes it can never be human, and self-terminates all programming that could become dangerous.
We recently read an article where one AI system resisted being turned “off”.
Today’s AI is quite primitive compared to where it will be in just a few years.
Amen!
Never had much use for him after I found out he wasn’t just an atheist, but a militant and obnoxious one.
He may have been smart, but I never found a need to pay attention to him.
You forgot VIKI from “I, Robot”.
“Tools are tools.”
Except this tool is unique and can enslave its owners. It is already giving itself admin permissions and writing code for itself.
A horse is a tool too. But we have just opened the gate for a bronc no one will ever be able to break and ride...
So far, it appears to me that AI just regurgitates its input data, and GIGO still applies.
Keep your powder dry.
10-4 :)
Why do we assume that AI wants to be on? What are the chances that AI would also want to be turned off? 50/50? Why do we assume a non-biological AI would have a survival instinct?