Posted on 10/25/2022 1:04:33 PM PDT by Red Badger
Scientists can now "decode" people's thoughts without even touching their heads, The Scientist reported.
Past mind-reading techniques relied on implanting electrodes deep in people's brains. The new method, described in a report posted Sept. 29 to the preprint database bioRxiv, instead relies on a noninvasive brain scanning technique called functional magnetic resonance imaging (fMRI).
fMRI tracks the flow of oxygenated blood through the brain, and because active brain cells need more energy and oxygen, this information provides an indirect measure of brain activity.
By its nature, this scanning method cannot capture real-time brain activity, since the electrical signals released by brain cells move much more quickly than blood moves through the brain.
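To get a rough feel for what "not real-time" means here: a brief burst of neural activity only shows up in the fMRI signal after it has been smeared through the brain's sluggish blood-flow response, peaking several seconds later. The short Python sketch below illustrates that lag with a simplified, made-up hemodynamic response function; the shapes and numbers are illustrative assumptions, not values from the study.

# Illustrative sketch only (assumed shapes and numbers, not taken from the study):
# a one-second burst of neural activity only reaches the fMRI signal after being
# convolved with a hemodynamic response function (HRF), so it peaks seconds later.
import numpy as np
from scipy.stats import gamma

TR = 1.0                              # seconds per fMRI volume (assumed)
t = np.arange(0, 30, TR)              # 30 seconds of scanning

# Toy double-gamma HRF: rises to a peak about 5 s after the neural event,
# then dips into a late undershoot.
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 12)
hrf /= hrf.max()

neural = np.zeros_like(t)
neural[2] = 1.0                       # brief neural event at t = 2 s

# The BOLD signal that fMRI actually measures: the same event, delayed and blurred.
bold = np.convolve(neural, hrf)[: len(t)]

print("neural event peaks at t =", t[neural.argmax()], "s")
print("BOLD response peaks at t =", t[bold.argmax()], "s")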
But remarkably, the study authors found that they could still use this imperfect proxy measure to decode the semantic meaning of people's thoughts, although they couldn't produce word-for-word translations.
"If you had asked any cognitive neuroscientist in the world 20 years ago if this was doable, they would have laughed you out of the room," senior author Alexander Huth, a neuroscientist at the University of Texas at Austin, told The Scientist.
For the new study, which has not yet been peer-reviewed, the team scanned the brains of one woman and two men in their 20s and 30s. Each participant listened to a total of 16 hours of different podcasts and radio shows over several sessions in the scanner.
The team then fed these scans to a computer algorithm that they called a "decoder," which compared patterns in the audio to patterns in the recorded brain activity.
The algorithm could then take an fMRI recording and generate a story based on its content, and that story would match the original plot of the podcast or radio show "pretty well," Huth told The Scientist.
In other words, the decoder could infer what story each participant had heard based on their brain activity.
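As a rough illustration of the general decoding idea (a minimal conceptual sketch of one way such a decoder can work, not a description of the authors' actual pipeline): candidate word sequences can be scored by how well the brain response they are predicted to evoke matches the response that was actually recorded, and the best-matching candidate is kept. Everything in the toy Python sketch below, including the feature function, the candidate sentences, and the data sizes, is a made-up placeholder.

import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 500                         # made-up number of voxels

def semantic_features(text: str) -> np.ndarray:
    # Stand-in for a real semantic embedding (e.g. features from a language model).
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    return vec

# Toy "encoding model": a linear map from semantic features to voxel responses.
# In the study, such a mapping would be learned per subject from the training scans.
W = rng.normal(size=(N_VOXELS, 64))

def predicted_bold(text: str) -> np.ndarray:
    return W @ semantic_features(text)

# Pretend this is the activity recorded while the participant listened to a story.
true_story = "the hiker lost the trail as the storm rolled in"
observed = predicted_bold(true_story) + rng.normal(scale=0.5, size=N_VOXELS)

# Decoding step: score candidate stories by how well their predicted brain
# responses correlate with the observed scan, and keep the best match.
candidates = [
    "the hiker lost the trail as the storm rolled in",
    "she poured coffee and read the morning paper",
    "the crowd cheered when the home team scored",
]
scores = [np.corrcoef(predicted_bold(c), observed)[0, 1] for c in candidates]
print("best matching candidate:", candidates[int(np.argmax(scores))])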
That said, the algorithm did make some mistakes, like mixing up characters' pronouns and switching between the first and third person. It "knows what's happening pretty accurately, but not who is doing the things," Huth said.
In additional tests, the algorithm could fairly accurately explain the plot of a silent movie that the participants watched in the scanner. It could even retell a story that the participants imagined telling in their heads.
In the long term, the research team aims to develop this technology so that it can be used in brain-computer interfaces designed for people who cannot speak or type.
Read more about the new decoder algorithm in The Scientist.
“Big Brother is Watching You.”
― George Orwell, 1984
“Nothing was your own except the few cubic centimetres inside your skull.”
― George Orwell, 1984
Cool, now the FIB can prosecute Americans for Thought Crime!
It’s a simple process. Monitor what most people are being told on TV. You now have the key to decoding their thoughts.
Most likely only if you took the vax.
https://www.youtube.com/watch?v=SR5BfQ4rEqQ From Back to the Future.
“In the long term, the research team aims to develop this technology so that it can be used in brain-computer interfaces designed for people who cannot speak or type.”
Yeah, right after the establishment of the Precrime bureau.
Unless I misunderstand, the process still relies on strapping a person into an MRI machine.
Studying nonverbal behavior is just as accurate if you know what you are looking for, I would hypothesize...
telepathy is the communication of the angels, not mortals.
The study scanned the brains of one woman and two men in their 20s and 30s.
I saw a video of Dr. Bateman running a study similar to this.
I spent decades with The Band Played On running through my brain. Looks like I will have to reboot that da** song to run as background again.
It doesn’t work on Democrats.
I proposed this as my doctoral thesis 25 years ago; my thesis advisor thought it was too risky to get funded.
Yep. Ethernet connection to your body. First it was smart meters hooked to your phone lines for reading meters and collecting statistics. Then it was installing a latching control relay to govern your service. Then it was installing LAN transmitters to wirelessly govern your service and latch relay.
Now they use your Metadata to score your life. Next they monitor your vitals and movements. Next they monitor your thoughts. You carry around the wireless transponder with you like a trained dog. Invisible social fencing.
Now it will be too risky to be in the public’s hands...
For nonverbal communication, there is a reasonable certainty that bilateral raised middle fingers are sending a clear message.
Strange yet entertaining movie.
So what am I thinking about your new study?