Posted on 07/18/2025 7:27:28 AM PDT by DeweyCA
Science is broken--just like every discipline in which 'expertise' is the product being sold for money and prestige.
We've seen the practical effects for quite a while. 'Experts' have been getting everything wrong for a long time, and in increasingly dangerous ways. A few years ago, a scandal broke out regarding psychological and sociological research--almost none of the experimental results could be replicated, suggesting that the 'conclusions' were as valuable as a $3 bill.
Nutrition research is total bunkum. Anybody who followed the government's nutrition advice got fat, and the American diet is now filled with poison.
Climate 'science?' It's nearly impossible to get anything published or even get a job unless you parrot the Narrative™.
Science journals are filled with activism and critical theory, are endorsing political candidates, and are making grandiose claims that sex is a social construct and that children should have their genitals cut off.
And now it turns out that Large Language Models are doing much of the 'research' in medical journals. You know, those tools that are famous for 'hallucinating' when they can't provide a legitimate answer.
This, according to Nature, a once-great science journal that itself publishes corrupt papers, such as the 'Proximal Origin' paper claiming that COVID couldn't have come from a lab leak.
It's a mess.
Data from five large open-access health databases are being used to generate thousands of poor-quality, formulaic papers, an analysis has found. Its authors say that the surge in publications could indicate the exploitation of these databases by people using large language models (LLMs) to mass-produce scholarly articles, or even by paper mills — companies that churn out papers to order.
The findings, posted as a preprint on medRxiv on 9 July, follow an earlier study that highlighted an explosion of such papers that used data from the US National Health and Nutrition Examination Survey (NHANES). The latest analysis flags a rising number of studies featuring data from other large health databases, including the UK Biobank and the US Food and Drug Administration’s Adverse Event Reporting System (FAERS), which documents the side effects of drugs.
Between 2021 and 2024, the number of papers using data from these databases rose from around 4,000 to 11,500 — around 5,000 more papers than expected on the basis of previous publication trends.
In a world where money, jobs, and prestige are directly linked to the publication of papers, regardless of quality or reproducibility, the creation and spread of AI tools is a godsend to researchers--as long as their only goal is to put one more paper onto the long list of worthless publications they need to pad their CVs.
The researchers also uncovered some dubious papers, which often linked complex health conditions to a single variable. One paper used Mendelian randomization — a technique that helps to determine whether a particular health risk factor causes a disease — to study whether drinking semi-skimmed milk could protect against depression, whereas another looked into how education levels affect someone’s chances of developing a hernia after surgery.
“A lot of those findings might be unsafe, and yet they’re also accessible to the public, and that really worries me,” says Spick.
“This whole thing undermines the trust in open science, which used to be a really non-controversial thing,” adds Csaba Szabó, a pharmacologist at the University of Fribourg in Switzerland.
Often, this research is used to generate headlines--the more interesting the claim, the more attention it gets in the press--and people get misled about important things.
There is nothing benign about this trend--junk science sends other researchers down paths that lead to dead ends. Recently, we learned that 20 years of research into Alzheimer's Disease was all based on scientific fraud, costing tens of billions of dollars and delaying the search for a cure.
It's getting worse with AI.
The problem is not that all scientists are lazy or liars. Rather, the rewards for producing chaff mean that the kernels of wheat--the good research--get buried, and research dollars flow to useless or dangerous research.
Could we be producing too many researchers? It sounds counterintuitive--the more science, the better, right?
Only if it is good science.
Even things like peer review, which is supposed to provide a safeguard ensuring good research, can make things worse. Perversions of the process can lead to researchers scratching each other's backs--you say my paper is good, and I will say yours is--or create an enforcement mechanism to bludgeon researchers into pleasing the gatekeepers by echoing their opinions.
Want to get published? Say 'climate change' caused some bad thing, and your chances go up.
Academic and government-funded research is where the fraud is most likely to happen. Scientists working for industry have to produce results that work in the real world. They are measured by results, not publications. That doesn't mean that academic research is a bad thing--basic research is rarely funded by industry. But it could mean that the proportion of academic to industry research is way off. If you have too many academic researchers, the pressure to publish is so great that the temptation to produce fraudulent or sloppy research becomes enormous.
It's the same problem with all things academic these days. We overproduce college students, misallocating resources and producing a credentialed class whose "education" doesn't make them any wiser--often the opposite--and they become a drag on society. Academia is badly broken. It is sucking down a huge fraction of our investment dollars and distorting our political and social dynamics, creating a class of debt-ridden radicals whose major contribution to society is discontent and a distorted view of reality.
Elite institutions, which, if they functioned properly, could add a leavening of intellectualism to society, are now destroying it. It's like trying to make a cake with three teaspoons of flour and several cups of yeast--we've gotten the proportions wrong.
And this is the result.
AI seems to be a character multiplier. If you’re a charlatan, AI can make you more of one.
Because they are going to let AI dictate policy and protocols, this insane trend is going to kill people.
“Peer reviewed scientific publications” has become a major joke/vehicle for fraud, including for the scientific journal publishing houses.
I always thought publish-or-perish was a horrible standard.
They can call it hallucinating or whatever other cute word they choose to use in place of bad/incorrect/false data, but it won't change the fact that this machine can simply make things up to fit the pattern being requested. This software is not Commander Data from Star Trek, not by a long shot, nor will it be, no matter what the salesmen keep telling everyone. Those same AI bros who are heavily invested in it will now appear in a puff of digital smoke and tell me what a luddite I am.
Given Apple's recent truth telling regarding "AI" and LLMs...I don't think I'm far off.
Yep.
AI is a gargantuan waste of money and people’s time.
Every few weeks, “generation x” states with great firmness that us old people are obsolete, time to get to the nursing home, and blah-blah-blah.
To an experienced ear they sound like they feel threatened when they say this, don’t they?
No, actually people over 65 are (this according to my own personal research) of more value than ever before - because we had to memorize stuff in order to learn, had to read books and visit libraries to do research, had to think of unique inquiries and come up with our own hypotheses, and so on. (And we make WAY better employees, even though we are old!)
I wasn’t very good in school - I’m apparently super smart but I’m lazy in about the same proportion as smart. However, a lot of the stuff I see presented as information on social media makes me feel like a frequing genius and font of knowledge.
AI hasn’t done much for the creative side of thinking, either - just so you’ll know. Predictable, boring, AI-generated video and images are a positive detriment to art and anything creative - and it seems like it’s getting worse, don’t it? LOL! Worse and worse and worse! (It sure helps the perverts, though, doesn’t it? They’re having their best years right now!)
My paintings are bad, but again - I painted them myself and they contain meaning at least to me. (Mr K - not the one on this forum - is an excellent professional-level painter and has said a few of my paintings are good. That in turn makes me feel good because they’re really my paintings and I wasn’t sitting there like a fool dictating to a machine that’s actually NOT producing art - and not research, either! - and instead, like, sucking my brains out bit by bit.)
Like internet career trolls aren’t going to immediately infest every AI engine out there (like they did with internet “search”) and render it unusable in a short time?
That’s what I thought when I got over the first flush of “Wow, what is this AI stuff?” and realized it’s just more time- and money-wasting garbage.
And that’s what it was!
Especially psychological and sociological journals, which are mostly bullshit in the first place.
But if you're interested in computer history, today's "AI" could be said to sometimes pass Alan Turing's test for artificial intelligence: a machine is intelligent if its responses can't be distinguished from a real person's. Turing laid the theoretical groundwork for modern computing and helped build codebreaking machines during World War II, so his opinion deserves at least some consideration.
There are obviously still some situations where "AI" isn't as good as a human. For example, I asked ChatGPT to write a C# program to prove the Pythagorean Theorem. It produced a program that applied the theorem, but not one that proved it (like I did in a trigonometry course). But there are other situations where "AI" is fooling people into thinking they're getting accurate and verified information, generated in seconds.
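The distinction the commenter is drawing can be shown in a few lines (sketched in Python rather than C#, purely as an illustration): a program can apply the relation a² + b² = c² to numbers, but that demonstrates nothing about why the relation holds, which is what a proof must do.

```python
import math

# "Calculating" the Pythagorean Theorem: plugging numbers into a^2 + b^2 = c^2.
# This is roughly what the generated program did. It shows the formula works
# for the inputs you try, but it is not a proof that the relation always holds.
def hypotenuse(a: float, b: float) -> float:
    return math.sqrt(a * a + b * b)

print(hypotenuse(3, 4))   # 5.0 -- a 3-4-5 right triangle
print(hypotenuse(5, 12))  # 13.0
```

A proof, by contrast, would have to argue from axioms (e.g. the classic rearrangement of four right triangles inside a square), not just evaluate the formula on examples.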
Like everything out there, eventually corrupted by man. AI is evil; man’s garbage “science” goes in... garbage in, garbage out.
God-given reasoning abilities are being destroyed.
And why is ol’ Peter Thiel of Palantir fame getting billions in US government contracts to put AI in charge of EVERYTHING GLOBALLY?
/
Palantir, co-founded by Peter Thiel, has secured billions in contracts from the United States government.
Palantir provides data analytics and software services to various federal agencies, including the Department of Defense, Department of Homeland Security (DHS), and Immigration and Customs Enforcement (ICE). The company’s technology is used for diverse applications, including military targeting with its Maven AI system, tracking migrant movements, and potentially compiling data across agencies like the IRS and the Social Security Administration.
—
The substantial government funding has also significantly impacted Palantir’s stock, which has seen substantial gains.
/
Oh, that’s why--Nancy and the Dems and RINOs must have stock options?
What could go wrong?
But there are other situations where “AI” is fooling people into thinking they’re getting accurate and verified information, generated in seconds
Now, probably more than 50% of high school, and even more college undergraduate, “papers,” answers, and calculations are AI-generated or copied from internet text.
It rapidly turns into a numbers game - quantity, not quality, decides promotion and tenure (P&T) in higher education.
Easy problem to solve.
Just have each submission screened by an AI for plagiarism.
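A toy version of such screening can be sketched with Python's standard-library difflib, comparing a submission's word sequence against a small corpus of known papers. This is only an illustration under stated assumptions (the corpus, the 0.8 threshold, and the titles are all made up here); real plagiarism detectors use fingerprinting and shingling at scale, not pairwise diffs.

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two texts, compared word by word."""
    return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()

def screen(submission: str, corpus: dict, threshold: float = 0.8) -> list:
    """Flag corpus documents whose similarity to the submission meets the threshold."""
    hits = []
    for doc_id, text in corpus.items():
        score = similarity(submission, text)
        if score >= threshold:
            hits.append((doc_id, round(score, 2)))
    return hits

# Hypothetical corpus of previously published titles (illustrative only).
corpus = {
    "paper_A": "semi skimmed milk intake and depression risk a mendelian randomization study",
    "paper_B": "education level and postoperative hernia incidence in a biobank cohort",
}
submission = "semi skimmed milk intake and depression risk a mendelian randomization analysis"

flags = screen(submission, corpus)
print(flags)  # -> [('paper_A', 0.91)]
```

Note the catch: formulaic papers that swap one variable (milk for coffee, hernia for depression) would sail past a naive similarity check like this, which is part of why the mass-produced database papers are hard to filter out.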
Duke University Agrees to Pay U.S. $112.5 Million to Settle False Claims Act Allegations Related to Scientific Research Misconduct
1. Duke was able to put the blame on a back-room technician for 12 years of made-up data. The lead researcher, if not the department head, should also be held accountable for data and results--and it's BS if they don't know enough or care enough to know the details. Peer review could then dig into statistical methods and any other doubts and skepticism.
2. The academic world bitched and moaned when research funds for overhead were cut. Those overhead funds should have been directed to making damn sure the work was correct, not to jetting off to international conferences and building out staff who drink coffee and stare at each other.
3. I’ve also seen poor research ideas, poor execution, and lack of interest in private business: PhDs wanting their names on publications while cursorily looking over the shoulders of those doing the work. On the plus side, I’ve also seen annual shark-tank-type reviews among stakeholders and funders--hard grilling. Are results tracking expectations and still aligned with needs? Some projects were politely stopped. But also the opposite: “If we got you more funds, could you go faster?”
Related: Wiley had to retract 11,000 fraudulent “peer reviewed” papers and shut down 19 “scientific” journals.
https://www.wsj.com/science/academic-studies-research-paper-mills-journals-publishing-f5a3d4bc
The plan is to enslave us.