Posted on 10/01/2024 8:17:32 AM PDT by MNDude
The problem was not recognizing AI-generated or AI-revised text. At the start of every semester, I had students write in class. With that baseline sample as a point of comparison, it was easy for me to distinguish between my students’ writing and text generated by ChatGPT. I am also familiar with AI detectors, which purport to indicate whether something has been generated by AI. These detectors, however, are faulty. AI-assisted writing is easy to identify but hard to prove.
As a result, I found myself spending many hours grading writing that I knew was generated by AI. I noted where arguments were unsound. I pointed to weaknesses such as stylistic quirks that I knew to be common to ChatGPT (I noticed a sudden surge of phrases such as “delves into”). That is, I found myself spending more time giving feedback to AI than to my students. So I quit.
(Excerpt) Read more at time.com ...
There are a million reasons to quit the government schools.
I suppose ChatGPT is one of them.
In my English composition class, you wrote in class. Of course we didn’t have cell phones or AI, but writing under the watchful gaze of the instructor prevented any plagiarism. Home assignments were a different story.
“hard to prove”
Please go to the university website and download Keystroke Recorder to the PC you will be typing papers on.
Hmmmm.
Why are we continuing to indulge students whose first language is not English? That is a timeworn and outdated excuse in this era! Generations of immigrants from all over the world went through the same challenges and eventually learned and mastered English in the United States and other English-speaking countries. It is time we stop lowering expectations and insist on higher standards.
Quick AI redo:
This fall marks the first time in nearly 20 years that I’m not returning to the classroom. Throughout my career, I taught writing, literature, and language, primarily to university students. My decision to leave was largely influenced by the rise of large language models (LLMs) like ChatGPT.
As historian Lynn Hunt has pointed out, writing is “not the transcription of thoughts already consciously present in [the writer’s] mind.” Instead, writing is a process deeply intertwined with thinking. During my graduate studies, I spent months trying to piece together my dissertation, ultimately realizing that writing was the key to solving the puzzle. Writing is hard work; it can be intimidating. Unfortunately, many of my students were no longer willing to endure that discomfort, drawn instead to the ease of AI.
In my last position, I taught academic writing to doctoral students at a technical college. Many of my graduate students, particularly those studying computer science, were well-versed in the mechanics of generative AI. They recognized its limitations—acknowledging that LLMs can hallucinate, create false citations, and are not reliable for original research. Despite this understanding, many students still relied heavily on generative AI, with some openly admitting to using ChatGPT to draft their articles after outlining their research.
As an experienced educator, I employed best practices in teaching. I structured assignments, explored ways to integrate AI into my lessons, and designed activities to highlight its limitations. I reminded students that ChatGPT can misinterpret text, produce biased information, and often doesn’t generate high-quality writing. Still, they continued to use it.
In one class activity, students wrote a paragraph, revised it with ChatGPT, and compared the output to their original text. However, most students lacked the skills to analyze the nuances of meaning or evaluate style effectively. One PhD student even remarked, “It makes my writing look fancy,” when I pointed out weaknesses in the AI-generated text.
Students also relied heavily on paraphrasing tools like Quillbot. Effective paraphrasing requires a deep understanding of the material, and recent cases of “duplicative language” show that it’s a challenging task. It’s no surprise that many students are tempted by AI-powered paraphrasing tools, which often result in inconsistent writing styles, fail to prevent plagiarism, and allow students to bypass genuine comprehension. Such tools can only be beneficial if students already possess a solid grasp of writing.
Outsourcing their writing to AI deprives students of the chance to engage deeply with their research. Ted Chiang succinctly captures this in his article on art and generative AI: “Using ChatGPT to complete assignments is like bringing a forklift into the weight room; you will never improve your cognitive fitness that way.” The myriad decisions writers make—regarding syntax, vocabulary, and style—are as crucial as the underlying research itself.
While generative AI can serve as a democratizing tool, especially for non-native English speakers whose writing often contains grammatical errors, it can also alter vocabulary and meaning, even when the intent is simply to correct grammar. My students struggled to recognize these subtle shifts. I found it challenging to convey the importance of stylistic consistency and developing their unique voices as research writers.
The issue wasn’t identifying AI-generated text. Each semester, I asked students to write in class, allowing me to establish a baseline for comparison. I could easily distinguish between their writing and that generated by AI. While I was familiar with AI detectors that claim to identify AI-generated content, these tools are often unreliable. AI-assisted writing can be easy to spot but difficult to prove.
Consequently, I spent countless hours grading work I suspected was AI-generated. I pointed out flaws in arguments and noted stylistic quirks typical of ChatGPT, leading me to give more feedback to AI than to my students.
So I decided to quit.
The best educators will need to adapt to the presence of AI, and in many ways, this adaptation can be positive. Teachers should shift away from rote assignments and simple summaries, finding ways to foster critical thinking and help students understand that writing is a tool for generating ideas and clarifying methodologies.
However, these lessons require students to be willing to embrace the discomfort of uncertainty. They need to trust in their cognitive abilities as they write and revise. Unfortunately, with few exceptions, my students were not prepared to venture into that discomfort or stay there long enough to discover the transformative power of writing.
The students are not sophisticated enough to use AI as a tool. They just use it to cheat. In my day I had to go to the library, read, research, check the footnotes and read those footnote sources too. AI can speed that up too. But just to say “Write me a book report on Huckleberry Finn” is no good. Oh wait I forgot. That book has been “banned”.
I’m not going back to school this fall for the first time in almost 20 years. I have worked as a writing, literature, and language instructor for university students most of my career. A major factor in my decision to quit was the emergence of large language models (LLMs) such as ChatGPT.
Writing is “not the transcription of thoughts already consciously present in [the writer’s] mind,” as historian Lynn Hunt has noted. Rather, writing is a process that is closely related to thought. I struggled for months in graduate school to put my thesis together before figuring out that the writing itself was the key. Writing is a laborious and sometimes frightening task.
Dear Student:
I think you used an AI program to generate your paper. Please take the device you created your paper on to Dave DiskDriveChecker in the Computer Science Department.
Shouldn’t one know how to write before he becomes a doctoral student?
Collegiate systems like Turnitin assign papers a plagiarism score. If AI had created the paper, it would probably score too high to be accepted.
There are specific things doctoral candidates do when writing papers.
Students’ reading and reading comprehension are not the problem; the problem is that they are stuck in a world where a computer delivers the commercially acceptable answer for a fraction of a penny per inquiry. Go back to oral exams, where the right students would pass and the wrong students would not.
Oral exams will be the way to go.
If I were a teacher, I would sit down with the student for 10-15 minutes and ask them questions based on what they wrote, and if they couldn’t answer the questions, that’s how I would find out they cheated.
“There are specific things doctoral candidates do when writing papers.”
Research used to be most expensive in the tabulation (computation); it certainly was not, in the past, the kind of broad-spectrum metadata study that AI is more or less the end result of. If the findings run 1,000 to 1, the only acceptable paper is the one that follows the 1,000. Honestly, give top degrees only to people who do new work, whether their conclusions turn out right or wrong.
Step two, however, is where that will fail them. A week or so later, I would give everyone in the class a test. The test would be three essay questions asking each student to expound upon a particular point in their report: very pointed questions, expecting very thorough answers that could only be given if the student has a good grasp of the subject material and can provide a competent and thoughtful response.
If the student just used ChatGPT to phone in the report, step two would catch them out. If they fail the test in step two, then I know the student doesn't have a clue at all what they're talking about or any grasp of their subject material, and thus will receive zero for the whole thing.
Exactly.
Sure! Here’s an original message that might resonate with users on Free Republic, where conservative political views and topics like individual liberty, limited government, and traditional values are often emphasized:
Thread Title: The Fight for Freedom Never Ends – Time to Reclaim Our Country!
Fellow Patriots,
We are living in unprecedented times. Our founding fathers envisioned a nation built on individual liberty, self-reliance, and a limited government that serves the people, not rules them. Yet, everywhere we turn, we see these principles under attack. From endless government overreach to woke ideology infiltrating our schools, media, and corporations, it’s clear that the fight for freedom is far from over.
But let’s be clear – we are NOT powerless. The silent majority is awakening! More Americans are standing up against the erosion of our constitutional rights and pushing back against the globalist agenda that threatens to undermine our sovereignty. Whether it’s protecting the Second Amendment, defending our freedom of speech, or stopping the indoctrination of our children, now is the time to take action!
The 2024 election will be a turning point. This is our moment to restore America First policies, uphold law and order, and ensure that patriots – not elites – determine the future of our great nation.
Let’s stand united in this battle for the heart and soul of America. Freedom isn’t free, but together we can preserve the values that made this country a beacon of hope for the world.
Are you ready to reclaim our country?
Stay vigilant, stay vocal, and never back down!
#MAGA #AmericaFirst #PatriotsUnite #ConstitutionMatters #ResistTyranny
This post touches on themes of freedom, patriotism, and a rallying call for action, all of which are often topics of interest to users on Free Republic.