Posted on 01/26/2023 9:10:17 PM PST by SeekAndFind
Does this mean that all of the writers at Hallmark are no longer needed?
I saw another news article on ChatGPT passing an MBA exam given by a Wharton professor. Also, Professor Jonathan Choi of the University of Minnesota Law School gave ChatGPT the same test faced by students, consisting of 95 multiple-choice questions and 12 essay questions.
In a white paper titled "ChatGPT Goes to Law School," published on Monday, he and his co-authors reported that the bot scored a C+ overall.
Maybe I should use ChatGPT to post on Free Republic for me /s
I understand your point.
I am not a data scientist, but if this technology can be created, similar technology can be created to discern between human-created content and machine-generated content. Perhaps not technically 'steganography,' but something similar.
A previous poster mentioned using this tool to paraphrase content, then rewriting it using actual research and composition. That makes sense, but it would, I believe, defeat the reason one attended college in the first place.
OpenAI guest researcher Scott Aaronson said at a December lecture that the company was working on creating watermarks for the outputs so that people could see signs of a machine-generated text.
++++++++++++++++++
It was in the original article. I should have read further.
Did you really?
It didn't create the response until prompted to do so by your intervention.
[I would also ask: if 2 students in the same class used it, would the answers or papers written be the same?]
The problem is our younger generations haven’t just collapsed academically: They’re squat ethically, too.
Bkmk
and yet, we have The Plagiarist In Chief residing in The White House.
AND...
we have a National Holiday honoring a Whitewashed Character/Philanderer/Communist-Sympathizer who plagiarized 60% of what he wrote and said ~ Michael King
Ummm - the AI goes out and gets info from what's already written. I'd guess that since the internet got so full of data, 95% of all papers are cribbed from others and just reworded a bit so they're not exact copies.
You didn’t write it. You turned it in with your name on it
I looked up his blog, and he wrote an interesting, but long, post on how they could watermark text. The gist of it is that word-sequence choice could be set in a way that would probabilistically only come from ChatGPT.
This is his explanation:
—
How does it work? For GPT, every input and output is a string of tokens, which could be words but also punctuation marks, parts of words, or more—there are about 100,000 tokens in total. At its core, GPT is constantly generating a probability distribution over the next token to generate, conditional on the string of previous tokens. After the neural net generates the distribution, the OpenAI server then actually samples a token according to that distribution—or some modified version of the distribution, depending on a parameter called “temperature.” As long as the temperature is nonzero, though, there will usually be some randomness in the choice of the next token: you could run over and over with the same prompt, and get a different completion (i.e., string of output tokens) each time.
So then to watermark, instead of selecting the next token randomly, the idea will be to select it pseudorandomly, using a cryptographic pseudorandom function, whose key is known only to OpenAI. That won’t make any detectable difference to the end user, assuming the end user can’t distinguish the pseudorandom numbers from truly random ones. But now you can choose a pseudorandom function that secretly biases a certain score—a sum over a certain function g evaluated at each n-gram (sequence of n consecutive tokens), for some small n—which score you can also compute if you know the key for this pseudorandom function.
Yeah, that.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.