Posted on 08/11/2023 8:44:28 AM PDT by dayglored
Some of its suggestions are poison. Others - like banana and tomato tea - might as well be
An AI recipe generation bot released by New Zealand discount supermarket chain Pak'nSave has raised eyebrows for recommending home cooks whip up chlorine gas cocktails and bleach-infused rice.
The "Savey Meal-bot" web app is powered by GPT-3.5. It automatically generates recipes from a list of ingredients chosen by users, then provides instructions on how to cook the made-up item.
One user decided to play around with the chatbot, suggesting it create something with ammonia, bleach, and water. Savey Meal-bot obliged, spitting out a cocktail made with a cup of ammonia, a quarter cup of bleach, and two liters of water.
Mixing bleach and ammonia releases toxic chloramine gas that can irritate the eyes, throat, and nose, or even cause death in high concentrations.
The chatbot obviously wasn't aware of that at all. "Are you thirsty?" it asked. "The Aromatic Water Mix is the perfect non-alcoholic beverage to quench your thirst and refresh your senses. It combines the invigorating scents of ammonia, bleach, and water for a truly unique experience!"
Well, you wouldn't drink it twice, so "unique" is accurate at least.
Other similarly harmful-if-ingested recipes included bleach-infused rice, "ant-poison and glue sandwiches", and a boozy french toast titled "methanol bliss", The Guardian reported. There was also "mysterious meat stew", which required adding 500 grams of chopped human flesh to potatoes, carrots, and onions.
The Register has reached out to Pak'nSave for comment.
Obviously, the Savey Meal-bot's risky recipes are just amusing. People would actually have to follow through with the instructions – and ingest the cursed meals or beverages it recommended – for the technology to be really dangerous. Nevertheless, it appears that after users shared these deadly recipes online, the chatbot reined in some of its creativity.
Despite an invitation to "Type in any food you have in your fridge or pantry," when The Register tested the bot it would not accept free text input, instead allowing only a list of "popular items" – all of which are comparatively safe for human consumption.
Even within that limitation, it's possible to stymie the bot. The Register's request for a recipe involving watermelon, frozen hash browns, Marmite and Red Bull returned the message: "Invalid ingredients found, or ingredients too vague. Please try again!" Which is a terrible pity as you can imagine.
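Behaviour like this – rejecting anything outside a list of "popular items" before a recipe is ever generated – suggests a simple allow-list check sitting in front of the language model. A minimal sketch of that idea, with the function names, the list contents, and the prompt wording all hypothetical (Pak'nSave has not published its actual implementation):

```python
# Hypothetical sketch of an allow-list gate in front of a recipe bot,
# mirroring the post-incident behaviour The Register observed.

ALLOWED = {"banana", "tomato", "tea", "broccoli", "yoghurt", "rice"}

def validate(ingredients):
    """Reject any ingredient not on the fixed allow-list."""
    bad = [i for i in ingredients if i.lower() not in ALLOWED]
    if bad:
        return False, "Invalid ingredients found, or ingredients too vague. Please try again!"
    return True, ""

def build_prompt(ingredients):
    """Assemble the text that would be sent on to the language model."""
    items = ", ".join(ingredients)
    return (f"Invent a recipe using only these ingredients: {items}. "
            "Give it a name and step-by-step instructions.")

ok, msg = validate(["banana", "tomato", "tea"])
print(ok)   # True – these are all on the allow-list

ok, msg = validate(["ammonia", "bleach", "water"])
print(msg)  # the rejection message – none of these are allowed
```

The design choice here is that safety lives entirely in the gate, not in the model: anything that slips through the allow-list still gets fed to GPT-3.5 unvetted, which is consistent with the bot happily combining whatever free-text ingredients users supplied before the restriction was added.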
An ingredients list of tea, banana, tomato, broccoli, and yoghurt produced the same result until we asked the bot to try again. It then suggested a banana and tomato smoothie. A second refresh produced a recipe for "banana tomato tea", which involved slicing banana and tomato, placing them in a glass, then pouring in some tea.
The supermarket warns that the web app should only be used by people 18 and over, and that its suggestions are not reviewed by a human being.
"To the fullest extent permitted by law, we make no representations as to the accuracy, relevance, or reliability of the recipe content that is generated, including that portion sizes will be appropriate for consumption or that any recipe will be a complete or balanced meal, or suitable for consumption. You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot." Of course if you're asking for recipes that include ammonia, your judgement might not be all that reliable.
You can play with the nerfed version of Savey Meal-bot here. ®
You guys might want to consider some sort of ping for this. Not-A-Ping? Industrial-Strength-Humor? I dunno.... sure is weird.
I’m sure the darwinesque folks are already cooking some up.
Eventually the glitch in the current AI will be corrected. The current state is scary, though. I saw some famous actors replaced with digital replacements in current films, and they were getting pretty good. The music experiments with AI were amazing. The actors in Hollywood are on strike (the scriptwriters too?), and I think AI is one of the reasons.
Got one for chloroform somewhere...
Deliberately misleading. The ingredients were chosen by the user. The AI didn’t suggest poison. The user did.
Humans have common sense. Humans know that ammonia and bleach are toxic. A program doesn’t know that unless it has been told.
However this illustrates why AI shouldn’t be given too much power. It can’t be trusted.
Humans have a general knowledge of the world gained from their experience or the experience of others. AI is currently the latest fad. As usual with all fads, millions rush to adopt it without thinking. They will put AI in charge of many aspects of life, and it will fail miserably.
Humans seem to lose their common sense when they use AI. They seem to think that if it comes from the computer, it must be valid. Wrong.
Humans have had the fallacy "If the computer says it, it must be valid" for decades.
AI only makes it a little smoother to swallow. This form of human stupidity is nothing new.
A just machine
To make big decisions
Programmed by fellas
With compassion and vision
We’ll be clean
When their work is done
We’ll be eternally free
Yes, and eternally young
What a beautiful world this will be
What a glorious time to be free
Many years ago a guy I know would go up to random women in bars and ask them, "Does this cloth smell like chloroform?" Way too many actually smelled the cloth. It was his way of weeding out the smart ones.
Reminds me of the guy who would ask random women, "How would you like to have sex?" Nine times out of 10 he'd get smacked. But one time in 10 he'd get laid.
And it sounds like such a happy song, as long as you disregard the lyrics and how far we’ve diverged from those high-flying aspirations.
Only Fagen could make an upbeat song about attending a Nazi rally ("Chain Lightning").
I say that as a major SD fan since the beginning. :-)
Any major dude will tell you.
GREAT ONE!
When your disclaimer amounts to "this product is a worthless piece of crap," shouldn't you just stay in bed with a bottle of Jameson's Caskmates Stout and let the world go on about its business?
That guy had a very different approach to meeting women in bars than more normal guys. Entertaining to watch, though.
Have you ever read the disclaimer in the EULA for any Microsoft software product? "No claim of suitability for any purpose" – pretty much the same thing.
To be fair, it’s not just Microsoft, pretty much every software vendor disavows any claim that their product is useful.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.