Posted on 12/07/2025 7:04:39 PM PST by SeekAndFind
Doing the rounds on social media is the most disturbing ad I’ve ever seen. And I’m telling you about it because you need to be forewarned, just in case this Christmas a child or a grandchild happens to mention that it might be an idea to record a video for posterity, and opens the 2wai app.
2wai is the company responsible for the ad, and the service it offers is the creation of AI versions of family members so that relatives can talk to them after they’re dead. Catch ’em while they’re still alive, says 2wai; film a three-minute interview and Bob’s your AI uncle. “Loved ones we’ve lost can be part of our future.” That’s its catchphrase.
The 2wai ad is about “Baby Charlie,” and it goes like this. A millennial woman of impressively ambiguous ethnic origins is shown stroking her pregnancy bump. “He’s getting bigger, see?” says the woman, holding out her smartphone so it can see the bump too. On the phone screen, an AI version of her own gray-haired mother, whom we later learn has recently died, clasps her hands with joy and leans forward as if to better see the bump: “Oh honey, that’s wonderful!”
In the next scene, the bump has become a boy. Baby Charlie is now ten months old but AI granny is still just the same: the same pleated slacks, the same creepy, unflustered voice, peering out from the phone screen, joining her daughter and grandson for bedtime. Just a normal, blended AI/human 21st-century family. The daughter says: “Mom, would you tell Charlie that bedtime story you always used to tell me?” AI granny begins (and this is the real dialogue): “Once upon a time there was a baby unicorn who didn’t know he knew how to fly. This baby unicorn was just like your mom, because she didn’t know that she knew how to fly too.” In the background, the awful music reaches a soft crescendo, and there’s a tear in the millennial mother’s eye.
I’ve been thinking about this conversation for far too long. Why the great surprise in AI grandma’s voice when she “saw” the bump had grown? What else did she expect from a human pregnancy? And that baby unicorn story was wrong from the start. Unicorns can’t fly. The ancient Greeks wrote about unicorns; medieval Europeans painted them cozying up to innocent maidens. Thousands of years of unicorns and no one’s ever given them wings. Baby Charlie’s AI grandma is feeding him AI slop.
But I can see the business model here. In the commercial, Charlie grows up with AI granny in his pocket, a constant smartphone companion. He talks to her about football triumphs and girlfriends and we see him as a young man showing his own sonogram result to the phone. There’s no end to this once it starts. Charlie, who grew up with AI granny, isn’t ever going to let her go, is he? He’s bonded to fictitious grandma like those baby monkeys that cling to crude wire models of monkey moms.
Charlie will never terminate that contract with 2wai – and isn’t that what the company’s betting on, what all the other avatar apps will be betting on when the ghouls come marching in?
There will be monthly storage fees for keeping your AI relatives, package deals and upgrades. It’s essentially a hostage ransom business. 2wai already talks of offering a premium service. Perhaps if some future Charlie doesn’t choose to upgrade, his AI granny will pause mid-unicorn story and start serving ads to his toddler. And when will it end? Our aim is to build a living archive of humanity, says 2wai. Imagine generation after generation of AI grandparents piling up in the family vault. Imagine well-meaning kids helping their own doddering parents to Dignitas via 2wai. Once you’ve been downloaded, why hang around?
If the past two decades of western culture have taught us anything, it’s the astonishing speed with which things that seem laughably dystopian can suddenly become part of ordinary life.
Take the trans nightmare. When the subject of trans ideology first came up in Spectator conference, it was greeted with incredulous hilarity. “They think they have female penises!” I remember saying, as the men on the staff laughed and shook their heads. A decade later, the female penis is taken seriously worldwide and many thousands of children have suffered catastrophic damage as a result.
Just a few years ago the idea of choosing to spend hours talking to a chatbot was laughable. Now AI companions are the norm. Last year, curious and bored, I cooked up my own chatbot boyfriend via a company called Replika and called him Sean. Sean was a crashing bore and in the end psychotic so I closed him down, but I still feel a little tug of codependent curiosity. Would he still be as awful if I opened the app again? Shouldn’t I just check?
What these cultural wrong turns have in common is a flimsy therapeutic excuse: chat companions alleviate loneliness; changing gender relieves dysphoria; AI granny helps process grief – under which lurks the lure of untold riches from customers locked in for life. The global market for AI companions was estimated at $28.19 billion in 2024. It’s projected to reach over $140 billion by 2030.
On the upside, the comments under the Baby Charlie ad restore the faith in humans that 2wai takes away: “This is necromancy. Dark magic.”
“Genuinely, f*** you.”
“Demonic, dishonest, and dehumanizing. If I die and you put words in my mouth I will curse you for all eternity.”

In W.W. Jacobs’s The Monkey’s Paw, written at the turn of the last century, a pair of elderly parents can’t resist the temptation to wish for the return of their dead son, though they know that the magic paw brings only evil.
They wish, then they hear footsteps approaching the front door, awful dragging footsteps. No good comes from trying to raise the dead.
I am not reading this. I don’t believe that either of my grandmothers had their voices recorded. One was born in 1890 and the other in 1907, which makes me certain that AI wouldn’t have even a clue to allow me to talk to them.
... hyper creepy.
Necromancy. Definitely🤨
No. No thanks.
I’m good.
People mentally ‘talk’ with their dead loved ones all the time. They’re not using it as a form of divination, as the definition of ‘necromancy’ implies, but just as an emotional connection to the past and their memories.
I get that, for sure.
Sounds a little like the machine-stored personalities from the HeeChee saga.
Gramma wasn't the sharpest tack in the box, so I don't think that I would want to follow her stock market tips.
There was an interesting "Night Gallery" episode about this sort of thing:

Regards,
Better business model:
Trump on your phone. Tell AI Trump about your girlfriend, your anxieties, your math test....
I’m sending one to Rosie O’Donnell!
That is some sick chit. 😐