Posted on 01/04/2025 8:48:51 AM PST by BenLurkin
A two-hour conversation with an artificial intelligence (AI) model is all it takes to make an accurate replica of someone's personality, researchers have discovered.
In a new study published Nov. 15 to the preprint database arXiv, researchers from Google and Stanford University created "simulation agents" — essentially, AI replicas — of 1,052 individuals based on two-hour interviews with each participant. These interviews were used to train a generative AI model designed to mimic human behavior.
...
To create the simulation agents, the researchers conducted in-depth interviews that covered participants' life stories, values and opinions on societal issues. This enabled the AI to capture nuances that typical surveys or demographic data might miss, the researchers explained. Most importantly, the open-ended structure of these interviews gave participants the freedom to highlight what they personally considered most important.
Although the AI agents closely mirrored their human counterparts in many areas, their accuracy varied across tasks. They performed particularly well in replicating responses to personality surveys and determining social attitudes but were less accurate in predicting behaviors in interactive games involving economic decision-making. The researchers explained that AI typically struggles with tasks that involve social dynamics and contextual nuance.
They also acknowledged the potential for the technology to be abused. AI and "deepfake" technologies are already being used by malicious actors to deceive, impersonate, abuse and manipulate other people online. Simulation agents can also be misused, the researchers said.
However, they said the technology could let us study aspects of human behavior in ways that were previously impractical, by providing a highly controlled test environment without the ethical, logistical or interpersonal challenges of working with humans.
(Excerpt) Read more at livescience.com ...
Also, and BTW:
One Crow is called a Crow.
Many Crows are called a Murder.
One Lazamataz is called a Lazamataz.
Many Lazamataz’s are called a Hit.
(Example used in a sentence: “Hey! Look at that Hit of Lazamataz’s. Shouldn’t we hide our wives?”)
😂
Ridiculous. A human personality is an unquantifiable entity.
Brings to mind a Star Trek trick for dealing with an AI replicating you, from the episode "What Are Little Girls Made Of?": Kirk, while being scanned for duplication, deliberately thought angry, insulting thoughts about Spock so the copied memory engrams would be inaccurate.
just 80%? PHEW! At least it won’t be a full blown 100% idiot like me
bookmark
Jordan Peterson has a questionnaire that can do the same
bttt
I always thought an army of laz’z would be a phalanx.
They would be hard pressed to replicate MY personality. I don’t have one...
I think when AI hooks up with all the ‘mysterious’ drones, it will be time for us humans to break out our cotton picking gloves and Negro Spirituals...