Posted on 03/18/2004 11:08:23 AM PST by John Jorsett
Paedophiles attempting to "groom" children in internet chatrooms can now be detected by a computer program.
The program works by putting on a convincing impression of a young person taking part in a chatroom conversation. At the same time it analyses the behaviour of the person it is chatting with, looking for classic signs of grooming: paedophiles pose as children as they attempt to arrange meetings with the children they befriend.
Called ChatNannies, the software was developed in the UK by Jim Wightman, an IT consultant from Wolverhampton in the West Midlands. It creates thousands of sub-programs, called nanniebots, which log on to different chatrooms and strike up conversations with users and groups of users.
If a nanniebot detects suspicious activity it sends an alert to Wightman and emails a transcript of the conversation. If he considers the transcript suspicious, he contacts the relevant police force, giving them the internet address of the suspect user.
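Wightman has never published the detection logic, but the pipeline described above — score each message for grooming signs, then flag the whole transcript for human review past a threshold — can be sketched roughly as follows. The patterns, weights, and threshold here are purely illustrative assumptions, not the real ChatNannies rules:

```python
# Hypothetical sketch of a detect-and-alert step: score chatroom messages
# against weighted patterns associated with classic "grooming" behaviour,
# and flag the transcript for human review once the cumulative score
# crosses a threshold. All patterns and weights are invented examples.
import re

SUSPICIOUS_PATTERNS = {          # pattern -> weight (illustrative values)
    r"\bhow old are you\b": 1,
    r"\bare you alone\b": 2,
    r"\bdon'?t tell (your )?(mum|dad|parents)\b": 3,
    r"\bmeet (me|up|in person)\b": 3,
}
ALERT_THRESHOLD = 4

def score_message(text: str) -> int:
    """Sum the weights of all suspicious patterns found in one message."""
    text = text.lower()
    return sum(w for p, w in SUSPICIOUS_PATTERNS.items() if re.search(p, text))

def review_transcript(messages: list[str]) -> bool:
    """True if the conversation's cumulative score warrants a human alert."""
    return sum(score_message(m) for m in messages) >= ALERT_THRESHOLD

chat = ["hi, how old are you?", "are you alone right now?",
        "don't tell your parents we talked"]
print(review_transcript(chat))  # a transcript like this would be flagged
```

A real system would of course need far subtler cues than keywords — the article notes below that the grooming Atkinson has seen "doesn't have to be sexual" — which is presumably why the transcript goes to a human before police are contacted.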
He claims that tip-offs from his software have already led to police investigations, but New Scientist was unable to verify this before going to press.
Convincing conversation
The nanniebots do such a good job of passing themselves off as young people that they have proved indistinguishable from them. In conversations with 2000 chatroom users no one has rumbled the bots, Wightman says. (See if you can tell the difference below.)
Chatbots scarcely distinguishable from people were predicted by computer pioneer Alan Turing as long ago as 1950, says Aaron Sloman, an artificial intelligence expert at the University of Birmingham in the UK.
So he is not surprised the bots are so convincing, especially as their conversation is restricted to a limited topic - like youth culture, say - and is kept relatively short. "It's not going to be too difficult for a chatbot to look like an ordinary chatroom participant to other users who are not even on the lookout for them," he says.
To converse realistically, ChatNannies analyses the sentences other users type, breaks them down into verb and noun phrases, and compares them with those in sentences it has previously encountered.
ChatNannies includes a neural network program that continually builds up knowledge about how people use language, and employs this information to generate more realistic and plausible patterns of responses.
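Neither the parser nor the neural network has been published, but the phrase-matching step described above can be sketched in miniature. The toy part-of-speech lexicon, the chunking rule, and the stored responses below are illustrative assumptions, not Wightman's implementation:

```python
# Illustrative sketch: chunk a sentence into crude noun phrases (NP) and
# verb phrases (VP) using a toy part-of-speech lexicon, then compare the
# chunks against phrases remembered from earlier conversations to pick a
# reply. A real system would use a proper tagger and, per the article,
# a neural network trained on observed language use.
POS = {"i": "PRON", "you": "PRON", "love": "VERB", "make": "VERB",
       "forget": "VERB", "pancakes": "NOUN", "pancake": "NOUN",
       "day": "NOUN", "every": "DET", "year": "NOUN", "the": "DET"}

def chunk(sentence):
    """Group consecutive words into (phrase, kind) chunks: NP or VP."""
    chunks, current, kind = [], [], None
    for word in sentence.lower().strip("!?.").split():
        tag = POS.get(word, "OTHER")
        k = {"NOUN": "NP", "DET": "NP", "PRON": "NP", "VERB": "VP"}.get(tag)
        if k == kind and k is not None:
            current.append(word)
        else:
            if current:
                chunks.append((" ".join(current), kind))
            current, kind = [word], k
    if current:
        chunks.append((" ".join(current), kind))
    return [(p, k) for p, k in chunks if k is not None]

# Phrases "previously encountered", mapped to a plausible reply.
seen = {("pancake day", "NP"): "any day can be pancake day - just make pancakes"}

def respond(sentence):
    """Reply from memory if any chunk contains a known phrase of the same kind."""
    for phrase, kind in chunk(sentence):
        for (known, known_kind), reply in seen.items():
            if known_kind == kind and known in phrase:
                return reply
    return "tell me more"

print(respond("i forget pancake day every year"))
```

Keeping replies keyed to phrases the bot has actually seen before is one way a chatbot can stay on familiar ground — the same "stays on its turf" behaviour a commenter spots in the transcript below.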
Random personality
One of its tricks is to use the internet itself as a resource for its information on pop culture. Wightman will not reveal how it judges what is reliable information and what not. He does say, however, that each bot has dozens of parameters that are assigned at random, to give each one a different "personality".
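The parameters themselves are undisclosed; a minimal sketch of the random-personality idea, with invented parameter names and ranges, might look like this:

```python
# Illustrative sketch: give each nanniebot a distinct "personality" by
# drawing its conversational parameters at random, as the article
# describes. The real parameter names and ranges are undisclosed; these
# are invented examples.
import random

PARAMETER_RANGES = {
    "typing_speed_wpm": (25, 60),   # how fast the bot "types"
    "typo_rate": (0.0, 0.08),       # fraction of words misspelled
    "slang_level": (0.0, 1.0),      # how much chat slang it uses
    "chattiness": (0.2, 1.0),       # likelihood of extending a topic
}

def make_bot_personality(seed=None):
    """Draw one value per parameter so every bot behaves differently."""
    rng = random.Random(seed)
    return {name: rng.uniform(lo, hi)
            for name, (lo, hi) in PARAMETER_RANGES.items()}

bot_a = make_bot_personality(seed=1)
bot_b = make_bot_personality(seed=2)
print(bot_a != bot_b)  # different seeds give different personalities
```

With "dozens" of such parameters, no two of the 100,000 bots would be likely to behave identically, which makes the population much harder to fingerprint.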
"If this software works, then it would be marvellous because there is nothing like this out there," says Chris Atkinson, the internet safety officer with the National Society for the Prevention of Cruelty to Children in the UK. But she warns that paedophiles may outsmart it. "The grooming activity that I have seen doesn't have to be sexual," she says.
Wightman says, however, that ChatNannies is sophisticated enough to look for less obvious signs that something is amiss. It also looks for slip-ups and inconsistencies that give away an adult posing as a child.
Wightman currently has 100,000 bots chatting away undetected in chatrooms - the most he can generate on the four internet servers at his IT practice. He would like to build more but funding is the sticking point, as he does not want anyone to profit financially from his technology.
"Some companies have offered fantastic sums of money, but all want technology ownership. And that's something that isn't going to happen," he says. Instead, he hopes eventually to get financial support from government-run organisations that focus on child protection.
In this chatroom dialogue, which is the bot and which is the human?
A - hi
B - hi, how are ya?
A - i'm good. whats new man?
B - nothing much - i just dropped in when i was checking my mail. no-one here!
A - heh yeah it is kinda quiet...you know why though don't you!
B - no, why?
A - national holiday in the uk!
B - what, thursday? i don't know any national holiday on thursday! and besides wouldn't that mean there were more people not less?
A - yeah youre right, i was confused with pancake day
B - pancake day! i love pancakes...mmmm so tasty
A - yeah me too, but i forget every damn year
B - did you forget this year?
A - yes! damn!
B - well any day can be pancake day - just make pancakes
A - its not the same!
B - sure it is! hey, this one year I bought a box of 'pancake mix' for pancake day...guess what?
A - go on?
B - it was a box of flour!
A - LOL thats bad news dude
B - tell me about it - you had to add the eggs and everything
A - i didn't know you used eggs to make pancakes
B - you dont? well these were pretty tasty anyway, even if I did get ripped off!!
A - national holiday in the uk!
B - lol
B - hey, where are you from? the usa?
A - no the uk. Birmingham.
B - oh cool. did you watch robocop 2 last night?
A - what side was it on?
B - sky one
A - we haven't got sky
A - but i've seen it before
A - it wasn't as good as robocop
B - i agree, though it was cool in places.
A - did you watch robocop last night
B - yes, i just said i did!
A - no you said you watched robocop 2 not robocop - so which one was it?
B - robocop 2 - pedant!
A - not robocop or robocop 3 or robocop the series
B - it was definitely robocop 2, the one with kain the second robocop in it. i haven't seen robocop 3 or the series.
A - ok, chill out, take a pill
B - consider me chilled, daddy-o
B - i've gotta get going now, i'm meeting some friends.
B - you know it. l8r
Answer: A - human, B - nanniebot
That's a freakishly good imitation of a human.
I didn't think so. I'm in AI. Look at how B mechanically extends the conversation, over and over, by picking out a word in A's response that it actually knows something about, then talking about that thing. In other words, it controls the conversation by making sure it stays on its turf. Sounds smart, but unlike humans, it actually listens to the other guy :)
It's still a very impressive program. I suspect it uses the CYC database, which is free in its public form, to access common sense knowledge.
Screw legality! When can private groups of citizens get their hands on this software to lure child molesters to their deserved deaths by beating? Nothing else will keep perverts from using the Internet to find victims.
Same thing as spam. Holy spam, of course, but the result will be the same, and it will NOT be limited to teen chatrooms.
Enterprising hackers will find lots of entertaining things to do with these.
The CYC database of common sense knowledge is designed to cover stuff just like that. Product of years of work by Doug Lenat's team at Cycorp. Look it up on the internet. In its basic form, it's free!
Would you like this guy to build a Freeper-bot? Tell me what a Freeper-bot would mean to you.