For a number of years now, police forces around the world have enlisted officers to pose as kids in online chat rooms, in an attempt to draw out pedophiles and track them down. Researchers at Spain’s University of Deusto are now hoping to free those cops up for other duties, and to catch more offenders, via a chatbot that they’ve created. Its name is Negobot, and it plays the part of a 14-year-old girl.
“Chatbots tend to be very predictable. Their behavior and interest in a conversation are flat, which is a problem when attempting to detect untrustworthy targets like pedophiles,” says Carlos Laorden, who helped develop the program. “What is new about Negobot is that it employs game theory to maintain a much more realistic conversation.”
Game theory, put simply, involves strategic decision-making aimed at reaching a goal. In the case of Negobot, this is achieved through the use of seven different “conversational agents” (or levels) that dictate and adjust the virtual girl’s behavior in response to the suspected pedophile’s actions.
When Negobot first enters a chat room conversation, it starts at level 0, where it remains neutral. If the person doesn’t appear interested in talking, the bot becomes more insistent about keeping the conversation going, introducing topics that will hopefully capture their attention. In doing so, it proceeds through levels -1 to -3.
It’s also possible, of course, that the person could be very interested in chatting Negobot up. Should they start exhibiting “suspicious behavior,” such as not caring about the girl’s age or asking her for personal information, the program enters into levels 1 to 3. Operating at these levels, it attempts to obtain the person’s phone number, email address, social network profile, or anything else that could be used to physically locate them.
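As a rough illustration of how such a level system might behave, the sketch below models the conversation state as a simple counter clamped between -3 and +3, nudged up or down by each incoming message. The level names, cue phrases, scoring, and thresholds here are assumptions made purely for illustration; they are not taken from Negobot’s actual implementation, which would rely on far richer linguistic analysis.

```python
# Hypothetical sketch of a seven-level conversational state machine.
# All cues, scores, and thresholds are illustrative assumptions only;
# they do not reflect how Negobot itself is implemented.

NEUTRAL = 0
MIN_LEVEL, MAX_LEVEL = -3, 3  # -1..-3: disinterested target, +1..+3: suspicious target


def score_message(message: str) -> int:
    """Return a crude interest/suspicion score for one incoming message.

    Positive scores stand in for suspicious behavior (e.g. asking for
    personal details), negative scores for apparent disinterest.
    """
    suspicious_cues = ("how old", "where do you live", "phone number", "send a photo")
    if any(cue in message.lower() for cue in suspicious_cues):
        return 1
    if len(message.split()) <= 2:  # terse replies treated as disinterest
        return -1
    return 0


class LevelTracker:
    """Keeps the current conversational level clamped to [-3, 3]."""

    def __init__(self) -> None:
        self.level = NEUTRAL

    def update(self, message: str) -> int:
        self.level = max(MIN_LEVEL, min(MAX_LEVEL, self.level + score_message(message)))
        return self.level


tracker = LevelTracker()
for msg in [
    "so what do you like to do?",
    "how old are you?",
    "do you have a phone number?",
    "where do you live exactly?",
]:
    print(msg, "->", tracker.update(msg))
```

In this toy model, negative levels would trigger the more insistent, attention-seeking behavior described above, while positive levels would shift the bot toward trying to elicit contact details.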
In order to appear more human-like, Negobot remembers facts about specific people with whom it’s chatted, which it can bring up in subsequent conversations. It also sometimes takes the lead in the conversation (chatbots often only react), varies the amount of time it takes to respond, and throws in some slang and sloppy writing. That said, it still doesn’t understand subjective uses of language, such as irony.
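Purely as an illustration of those humanizing tricks, a reply could be roughened up along these lines. The slang substitutions, probabilities, and delay range below are invented for this sketch and are not Negobot’s actual vocabulary or timing.

```python
import random
import time

# Illustrative substitutions and probabilities; not taken from Negobot itself.
SLANG = {"you": "u", "are": "r", "because": "cuz", "going to": "gonna"}


def humanize(reply: str) -> str:
    """Roughen up a reply with slang swaps and an occasional fake typo.

    Naive substring replacement is used here for brevity; a real system
    would respect word boundaries.
    """
    for word, slang in SLANG.items():
        if random.random() < 0.5:
            reply = reply.replace(word, slang)
    if reply and random.random() < 0.3:
        i = random.randrange(len(reply))
        reply = reply[:i] + reply[i + 1:]  # drop one character to simulate a typo
    return reply


def send_reply(reply: str) -> None:
    """Wait a human-ish, variable amount of time before answering."""
    time.sleep(random.uniform(2.0, 15.0))
    print(humanize(reply))


print(humanize("are you going to be online later because i want to talk"))
```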
The university has “field tested” Negobot on Google’s chat service, and has entered into a collaborative agreement with the Basque Country police force, which is interested in implementing the technology.
Source: Plataforma SINC (Spanish)