But instead of looking at just one set of messages, it will examine whether a user has asked dozens of people for contact information or has tried to develop multiple deeper, potentially sexual relationships, a process known as grooming.
Companies can set the software to take many defensive steps automatically, including temporarily silencing those who are breaking rules or banning them permanently.
As a result, many threats are eliminated without human intervention, and moderators at the company are notified afterward.
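The escalation flow described above — flagging users who ask many people for contact information, taking defensive steps automatically, and notifying moderators later — can be sketched roughly in code. Everything below is an illustrative assumption: the threshold, the rule wording, and the class design are hypothetical, not any company's actual system.

```python
# Illustrative sketch of an automated moderation escalation flow.
# All thresholds, rule names, and data structures are hypothetical.
from collections import defaultdict

# Hypothetical threshold: flag users who ask many distinct people for contact info.
CONTACT_REQUEST_LIMIT = 5

class ModerationEngine:
    def __init__(self):
        # Distinct recipients each user has asked for contact information.
        self.contact_requests = defaultdict(set)
        self.muted = set()          # users temporarily silenced automatically
        self.review_queue = []      # moderators are notified later, not in real time

    def record_contact_request(self, sender, recipient):
        """Track a contact-info request and escalate once the pattern emerges."""
        self.contact_requests[sender].add(recipient)
        if len(self.contact_requests[sender]) >= CONTACT_REQUEST_LIMIT:
            self.escalate(sender, "asked many users for contact information")

    def escalate(self, user, reason):
        # Defensive step taken automatically, without human intervention;
        # human review happens afterward via the queue.
        if user not in self.muted:
            self.muted.add(user)
            self.review_queue.append((user, reason))
```

The point of the sketch is the ordering: the automatic action (muting) happens first, and the human moderator sees the case only later, which matches how such systems are described as eliminating many threats without human intervention.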
Sites that operate with such software should still have at least one professional on safety patrol for every 2,000 users online at the same time, said Metaverse Mod Squad, a Sacramento-based moderating service.
Duncan, one of a half-dozen law enforcement officials interviewed who praised Facebook for triggering inquiries, said: 'The manner and speed with which they contacted us gave us the ability to respond as soon as possible.'

Facebook is among the many companies that are embracing a combination of new technologies and human monitoring to thwart sex predators.
Such efforts generally start with automated screening for inappropriate language and exchanges of personal information, and extend to using the records of convicted pedophiles' online chats to teach the software what to seek out.
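The first stage described here — automated screening for inappropriate language and exchanges of personal information — can be approximated with simple pattern matching. The patterns below (phone numbers, email addresses, a tiny phrase list) are illustrative placeholders, not the vocabulary any real system learns from convicted pedophiles' chat records.

```python
# Minimal sketch of first-pass message screening; patterns are illustrative only.
import re

# Hypothetical patterns for exchanges of personal information.
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
# Hypothetical stand-in for a learned list of flagged language.
FLAGGED_PHRASES = ("what's your address",)

def screen_message(text):
    """Return a list of reasons a message should be flagged, if any."""
    reasons = []
    if PHONE_RE.search(text):
        reasons.append("phone number shared")
    if EMAIL_RE.search(text):
        reasons.append("email address shared")
    lowered = text.lower()
    for phrase in FLAGGED_PHRASES:
        if phrase in lowered:
            reasons.append("flagged phrase")
    return reasons
```

In a real deployment this keyword pass would only be the front end; as the article notes, the harder part is the behavioral layer that looks at patterns across many conversations rather than single messages.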
From a business perspective, however, there are powerful reasons not to be so restrictive, starting with teen expectations of more freedom of expression as they age.
If they don't find it on one site, they will find it somewhere else.
Still, as the Skout case showed, there are several recent trends that have heightened the concerns of child-safety experts: the rise of smartphones, which are harder for parents to monitor; location-oriented services, which are the darling of Net companies seeking more ad revenue from local businesses; and the rapid proliferation in phone and tablet apps, which don't always make clear what data they are using and distributing.
A solid system for defending against online predators requires both oversight by trained employees and intelligent software that not only searches for improper communication but also analyzes patterns of behavior, experts said.
The looser the filters, the more the need for the most sophisticated monitoring tools, like those employed at Facebook and those offered by independent companies such as the UK's Crisp Thinking, which works for Lego, Electronic Arts, and Sony Corp's online entertainment unit, among others.