The victims of human trafficking advertised online are hidden among countless listings, yet recognizing related ads could help law enforcement take down the organizations behind them, a daunting task. To tackle it, researchers at Carnegie Mellon University and McGill University adapted an anomaly-detection algorithm to spot telltale similarities across listings.
“The algorithm scans and clusters similarities in text and could help law enforcement direct their investigations and better identify human traffickers and their victims,” said Christos Faloutsos, the Fredkin Professor in Artificial Intelligence in CMU's School of Computer Science, who led the team.
"Our algorithm can put millions of advertisements together and highlight the common parts," Faloutsos said. "If they have a lot of things in common, it's not guaranteed, but it's highly likely that it is something suspicious.
The team will present InfoShield and its results this week at the IEEE International Conference on Data Engineering (ICDE). According to the International Labor Organization, 24.9 million people are trapped in forced labor. Of those, about 55% are women and girls trafficked in the commercial sex industry, where most ads are posted online. A single trafficker may write the ads for four to six victims, which produces similar phrasing and duplication across listings.
"Human trafficking is a dangerous societal problem which is difficult to tackle," lead authors Catalina Vajiac and Meng-Chieh Lee wrote. "By looking for small clusters of ads that contain similar phrasing rather than analyzing standalone ads, we're finding the groups of ads that are most likely to be organized activity, which is a strong signal of (human trafficking).”
To test InfoShield, the team ran it on a set of escort listings in which experts had already identified the trafficking ads. InfoShield outperformed other algorithms at identifying those ads, flagging them with 85% precision. Moreover, the escort listings it flagged as human trafficking ads were correctly identified.
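For context, 85% precision means that of the listings the algorithm flagged, 85% were genuine trafficking ads. A minimal sketch of the metric, using made-up labels rather than the study's data:

```python
def precision(flagged, true_positives):
    """Precision = correctly flagged items / all flagged items."""
    if not flagged:
        return 0.0
    hits = sum(1 for ad in flagged if ad in true_positives)
    return hits / len(flagged)

# Hypothetical example: 4 ads flagged, 3 of them truly trafficking ads.
flagged = ["ad1", "ad2", "ad3", "ad4"]
known_trafficking = {"ad1", "ad2", "ad3"}
print(precision(flagged, known_trafficking))  # 0.75
```

High precision matters here because a false positive could direct an investigation at a legitimate listing.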
Proving their success was tricky because the test data contained actual ads placed by human traffickers. To protect the victims, the team could not publish the data set or examples of the similarities the algorithm identified, which meant other researchers could not verify the work.
To remedy this limitation, the team needed public data sets that mimicked what the algorithm looks for in human trafficking data: text and similarities. Twitter offers both in abundance, created by bots. Like human trafficking ads, bot tweets repeat certain pieces of information. "In both cases — Twitter bots and human trafficking ads — the goal is to find organized activity," said Reihaneh Rabbany of McGill University.
InfoShield outperformed other state-of-the-art algorithms at detecting bots among tweets. Vajiac said this finding was a surprise, given that the algorithm relies only on the text of the tweets to determine whether a bot wrote them. Despite working on forecasting and anomaly-detection algorithms for 30 years, this was the first time Faloutsos applied one to stopping human trafficking. He and the team hope their work plays a role in helping law enforcement rescue victims and in reducing human suffering.
To continue their work and push toward ending human trafficking, the team consults with experts to deepen its understanding of the problem.
The more they learn, the more passion they put toward stopping it.