How Big Tech protects victims’ privacy in the fight against human trafficking

Stacker compiled resources from the United Nations to illustrate how novel statistical models are helping shed light on human trafficking.
– MADAREE TOHLALA/AFP // Getty Images

Dom DiFurio

How does one measure a crime as pernicious and brutal as human trafficking without putting the victims behind the numbers at risk?

Stacker compiled resources from the United Nations’ International Organization for Migration, or IOM, to explain how new discoveries in statistical analysis are helping illuminate the dark world of human trafficking.

The U.N. first set out to solve this dilemma by partnering with big technology companies—including Microsoft, Amazon, British Telecommunications, and Salesforce—through its Tech Against Trafficking Accelerator program in 2019. It published its first anonymized data set in 2021 and updated it again with case data from 2022.

For decades, broadly shared human trafficking statistics have been limited to estimates based on tips reported to organizations like the National Center for Missing & Exploited Children, or criminal justice statistics, which experts say paint a conservative picture, highlighting only the cases in which authorities are able to intervene and prosecute perpetrators.

And the data that has been available suffers from another problem—sharing detailed information about the demographics of who engages in and is victimized by the crime risks putting victims in more danger.

In the 2022 update, the U.N.'s partnership with Microsoft and others launched a new way to shed light on the relationships between victims and their captors while preserving victim privacy and the integrity of the underlying data. They did so by combining two cutting-edge statistical methods: synthetic data and differential privacy.

‘Safety in noise’

Synthetic data is data that is artificially generated via algorithms and comes in various forms. It can be based on real data, partially based on real data, or completely artificial. Finance companies like American Express, as well as those in autonomous vehicles and health care, have been working to harness the power of synthetic data for the last several years.
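As a loose illustration of the idea (not the coalition's actual method, and with field names invented for the example), a simple synthetic-data generator can fit distributions to real records and then sample entirely artificial records from them:

```python
import random

# Hypothetical miniature "real" data set; these fields and values are
# invented for illustration only.
real_cases = [
    {"exploitation": "forced_labor", "victim_gender": "female"},
    {"exploitation": "forced_labor", "victim_gender": "male"},
    {"exploitation": "sexual_exploitation", "victim_gender": "female"},
]

def fit_distribution(records, field):
    """Count how often each value of `field` appears in the real data."""
    counts = {}
    for r in records:
        counts[r[field]] = counts.get(r[field], 0) + 1
    return counts

def sample_synthetic(records, fields, n, seed=0):
    """Draw n artificial records. Each field is sampled from its fitted
    distribution, so no synthetic record corresponds to any real person."""
    rng = random.Random(seed)
    dists = {f: fit_distribution(records, f) for f in fields}
    out = []
    for _ in range(n):
        rec = {}
        for f, counts in dists.items():
            values = list(counts)
            weights = [counts[v] for v in values]
            rec[f] = rng.choices(values, weights=weights)[0]
        out.append(rec)
    return out

synthetic = sample_synthetic(real_cases, ["exploitation", "victim_gender"], 100)
```

Note that this toy version samples each field independently, which discards correlations between fields; production synthetic-data generators model the joint distribution so that cross-field patterns survive.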

Differential privacy is a method for describing patterns within a group of people without giving away identifiable information about any one individual in the group. Microsoft has used differential privacy algorithms since at least 2017 to protect user privacy while collecting data from their devices.
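A standard building block of differential privacy, sketched here purely as an illustration, is to add calibrated random noise before a statistic is released. For a simple count, noise drawn from a Laplace distribution scaled to the query's sensitivity gives the formal guarantee:

```python
import math
import random

def dp_count(true_count, epsilon, seed=None):
    """Release a count with Laplace noise, a classic differential-privacy
    mechanism. A counting query has sensitivity 1 (adding or removing one
    person changes the count by at most 1), so noise drawn from
    Laplace(scale = 1/epsilon) provides epsilon-differential privacy."""
    rng = random.Random(seed)
    scale = 1.0 / epsilon
    # Sample Laplace noise via the inverse-CDF method.
    u = rng.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Smaller epsilon means a stronger privacy guarantee but a noisier figure.
released = dp_count(1234, epsilon=0.5, seed=7)
```

Any one individual's presence or absence barely changes the distribution of the released number, which is what makes re-identification statistically deniable.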

In the past, agencies releasing case data might redact certain aspects that could be used to identify an individual. This is also called de-identified data. However, each redaction or alteration removes information that could add to understanding trafficking networks.

Other methods of obscuring data to protect privacy include aggregation and anonymization, approaches that studies have found can sometimes distort the underlying data.

The tech coalition’s now-public algorithm ingests sensitive case data from the U.N. and outputs figures that cannot be linked to the individuals represented in it. The authors of the data set refer to this as “safety in noise.” Importantly, however, the data that the updated algorithm outputs preserves the statistical relationships between cases so experts can trace patterns without sacrificing victim safety.

The data set released as a product of the accelerator includes records of 17,000 victims as well as more than 37,000 perpetrators of human trafficking who were active in criminal activity across 123 countries from 2005-2022, according to the IOM.

Specifically, the data set allows more people to analyze the relationships between victims of trafficking and their perpetrators.

Analyzing relationships

The synthetic data, generated from the descriptions of trafficking victims assisted by the IOM, reveals that forced labor was seven times more prevalent than sexual exploitation in 2021-2022, the most recent period for which data is available.

Women made up the largest share of victims last year, and their traffickers were almost equally likely to be male or female. The perpetrators in these cases were most often strangers to the victims, but a significant portion of victims reported knowing the perpetrator as a friend or an acquaintance.

Human trafficking is a global problem that often befalls vulnerable populations, such as migrants forced into enslavement or sex trafficking. The case data describing each victim's or perpetrator's situation, like that held by immigration agencies worldwide, has the potential to unlock insights for the authorities working to catch and prosecute perpetrators and protect victims.

The accelerator's efforts in conjunction with the U.N. dovetail with the Biden administration's updated, four-pronged action plan for combating human trafficking, released in 2021. The plan emphasizes partnerships with the private sector to increase information-sharing that supports efforts to prevent the crime, protect victims, and prosecute those who facilitate it.

Story editing by Ashleigh Graf. Copy editing by Paris Close. Photo selection by Clarese Moller.

