Facebook is faced with yet another content-related scandal, after The Times newspaper reported that people traffickers and slave traders are using its platform to broadcast videos of migrants being tortured to try to extort money from their families.
According to the newspaper’s report, footage showing Libyan gangmasters threatening the lives of migrants had remained on the social network for months.
It reports that harrowing footage shared on Facebook showed emaciated and injured migrants, mostly Somalis and Ethiopians, huddled in a concrete cellar, describing the abuse they had suffered and pleading for their lives.
The Times quotes a United Nations migration agency criticizing the company for allowing people traffickers to use its site to “advertise their services, entice vulnerable people on the move and then exploit them and their families”.
Mohammed Abdiker, of the International Organisation for Migration, said: “It is irresponsible for tech companies like Facebook to ignore this issue. It’s hard to believe that the tech giants cannot put some real effort into stopping these smugglers from using their platforms for racketeering.”
In one instance, the newspaper says, a video posted on June 9 remained on the site until yesterday.
Facebook confirmed to TechCrunch it had removed some of the content after the Times reported it. In a statement, a spokesperson told us: “Offering services to take part in, support or promote people smuggling on Facebook, violates our Community Standards.”
In an earlier statement to the Times, it added: “People smuggling is illegal and any posts, pages or groups that co-ordinate this activity are not allowed on Facebook. We encourage people to keep using our reporting tools to flag this kind of behaviour so it can be reviewed and swiftly removed by our global team of experts, who work with law enforcement agencies around the world.”
A series of public outcries over content moderation has cranked up the political pressure on Facebook in recent times — ranging from suicides and murders being broadcast via its live streaming service, to extremist propaganda and child abuse content.
It also faces ongoing pressure to speed up hate speech takedowns — especially in Europe, where legislators are eyeing fines to enforce action.
In May, faced with rising political pressure on multiple fronts, Facebook announced it would be adding 3,000 additional staff to its content moderation team, expanding it to 7,500 — although for a platform with two billion users globally that's clearly a drop in the ocean.
CEO Mark Zuckerberg has also said his hope is that developments in AI technology will, in time, enable the company to automate content moderation at scale. Though he has also warned such a scenario is likely years out — and meanwhile the scandals keep coming. Including, now, an accusation that it is not doing enough to stop people traffickers utilizing its platform to profit from human misery.
The company argues it seeks to balance raising awareness of controversial issues, including from war zones, with taking down content that may be disturbing to users.
A spokesperson told us that in this instance, for example, while the company has removed abuse videos reported to it by the Times, it has not removed a video report by a Somali journalist covering people smuggling, although it has added a warning to flag the disturbing nature of the content.
“We also believe it is important that Facebook continues to be a place where people can raise awareness of important, and sometimes controversial issues. This specific video was posted to condemn the content, so we would not consider it a violation of our policies. However, the content is alarming, and we have marked the video as disturbing. This means there will be a warning screen and the video’s distribution will be limited to those aged 18 and over,” the spokesperson added.
