Many images remained even after initial investigation revealed cartels on Instagram
Facebook is still struggling to identify and remove content created by members of an international drug and human trafficking cartel based in Mexico, including violent images and the group’s recruitment materials, according to a new report based on leaked company documents.
News of images and other content posted by members of the Jalisco New Generation Cartel for the purpose of threatening the group’s enemies and attracting new soldiers was first reported by the Wall Street Journal last month, as part of the newspaper’s extensive look into the company while it faces a wide range of criticism regarding violent content and misinformation on its platforms.
Some images were removed after being flagged by the Journal, and a company spokesperson contended that Facebook and its subsidiary Instagram were investing in new AI technologies and other resources to proactively identify and remove such content from the platforms.
On Monday, however, CNN published its own investigation finding, among other news, that violent content, including images of beheadings apparently carried out and posted by the Jalisco New Generation Cartel, was still on the platform.
After CNN contacted Facebook for the story, much of the content flagged by the news organisation was taken down or covered with a graphic content warning. The company did not immediately respond to a request for comment from The Independent regarding whether it believes it can keep such content off its platforms going forward.
Facebook has battled its deteriorating image in the news media and other circles as it faces a storm of controversy over a number of issues. The company’s platforms are accused of playing a key role in efforts to organise the attack on the US Capitol in January, and sparking similar division and violence in societies around the world.
Those issues are almost entirely separate from further criticism the company has faced over Covid-19 misinformation, which is regularly described in news articles and studies as proliferating widely on the platform. The company has walked a fine line, removing major spreaders of misinformation from the platform while refusing to remove some other content widely viewed as misleading or flat-out incorrect.
A slew of documents released to news organisations by Frances Haugen, a former data scientist at the company, has ignited a flurry of coverage in recent weeks, including reports that Facebook management ignored concerns from employees who said it was not doing enough to combat violent and hateful content in developing nations.