Who controls what we see online? SMEX screened the documentary “The Cleaners” and hosted a discussion yesterday, Sunday 18 November, ahead of the regional unconference Bread&Net.
“The Cleaners” sheds light on the hidden world of content moderators – the thousands of workers who do the largely invisible job of “cleaning up” the web, deciding what stays and what gets deleted on content-sharing platforms like Facebook and YouTube. These decisions can be incredibly complex, full of grey areas: the line between documenting a war and showing gratuitous violence, or whether a nude painting is art or pornography.
Jillian York, who consulted on the film, and Nada Akl, a researcher and former content moderator, discussed the opaqueness of current moderation processes and underscored the impact content moderation has on how we see the world. If a famous Pulitzer Prize-winning image from the Vietnam War can be deleted for breaching community guidelines on child nudity, what does that mean for the history-making images of the present? What will be the defining images documenting the war in Syria? And who is making these decisions?
Content moderators such as those in “The Cleaners” are a globalized workforce of third-party contractors, which limits the liability of tech companies. Because of their contracts, content moderators can’t speak freely about what they do. With daily targets to meet, they have to make decisions in seconds, often without an understanding of the context.
One content moderator said he looked at an astonishing 25,000 images a day. In the documentary, content moderators believed the work they were doing was important. But seeing the worst of humanity can have deeply damaging consequences for their mental health and wellbeing.
Tragically, the work has driven some content moderators to suicide. When one moderator was distressed by child pornography, her supervisor reminded her that she had signed a contract and had to do her job. And what happens when social media content in Myanmar stirs up hatred and violence against the Rohingya? Content moderators are burdened not only by the horrific images they see, but also by the responsibility of their individual decisions to ignore or delete content.
While individual workers in outsourcing hubs like Manila are the frontline decision-makers, they base their decisions on guidelines. Each technology company has its own frequently changing community guidelines, created by policymakers at headquarters. As Jillian York pointed out, the higher-level decisions on Facebook’s guidelines are made by a few people who all happen to be Harvard Law School graduates – hardly diverse or representative of Facebook’s users, and not experts in the kinds of content being moderated.
With over 2 billion users, Facebook is larger than any nation-state, yet an elite few decide what we see online and shape our world. “Companies shouldn’t have this much power,” Jillian noted. As users, we have little control over what gets deleted and little recourse to challenge a moderator’s decision. Both speakers called instead for content moderation to become a more transparent and open process.
So what can we do to change the current system? Given the toll on content moderators, we can demand that technology companies treat these workers better. We can also hold companies to account and push for greater transparency in content moderation. For instance, last week SMEX and other digital rights organizations around the world signed an open letter to Mark Zuckerberg calling for an appeal mechanism on content decisions for all Facebook users; Facebook subsequently announced that it would establish an independent appeals body in 2019. Regionally, we can work with entrepreneurs and tech startups so that they consider human rights principles from the outset.
Sacha Robehmed is a researcher and writer based in Beirut.