The state of Facebook content moderation
I previously wrote about the few victories that have been won in pushing social media giants like Facebook, YouTube, and Twitter to moderate content more responsibly (“Language barriers,” 11/1/21). This was in the context of what the World Health Organization has called an “infodemic” of misinformation, in which social media allows outrageous and false news to spread exponentially, leading to confusion, dangerous behaviors, and mistrust of legitimate health authorities during the pandemic. It was also in the context of clear evidence that Facebook and other social media sites have allowed political misinformation and hate speech to flourish unchecked, with disastrous consequences for national and world politics.
None of this was recent news, but my focus in November was on how different groups have identified a huge hole in Facebook’s so-called efforts to be more socially responsible. Facebook may have deleted fake accounts and ramped up efforts to flag and disable COVID-19 falsehoods, but many advocacy groups have pointed out that fake news and hate speech in non-English languages, like Italian and Spanish, are less likely to be taken down.
According to an October 2021 CNN report, Facebook personnel have themselves acknowledged in internal documents that the company appears “ill-equipped” to handle hate speech and misinformation when these are not posted in English. In the November 2021 column, I pointed out how vaccine misinformation persists on Facebook under such non-English keywords as “bakulam,” and how LGBTQ+ hate speech is not taken down for violating community standards when posted in different Filipino languages.
This continues to be a pressing matter. An AP article, “Filipinos fall for fake history,” has gone viral on social media as an external perspective on how vulnerable the Filipino electorate is to fake news. It recognizes social media’s role as “critical” and cites how machinery relying on troll armies and communities has heavily skewed our elections, despite efforts at fact-checking and debunking. Doubtless, this strategy succeeds due to heavily coordinated and paid activity. It is reasonable, though, to suppose that deficiencies in moderating non-English election misinformation play a role as well.
Meta has recently announced greater efforts in working with the Commission on Elections for better voter education and decreased misinformation. Meta Philippines’ country manager has said that the company spends “a lot of resources to make sure that we balance free speech while ensuring there’s minimal misinformation in the platform.” This is not, however, reflected in everybody’s experience.
In the months since “Language barriers,” my participation in Facebook wars has been to report and flag outright misinformation and spliced videos that have been fact-checked several times over. I have also flagged hate speech, including wishes of violence or rape against the Vice President and her daughters, hoping that Facebook content moderation would recognize such comments as a violation of its policies. Many voters like myself, not ones to engage directly in such volatile discussions, have stuck to reporting. However, such reports have almost always ended in disappointment. On top of the comments I’ve mentioned above failing to get taken down, a comment wishing graphic violence on “lumad” volunteers after Chad Booc’s death was deemed not to violate community standards. I was more likely to get a positive response by flagging content as spam than by flagging it as hate speech or fake news.
Delays in processing reports are understandable given the high traffic of Filipino users. What is unacceptable is that Facebook often fails to recognize flagged harmful content for what it is. Is it again because of a language barrier that we cannot seem to surmount? Is it because the resources devoted to moderating Filipino content are simply not commensurate with the amount of traffic we drive, or with the possible consequences of allowing hate speech and fake news to thrive?
Facebook, with more than 80 million Filipino users, still cannot seem to moderate content adequately. Its failures disproportionately affect non-English content and make countries such as ours more vulnerable to violence and misinformation-driven politicking. This is irresponsible to an extreme degree, contributes to the volatility of our political landscape, and, unfortunately for us all, is poised to have far-reaching consequences for the 2022 elections.