Global social media giant Facebook will need to increase its efforts to combat hate speech against the persecuted Rohingya minority, says the company’s top policy official.
Since the outbreak of the Rohingya refugee crisis in August 2017, U.N. investigators and human rights groups have harshly criticized Facebook over its alleged failure to act against accounts that used the platform to encourage violence against the disenfranchised Rohingya Muslims.
Speaking at the New America Foundation, Facebook's head of Global Policy Management, Monika Bickert, said the company has made improvements to address the problem in Myanmar, but that more action is needed to prevent hate-filled content targeting the Rohingya population.
“I think one of the biggest things we have to do is to improve our relationships with civil society groups on the ground,” Bickert said. But she admitted that “it is a complicated landscape, and there is a lot more we can do there.”
The Rohingya are the largest Muslim population in Myanmar, with the majority living in the west coast state of Rakhine. Their population was estimated at around one million at the beginning of 2017, but the United Nations says nearly 900,000 have since fled into neighboring Bangladesh due to a brutal crackdown by Myanmar's army, which claimed it was targeting terrorists.
Rights activists inside and outside the country have accused Facebook of amplifying ethnic violence by not taking swift action against extremist Buddhists who use social media to spread anti-Rohingya content, ranging from hate posts to false news articles, photos, and videos.
U.N. investigators on Myanmar said last March that they found social media outlets, particularly Facebook, had played a "determining role" in spreading hate speech in the country.
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” U.N. Special Rapporteur on human rights in Myanmar Yanghee Lee told reporters.
Facebook officials say they do not tolerate hate speech and that, despite the difficulties, they will suspend and sometimes remove accounts that consistently share content promoting hate.
The company last week reportedly blacklisted Myanmar's hard-line Buddhist group, the Patriotic Association of Myanmar (known as Ma Ba Tha), and two prominent monks for promoting hatred toward the Rohingya.
Last month, Facebook released an 86-page report, giving information for the first time on the content it removes around the world. This included 2.5 million pieces of hate speech removed from the platform between January and March 2018, 38 percent of which were flagged by artificial intelligence (AI).
During the announcement on May 15, Facebook's vice president of product management, Guy Rosen, said the company had a hard time policing hate speech, mainly because the AI was unable to understand historical context and cultural nuances.
“Our technology still doesn’t work that well and so it needs to be checked by our review teams,” said Rosen.
Facebook officials say they have also hired people to review content in local languages in order to compensate for the AI's shortcomings.
In response to a letter from six civil society organizations in Myanmar, Facebook chief executive Mark Zuckerberg said the social media giant had added "dozens" of Burmese-language content reviewers to more efficiently detect hate speech.
In an email response obtained by the New York Times, Zuckerberg said they also had “increased the number of people across the company on Myanmar-related issues,” including a team to build tools to help stop the violence there.
According to Facebook’s Bickert, the number of language reviewers also needs to be ramped up for more substantial results.
"What we have seen over the years is that you need to have special consideration for places where there are speech-related issues, whether it's because there is violence on the ground, or there is an influx of migrants, and there is a lot of hate speech," she said. "There are certain areas where there is a disproportionate need for language review to be done around the clock, and we are seeing that in Myanmar."
Source: Voice of America