" ယူနီကုတ်နှင့် ဖော်ဂျီ ဖောင့် နှစ်မျိုးစလုံးဖြင့် ဖတ်နိုင်အောင်( ၂၁-၀၂-၂၀၂၂ ) မှစ၍ဖတ်ရှုနိုင်ပါပြီ။ (  Microsoft Chrome ကို အသုံးပြုပါ ) "

Sunday, October 2, 2022

Meta Should Pay Reparations to Rohingya Refugees, Rights Group Says

THE DIPLOMAT
September 30, 2022


Amnesty International claims that Facebook was aware that its algorithms were amplifying harmful anti-Rohingya hate speech in Myanmar, but still did nothing to stop it.

Facebook’s parent company Meta should pay reparations to Rohingya communities who were driven out of western Myanmar in 2017, given the role that it played in enabling the campaign of ethnic cleansing, the human rights group Amnesty International said in a report published yesterday.

The report claims that Facebook’s “dangerous algorithms and reckless pursuit of profit… substantially contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017.”

In August of that year, the Myanmar military launched a “clearance operation” against the Rohingya communities of northern Rakhine State in the country’s west, which drove more than 700,000 Rohingya civilians across the border into Bangladesh. During the assaults, hundreds of villages were burned to the ground, civilians were shot, and hundreds, possibly thousands, of women and girls were raped.

Facebook’s role in enabling this ethnic cleansing – and possible genocide – of the Rohingya has long been recognized. In March 2018, the U.N. Independent International Fact-Finding Mission on Myanmar reported that social media platforms, particularly Facebook, had played a “determining role” in the violence against the Rohingya, and had “substantively contributed to the level of acrimony and dissension and conflict” in the country ahead of their expulsion. Later that year, the New York Times conducted its own investigation into Facebook’s role in facilitating the violence, which concluded that Myanmar military personnel had “turned the social network into a tool for ethnic cleansing.”

Facebook has admitted that its platforms were used to spread hate speech and fuel sectarian and ethnic conflict in Myanmar. Since 2017, it has removed the pages of senior military figures, including current junta leader Senior Gen. Min Aung Hlaing. Then, in late 2018, it published the findings of a report in which it admitted that “we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”

As suggested above, the Amnesty report claims that Meta’s responsibility crosses from the realm of omission into that of commission – and that the amplification of hate speech by the network’s algorithms was a feature, not a bug. The giant tech firm “knew or should have known that Facebook’s algorithmic systems were supercharging the spread of harmful anti-Rohingya content in Myanmar,” but “still failed to act,” the report claims.

Meta’s role was “not merely that of a passive and neutral platform that responded inadequately in the face of an unprecedented crisis,” the report argues. “In reality, Meta’s content-shaping algorithms proactively amplified and promoted content on the Facebook platform which incited violence, hatred, and discrimination against the Rohingya.”

Moreover, it claims that the risks should have been clear to the company long before the 2017 atrocities perpetrated against the Rohingya. As evidence, it cites internal documents from the so-called “Facebook Papers,” which were leaked from the firm by whistleblower Frances Haugen last year. These seem to demonstrate a clear awareness of both the algorithms’ tendency to amplify hate speech, and the inadequacy of content moderation efforts to halt its spread. As one former Meta employee outlined in an internal document dated August 2019:

“We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world. We also have compelling evidence that our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform.”

Another internal document from July 2019 stated that “we only take action against approximately 2 percent of the hate speech on the platform.”

At the same time, throughout this period activists from Myanmar and abroad were attempting to warn Facebook/Meta about the impacts that the network’s rapid growth was having in Myanmar, and its dangerous lack of Burmese-language content moderators. (Even now, the London-based advocacy group Global Witness claims, Meta is still failing to detect hate speech and calls to violence against the Rohingya.)

Even then, Meta’s efforts to respond to these concerns were minimal, and in some cases may actually have made things worse. In a dark irony, the report describes Facebook’s support for an anti-hate initiative known as Panzagar, meaning “flower speech” in Burmese, which created a sticker pack for users to post in response to content advocating violence or discrimination. However, the report notes that Facebook’s algorithms “interpreted the use of these stickers as a sign that people were enjoying a post and began promoting them. Instead of diminishing the number of people who saw a post advocating hatred, the stickers actually made the posts more visible.” Whether such feel-good initiatives were ever up to the task of beating back the tide of misinformation and hate speech is also open to question.
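To make that failure mode concrete, here is a minimal, purely hypothetical sketch of an engagement-optimized ranker that counts every interaction, including counter-speech sticker replies, as a positive signal. The names and numbers are invented for illustration; nothing here reflects Meta’s actual systems.

```python
# Purely illustrative sketch of the failure mode Amnesty describes; this is
# NOT Meta's actual ranking code, and all names here are hypothetical.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int = 0
    sticker_replies: int = 0  # e.g. counter-speech stickers posted in reply
    shares: int = 0


def engagement_score(post: Post) -> int:
    # The flaw: every interaction raises the score. The ranker cannot
    # distinguish objection from endorsement, so counter-speech replies
    # push a post up the feed instead of burying it.
    return post.likes + post.sticker_replies + post.shares


posts = [
    Post("harmless update", likes=10),
    Post("post advocating hatred", likes=4, sticker_replies=30),
]

# The heavily countered post outranks the harmless one precisely
# because people reacted against it.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post.text)
```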

For all of these reasons, Amnesty said that the company has a responsibility to financially compensate the nearly 1 million people who are now eking out an existence in the sprawling refugee camps around Cox’s Bazar in southeastern Bangladesh.

“Meta must be held to account,” Amnesty’s Secretary General Agnès Callamard said in a statement accompanying the report’s release. “The company now has responsibility to provide reparations to all those who suffered the violent consequences of their reckless actions.”

Of course, it is hard to see Meta offering financial compensation to the hundreds of thousands of Rohingya refugees stranded in Bangladesh. To do so would be to admit publicly that its entire business model – essentially, the conversion of human attention into ad dollars – is precisely to blame for the inflammation of social tensions in many countries. At a time when Facebook’s overweening influence is under more scrutiny than ever before, any such admission is highly unlikely.
