" ယူနီကုတ်နှင့် ဖော်ဂျီ ဖောင့် နှစ်မျိုးစလုံးဖြင့် ဖတ်နိုင်အောင်( ၂၁-၀၂-၂၀၂၂ ) မှစ၍ဖတ်ရှုနိုင်ပါပြီ။ (  Microsoft Chrome ကို အသုံးပြုပါ ) "

Friday, December 10, 2021

Can Facebook be blamed for pogroms against Rohingyas in Myanmar?

THE ECONOMIST
Dec 11th 2021 edition

 
Lawsuits in America and Britain seek billions of dollars in damages


That Facebook was used to spread rhetoric that incited carnage in Myanmar is hardly up for debate. According to the lead author of a UN report published in 2018, the firm’s platform played a “determining role” in the violence inflicted on Rohingya Muslims by marauding Buddhists. Facebook acknowledges that it did not do enough to prevent its services from being abused. But whether it is liable for what happened is a trickier question.

It may soon be answered. A legal campaign is under way on both sides of the Atlantic. It claims that Facebook, now renamed Meta, should be held liable for allowing users to spread such content during the Rohingya genocide. A letter delivered to Facebook’s London offices on December 6th gave the firm notice of intent to sue it in the High Court. That suit will be on behalf of Rohingyas living everywhere in the world outside America, including Bangladesh, where 1m or so dwell as refugees.

The American complaint, filed on the same day in California, is a class action on behalf of Rohingyas living in America. It is seeking “at least” $150bn in compensation for “wrongful death, personal injury, pain and suffering, emotional distress and loss of property”. Although American internet companies are typically shielded from liability for content disseminated through their platforms, the suit argues that the court must apply Burmese law for harms done in Myanmar. American courts can theoretically apply foreign laws in this way, though there is little precedent for it.

Meta did not comment on the lawsuit when asked, but said that it was “appalled by the crimes committed against the Rohingya people”. It added that it has improved its capacity to moderate Burmese content.

The allegations fall into two categories. The first is that, from 2010 onwards, Facebook failed to moderate content on its network actively and effectively, despite being aware that it was contributing to the incitement of genocide in Myanmar. The second is that Facebook’s own content-recommendation algorithms amplified the spread of this content.

No precedent exists for such a case, at least when it comes to social-media companies. One distant parallel is with Radio Mille Collines, a Rwandan radio station that was instrumental in inciting the Rwandan genocide of 1994, in which perhaps 500,000 people, mostly Tutsis, were killed. Some of those who ran the station were convicted of incitement to genocide. The difference is that incitement was Radio Mille Collines’s main purpose. (Its former chairman is also accused of financing the import of machetes.) International courts went after those who urged the killing, not the manufacturers of the radio equipment.

The current lawsuits argue that Facebook is both manufacturer and, to some extent, messenger: its algorithms decide what people see. Whether and how the firm is liable for what its algorithms do will now be tested. ■

This article appeared in the Asia section of the print edition under the headline "Accounting for algorithms"
 

