Following years of allegations that it ignored online atrocities that stoked real-world violence in nations like India and Myanmar, Facebook owner Meta (META.O) on Thursday published its first annual human rights report.
The report, which covers due diligence carried out in 2020 and 2021, includes a summary of a contentious assessment of Meta's human rights impact in India that the company commissioned the law firm Foley Hoag to conduct.
In a joint letter in January, rights groups including Amnesty International and Human Rights Watch demanded publication of the full India report and accused Meta of delaying its release.
In its summary, Meta noted the law firm's findings of potential "salient human rights risks" linked to its platforms, including "advocacy of hatred that incites hostility, discrimination, or violence." The summary said the assessment did not examine "accusations of bias in content moderation."
Ratik Asokan of India Civil Watch International, who participated in the assessment and later helped organise the joint letter, told Reuters the summary was an attempt by Meta to "whitewash" the firm's conclusions.
"It's the clearest proof you can find that they find the facts in that study deeply unsettling," he said. "At the very least release the executive summary, so we can see what the independent law firm has to say."
Human Rights Watch researcher Deborah Brown also criticised the summary as "selective," saying it brings "us no closer" to understanding the company's role in the spread of hate speech in India or the commitments it would make to address the problem.
Rights groups have long expressed concern that anti-Muslim hate speech is escalating tensions in India, Meta's largest market by number of users.
Meta's top public policy executive in India resigned in 2020 following a Wall Street Journal report that she had opposed applying the company's rules to Hindu nationalist figures flagged internally for inciting violence.
In its report, Meta said it was studying the India recommendations but stopped short of committing to implement them, as it had done with other rights assessments.
Asked about the difference, Miranda Sissons, Meta's director of human rights, cited UN guidelines under which such reporting should not pose risks to "affected stakeholders, personnel or to legitimate requirements of commercial confidentiality." "A range of circumstances, including security considerations, can influence the format of reporting," Sissons said.
Sissons, who joined Meta in 2019, said her team now numbers eight, with about 100 other people working on human rights across related teams.
In addition to country-level assessments, the report described her team's work on Meta's COVID-19 response and on Ray-Ban Stories smart glasses, which included flagging potential privacy risks and impacts on vulnerable groups.