Examining Facebook’s Human Rights Policy

We analyse Facebook's recently announced Human Rights Policy. It is too little, too late, and we call on Facebook to improve its practices openly and transparently for Indian users of its platforms.

30 March, 2021
8 min read

Tl;dr

Facebook Inc.’s largest user base is in India, which means it wields immense power in our daily lives. On an objective analysis of its recently announced Human Rights Policy, we cautiously note positive measures such as support for end-to-end encryption, but we also offer a larger criticism: it is too little, too late, and much remains to be operationalised. Moreover, Facebook Inc. continues to apply discriminatory standards for human rights enforcement in the United States and India, and the concerns we raised earlier remain unaddressed. For these reasons we call on it to improve its practices, openly and transparently, for Indian users of its platforms including WhatsApp and Instagram.

Background

On March 16, 2021, Facebook unveiled its Human Rights Policy, which applies throughout the world and is relevant for India. For the first time, the policy states Facebook’s commitment to the United Nations Guiding Principles on Business and Human Rights (UNGPs) in one central repository, applying it to all its products in all the geographies where it operates. It does this by cataloguing Facebook’s various policies, guidelines, principles and practices in one place. We believe that social media corporations such as Facebook, which perform a public function, should be held accountable, and we welcome the policy as a statement of intent. However, Facebook Inc.’s practices in India have been at odds with human rights standards for a fairly long time.

What is a Human Rights Policy?

Silicon Valley corporations like Facebook impact individual rights on a daily basis in a highly networked and digital world. This is why the UNGPs call upon such ‘business enterprises’ to maintain a minimum standard of human rights across jurisdictions. A number of Silicon Valley-based entities and members of ‘Big Tech’ already have such policies in place (for example Apple, Google, Microsoft and Amazon).

Despite its market leadership, Facebook has been a laggard and has only belatedly developed and published a human rights policy. This delay is inexcusable given documented cases of harmful social impacts of Facebook’s conduct in jurisdictions such as India. For context, the 2020 RDR Index released by Ranking Digital Rights evaluated the 26 most powerful digital platforms and telecommunications companies on their practices affecting freedom of speech and expression and the right to privacy. The index scored Facebook 45/100 and pointed out that Facebook was yet to publish ‘a commitment to adhere to human rights principles’.

What does Facebook’s Human Rights Policy state?

Facebook has compiled its myriad policies, guidelines and practices in one place. It states that its respect for human rights is given effect through its Community Standards, its membership in the Global Network Initiative, the Responsible Supply Chain Program, Privacy Principles, Data Policy, Law Enforcement Guidelines, Transparency Reporting, Responsible AI efforts, and Diversity and Inclusion Practices.

This policy formalises these commitments, and that is indeed a welcome step. However, it is a step that is short on substance and remains to be fulfilled in practice, especially in jurisdictions such as India. As we reason below, there is a dissonance between the promises Facebook Inc. makes and the obligations it has undertaken to fulfil them. Others have critiqued Facebook’s policy here and here.

What issues emerge from Facebook’s Human Rights Policy?

  1. Nothing new: The policies Facebook has compiled are dated. Moreover, these policies have a number of issues that remain unaddressed. For example, Facebook does not necessarily comply with the Santa Clara Principles when enforcing its community standards. These principles require Facebook to issue a notice to a user whose content is removed and provide them with a meaningful chance to appeal the removal. Similarly, transparency reporting by Facebook is obscure, as has been highlighted here. Facebook also does not continuously report internet disruptions. As per Facebook’s transparency reporting page, the last internet disruption in India was on 28 June 2020, even though a number of internet shutdown orders have been issued since then, including those that shut down the internet in Delhi on Republic Day.
  2. Only one obligation: Facebook has imposed upon itself a single mandate: to “report annually on how we’re addressing human rights impacts, including relevant insights arising from human rights due diligence, and the actions we are taking in response.” Facebook has not specified the particulars of these annual reports; in particular, it remains to be seen whether Facebook will disclose information adverse to its interests. Crucially, Ranking Digital Rights has criticised Facebook for how it conducts human rights due diligence and questioned whether Facebook ‘conducts systematic impact assessments of its terms of service enforcement, on its targeted advertising policies and practices, on its development and deployment of algorithmic systems, or on its zero rating programs.’ Recall that Facebook advocated for Facebook Zero, or the Free Basics programme, in India, which was defeated by advocacy spurred by the SaveTheInternet.in movement.
  3. The policy does not provide for a rights audit in India: To address these concerns, Facebook could have volunteered for independent civil rights audits across jurisdictions. In the United States, Facebook previously underwent an independent civil rights audit after being pressured by the US Congress and civil rights organisations. That audit examined Facebook’s efforts to protect human rights on its platforms, evaluated whether Facebook complies with its own policies and offered recommendations. Similar audits ought to be conducted in other parts of the world, with separate reports published. Such audits are particularly important for India, where Facebook continues to exercise immense power, operates in an opaque manner and has been criticised for its actions. However, there is no public acknowledgement of, or transparency from, Facebook Inc. in India about any such audit.
  4. UNGPs are insufficient for Facebook: Facebook has based the policy on the UNGPs. This is a curious choice because Facebook, by its own admission, has immense power. In a paper titled ‘But Facebook’s not a country: How to interpret Human Rights Law for social media companies’, Ms Susan Benesch (Executive Director of the Dangerous Speech Project, Berkman Klein Center for Internet & Society, Harvard University) has argued that Facebook should be held to higher standards as “social media companies have human-rights impacts of great magnitude on at least half of the world’s population. They can limit not only what people say (or write) but what they see and read. When exercised in public spaces, such powers are reserved for governments. This extraordinary, transnational power and influence sets the companies apart from any other private enterprise. When platforms are used for exchanging information that is vital for civic life, the owners and staff of the platforms influence the political, cultural, and economic development of entire societies”. Apart from this, the UNGPs have faced criticism, especially since they focus on process rather than on outcomes. Human Rights Watch has questioned the UNGPs for their low standards of accountability. Others have called upon organisations such as Facebook to instead adopt instruments under International Human Rights Law, such as the International Covenant on Civil and Political Rights, which contains a framework for imposing restrictions upon the freedom of expression.
  5. Vagueness: Facebook has stated that ‘human rights defenders are a high risk user group’ and has pointed out that it conducts due diligence to identify risks and ‘create strategies to avoid, prevent and mitigate them’. However, the policy does not reveal anything more. We do not know how Facebook protects human rights defenders on its platforms or the methodology Facebook uses to conduct due diligence.

How does the policy impact Facebook’s largest global user base?

India is Facebook’s largest market. In Mark Zuckerberg’s own words, "India is a very special and important country for us… [a]nd in fact, we actually test some of our new features here first, before rolling them out globally". The policy is relevant for Indians because:

  1. Facebook has committed to protecting your privacy on WhatsApp: Facebook has stated that it will not “provide governments with direct access or “back doors” to people’s information, and we would challenge any order that sought to have us redesign our systems to undermine the encryption we provide to protect people’s data.” Rule 4(2) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 requires ‘significant social media intermediaries’ such as WhatsApp to enable tracing of the first originator of information on their platform if required by the government. This may undermine end-to-end encryption and the right to privacy of all Indians. Considering this far-reaching impact, Facebook’s commitment is a welcome development. We unhesitatingly support end-to-end encryption and any company that implements it to keep Indians safe, secure and their personal data private. (A brief illustrative sketch of why traceability sits uneasily with end-to-end encryption follows this list.)
  2. The policy impacts Indian users' freedom of speech and expression: The policy reiterates Facebook’s commitment to its community standards, which outline what kind of content is permissible on Facebook. The policy also affirms Facebook’s recognition of the Oversight Board, which recently issued its first round of rulings. Despite the policy and the creation of the Oversight Board, Facebook remains opaque about how it moderates content in the first instance. As Ranking Digital Rights points out: ‘Facebook should significantly improve its transparency on and accountability for its content moderation by publishing consistent data on actions it takes to enforce platform rules.’ A conversation around this policy will help ensure that Facebook undertakes such obligations.
  3. Relevant for Indian start-ups: While Silicon Valley led developments in information technology over the past decade, Indian start-ups have begun to establish themselves in this field. As the number of users on Indian platforms grows, these companies ought to adopt such policies as they build for more Indians and potentially even the world. Not only is this ethical, it also makes business sense. One reason people are sceptical of Silicon Valley-based Big Tech companies is that they ignored the impact their products have on society and, in the process, undermined user trust. Indian start-ups can differentiate themselves and build user trust from the beginning by adopting effective human rights policies. We urge that their policies not merely pay lip service to the UNGPs but explain their commitments in detail, inform product choices and engineering decisions, and be implemented in practice.
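
To illustrate the point made in item 1 above, here is a minimal, hypothetical sketch of end-to-end encryption using the PyNaCl library. It is not WhatsApp’s actual Signal-protocol implementation, and the names and message are purely illustrative; it only demonstrates the core property that the platform relaying a message never holds the private keys needed to read it.

```python
# Illustrative sketch only: a toy end-to-end encrypted exchange using PyNaCl
# (pip install pynacl). NOT WhatsApp's actual protocol; purely for illustration.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public keys are shared, e.g. via the platform's servers.
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts a message that only Bob can decrypt.
sending_box = Box(alice_private, bob_public)
ciphertext = sending_box.encrypt(b"hello from alice")

# The platform relays `ciphertext`, but without either private key it sees
# only opaque bytes -- it cannot read the content of the message.
print(ciphertext.hex()[:60], "...")

# Bob decrypts on his own device with his private key.
receiving_box = Box(bob_private, alice_public)
print(receiving_box.decrypt(ciphertext).decode())
```

In a design like this, complying with an originator-tracing mandate would require re-engineering the system to attach or retain additional identifying information about messages, which is precisely the kind of redesign Facebook has said it would challenge.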

Why it is necessary to critically evaluate Facebook Inc.

Let us consider the facts. Facebook’s track record in India speaks for itself.

  • Facebook introduced the Free Basics programme in India, which allowed it to collect vast amounts of data and violated net neutrality by charging discriminatory tariffs for data services on the basis of content. Facebook withdrew its plans only after efforts by advocacy groups culminated in a 2016 ruling by the Telecom Regulatory Authority of India.
  • In 2020, Facebook failed to respond to hate speech on its platform against Muslims, which often preceded acts of mob violence. An article in the Wall Street Journal reported that Facebook’s top public policy executive in India had decided not to enforce hate-speech rules against members of the ruling party. On this issue, we wrote to the Parliamentary Committee on Information Technology requesting it to summon Facebook executives, hold a hearing and direct Facebook to provide reparations to victims.
  • Also in 2020, Facebook’s ‘automated systems’ blocked an account named ‘Kisan Ekta Morcha’, which documented the protests against the farm laws on social media.
  • In 2021, Facebook is involved in a legal dispute with the Delhi Legislative Assembly. Facebook had refused to appear before the Peace and Harmony Committee of the Delhi Assembly, which had summoned the Vice President of Facebook India to inquire into the Delhi Riots of 2020. As Rahul Narayan puts it, “it unwittingly provoked a litigation that may have far-reaching implications on Federalism, the Separation of Powers and Fundamental Rights in India” (link). We agree with him that this litigation poses a huge risk to democratic processes and the rights of legislative bodies to demand accountability from large Silicon Valley corporations.

These are just some of the events that make us determined to see Facebook’s stated commitments matched by its actions in practice.

Important Links

  • Our blogpost highlighting Facebook’s discriminatory standards for human rights enforcement in the United States and India (link)
  • Representation to Parliamentary Standing Committee on IT regarding the WSJ Story on Facebook India dated August 17, 2020 (link)
  • Ranking Digital Rights 2020 RDR Index (link)
