We submitted our comments on the ‘UNESCO Guidelines on Regulating Digital Platforms’ on January 20, 2023. Our broad comments relate to the need to ensure that risks to freedom of speech and expression, as well as to privacy, are mitigated, avoided, and minimised to whatever extent possible, especially given existing government laws, rules, regulations, and restrictions. Our specific comments relate to the challenges of establishing an independent regulatory system, ambiguity around certain terms and concepts, and the capacity limitations of academic institutions, civil society, regulatory systems, and governments.
UNESCO released the first draft of the “Guidelines on Regulating Digital Platforms: a multistakeholder approach” (guidance document) with the aim of supporting freedom of expression and the availability of accurate and reliable information in the public sphere. Comments and inputs on the guidance document were invited until January 20, 2023. After analysing all the comments received, UNESCO will send the second draft to all Internet for Trust Global Conference participants. The UNESCO Global Conference is scheduled to be held from February 21 to 23 at the UNESCO Headquarters in Paris. The guidance document focuses on structures and processes that help protect users from content that damages democracy and human rights, while at the same time respecting freedom of expression.
Source: UNESCO Digital Library
The goal of regulation:
We praised UNESCO for recommending that “governments should be open, clear, and specific about the type and volume of requests they make to companies to remove and block content”. While recognising the need to scrutinise the scope of government requests to remove or block content, the guidance document also suggests finding means to deal with “harmful content that may damage democracy and human rights work”. We appreciate that the phrase “harmful content” has been elaborated upon with examples of what it could constitute, which reduces the possibility of the concept being weaponised in the absence of a globally agreed-upon working definition; nevertheless, a significant risk of such weaponisation still exists. Moreover, while the proposed guidance rightly calls for more transparency from platforms, it must also push for that transparency to be meaningful. We recommended that the information and data provided by platforms be actionable, so that they may be used for further research and analysis.
Fulfilling the goal:
We applauded UNESCO for introducing a responsibility on governments not to subject company staff to criminal penalties for an alleged or potential breach of regulations, as such penalties can have a chilling effect on freedom of expression. This is especially important in the Indian context, where the Chief Compliance Officer (CCO), who is appointed by an intermediary and is legally required to reside in India, is responsible for ensuring compliance with the Information Technology Act, 2000 (IT Act, 2000) and the rules made thereunder, and is liable for proceedings in this regard.
Structures and processes: Our comments related to the following structures and processes outlined in the guidance document:
- Content management policies: We appreciated the recommendations in the guidance document regarding the Santa Clara Principles on Transparency and Accountability in Content Moderation, but we also voiced our concerns about the definitional issues with some terms, as well as the potential risk to individuals' right to anonymity. We noted the challenges in implementing the reporting requirements, especially in the context of India's rules for website blocking. We appreciated the emphasis on user rights and on users understanding the reasons for content moderation decisions.
- User reporting: UNESCO's goal of prioritising user reports of threatening or intimidatory content is commendable, but these terms are subjective, and there is a risk of bias and individual motives influencing the reports. We questioned the platforms’ ability to make informed decisions quickly, not least because UNESCO's suggestion to develop automated systems to process and record complaints is constrained by the limitations of the technology. There is also a concern that a lack of nuanced analysis may amplify unfair biases and result in overbroad censorship. We noted that the development of a competent and independent regulatory system is necessary to address these challenges.
- Election integrity: We highlighted that digital platforms can have a significant impact on people's behaviour and on the election process, and that the lack of regulation of digital media campaigns makes their influence difficult to analyse. We suggested that a common definition of a political advertisement is needed, and that digital platforms should be required to provide information on expenditures for political ads and promotions, including the demographics targeted and the amounts spent on monetised posts. We also recommended that digital firms establish a transparent complaints mechanism, conduct outreach programmes to familiarise users with the process, and make their complaint-handling process public to enable auditing.
- Data access: The proposal by UNESCO to provide access to non-personal and anonymised data on platforms carries potential risks. The distinction between personal and non-personal data is unclear and can lead to implementation and legal issues. The threat of de-anonymisation is significant: direct attacks may reveal sensitive information, while indirect attacks use available data to deduce personal information not explicitly present in the original dataset. Despite the safeguard UNESCO added with the phrase "wherever safe and practicable," the risk of making this data public remains high. Capacity constraints in developing economies also limit the ability to use this data for research purposes.
The independent regulatory system: We shared our firm opinion that a regulator or regulatory system, unless truly independent, may end up undermining freedom of speech and expression. We recognised that, on the one hand, a self-regulatory mechanism carries the risk of making private entities the arbiters of free speech, while on the other hand, a regulatory model directly or indirectly influenced by the government risks making the executive the arbiter of permissible speech. Based on the learnings from India’s effort to introduce an oversight mechanism, we suggested that any attempt to introduce a regulatory mechanism must have statutory backing and rules which can truly ensure the independence of such a body. A further step towards ensuring the independence of the regulatory system could be an obligation on the system to publicly disclose all its decisions, as well as the justification and reasoning behind them.
We recognised the urgent need to respond to the dynamic changes and progress in the digital ecosystem, and the need to ensure that we are not immobilised by the complex nature of the technology or the complicated social and economic trade-offs involved. On that note, we cautioned against the rushed passing of overbroad, vague, and ambiguous laws of general applicability, which would only exacerbate these harms. To that end, we suggested that the regulatory framework must first and foremost abide by internationally accepted human rights standards. Governments must also refrain from directly undertaking moderation of online content, while establishing genuinely independent authorities as a check against the concentration of power in a few social media platforms. Moreover, additional safeguards for when content is removed may be provided, such as an obligation to provide a reasoned order, a right for the content creator to be heard, and a right to appeal the decision of any regulatory or adjudicating authority.