
Moreover, UNESCO has encouraged States to take steps to understand and monitor the reasons behind, and the sources of, misinformation and disinformation.750 Among other relevant measures, UNESCO has recommended that governments create an environment in which careful fact-checking and debunking of false or misleading information can be conducted; provide support and funding for quality and public-interest journalism and for counter-disinformation campaigns in the media and on social media platforms; support the target audiences of disinformation campaigns; strengthen ethical standards in reporting; and educate and empower the public and journalists to differentiate between quality news and unreliable information. States also need to ensure that people can effectively exercise their right to freedom of expression without discrimination, including by protecting individuals against abuses by non-State actors.751 States should avoid delegating responsibility to companies as adjudicators of content, which empowers corporate judgment over human rights values to the detriment of users.752 In this regard, States must uphold the principle that intermediaries should not be required to substantively evaluate the legality of third-party content, in line with the Manila Principles on Intermediary Liability.753 However, companies involved in moderating online content must uphold their human rights responsibilities, including by carrying out human rights due diligence and by ensuring greater transparency regarding, and oversight of, content moderation practices and policies and the algorithmic systems underpinning their platforms, so that human rights are respected in practice.754

5. Leveraging AI and Machine Learning

According to Lazarotto, two essential strategies for preventing the spread of misinformation are fact-checking and content control.755 Despite appearances, the two are not the same. Content moderation is a function of social media platforms and is governed by their policies; its goal is to identify and delete posts that contain prohibited content.756 Fact-checking, by contrast, aims to establish which information about a subject is accurate and which was presented in error.757 A minimal illustrative sketch of how machine learning might support each strategy follows the notes below.

750 See, UNESCO, ‘Disinfodemic: Deciphering Covid-19 Disinformation’ (2020) accessed 20 October 2023.
751 Human Rights Committee, ‘General Comment No. 34, Article 19: Freedoms of opinion and expression’, UN Doc. CCPR/C/GC/34 (2011), para 7.
752 Human Rights Council, ‘Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression’, UN Doc. A/HRC/38/35 (6 April 2018).
753 ‘Manila Principles on Intermediary Liability’ accessed 20 October 2023.
754 See, supra (n 752).
755 Lazarotto B, ‘The Impact of Disinformation During the COVID-19 Pandemic and Its Regulation by the EU’ (2020) 6 EU Law Journal 2, p. 31.
756 Habersaat KB, Betsch C, Danchin M, Sunstein CR, Böhm R, Falk A, Brewer NT, Omer SB, Scherzer M, Sah S, ‘Ten considerations for effectively managing the COVID-19 transition’ (2020) 4 Nature Human Behaviour 677, pp. 677–687.
757 Barrett PM, Who Moderates Social Media Giants? A Call to End Outsourcing (NYU CBHR, 2020), p. 23.
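The contrast drawn in section 5 between content moderation and fact-checking can be made concrete with a short, purely hypothetical sketch. Assuming a Python environment with scikit-learn installed, the toy pipeline below trains a classifier on invented example posts to flag policy-violating content (the moderation function) and separately matches a post against a small, invented set of already-verified claims (fact-checking support). None of the data, thresholds, or function names reflect any real platform's or fact-checker's system; they are illustrative assumptions only.

```python
# Hypothetical sketch: ML-assisted content moderation vs. fact-checking support.
# All posts, claims, labels and thresholds below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics.pairwise import cosine_similarity

# --- Content moderation: a classifier trained on policy-labelled examples ---
train_posts = [
    "Miracle cure guaranteed, doctors hate it",        # violates policy
    "Drink bleach to kill the virus",                   # violates policy
    "The health ministry updated its travel advice",    # allowed
    "New study published on vaccine efficacy",          # allowed
]
train_labels = [1, 1, 0, 0]  # 1 = prohibited content, 0 = allowed

vectorizer = TfidfVectorizer()
X_train = vectorizer.fit_transform(train_posts)
moderator = LogisticRegression().fit(X_train, train_labels)

def moderate(post: str) -> bool:
    """Return True if the post is predicted to violate the (toy) platform policy."""
    return bool(moderator.predict(vectorizer.transform([post]))[0])

# --- Fact-checking support: match a post against already verified claims ---
verified_claims = {
    "Vaccines do not alter human DNA": "accurate",
    "5G networks spread the virus": "false",
}
claim_texts = list(verified_claims)
claim_vectors = vectorizer.transform(claim_texts)

def fact_check(post: str, threshold: float = 0.3):
    """Return the closest verified claim and its rating if similarity exceeds the threshold."""
    sims = cosine_similarity(vectorizer.transform([post]), claim_vectors)[0]
    best = sims.argmax()
    if sims[best] >= threshold:
        return claim_texts[best], verified_claims[claim_texts[best]]
    return None, "unverified"

print(moderate("Miracle cure guaranteed for everyone"))          # likely True (removal)
print(fact_check("Reports claim 5G networks spread the virus"))  # likely matches the "false" claim
```

The sketch only underlines the conceptual point made above: moderation is a policy-driven classification task carried out by the platform, whereas fact-checking is a matter of verifying specific claims against established findings; real systems combine far larger models with human review.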

