
Statement by United Nations High Commissioner for Human Rights, Michelle Bachelet, at the 13th Session of the Forum on Minority Issues: “Hate speech, social media and minorities”


19 November 2020

Geneva, 19-20 November 2020

Dear President of the Council,
Dear Chair,
Distinguished panellists,
Delegates and participants,

Let me begin by congratulating Chair Natalie Alkiviadou on her nomination and extending my greetings to all of you at this Forum on Minority Issues.

I welcome this year’s focus on hate speech, social media and minorities.

This is a topic as challenging and grave as it is timely.

Digital technologies have profoundly changed our lives and the social and political landscape in which we live.

Opportunities for exercising our fundamental freedoms of expression, association and participation have expanded in unparalleled ways. Yet, this expansion has brought with it new and significant threats to civic space and to people’s rights.

One of them is hate speech, which is largely disseminated online through various social media platforms.

Minorities have been disproportionately targeted with incitement to discrimination, hostility and violence. This may lead to tensions, unrest and attacks against individuals and groups. It may also be used to serve certain political interests, contributing to a climate of fear among minority communities.

Earlier this year, the Secretary-General denounced “a tsunami of hate and xenophobia, scapegoating and scare-mongering,” unleashed amid COVID-19. Many of the targeted groups were already facing the disproportionate impact of the health and economic crisis.

Hate speech was not born with the pandemic. Last year, in response to alarming trends around the world, the UN Secretary-General launched the UN system-wide Strategy and Plan of Action on Hate Speech.

The Plan of Action is grounded in four key principles:

  1. It stresses the importance of freedom of opinion and expression. To address hate speech, we need more speech, not less;
  2. Tackling hate speech is the responsibility of all;
  3. We need a new generation of digital citizens, empowered to recognize, reject and stand against hate speech;
  4. To act effectively, we need to know more. This calls for coordinated data collection and research, including on the root causes, drivers and conditions conducive to hate speech.

 We face a complex challenge.

Around the world, we see two scenarios. On the one hand, evident cases of incitement to violence go unprosecuted; on the other, hate speech allegations can be abused as a pretext to persecute anyone who dares to criticize the authorities.

States have the obligation under international human rights law to act against incitement to discrimination, hostility and violence, while fully respecting freedom of expression.

And the same rights that people have offline must also be protected online.

Efforts to tackle “hate speech” risk being abused to impose uniformity of views, curtail dissent and shrink civic space.

In recent years, we have witnessed measures used to restrict fundamental freedoms, undermine democratic governance and severely limit civic space. In a number of countries, governments employ new and existing laws and policies, including those related to security, counter-terrorism, hate speech and defamation, to silence dissenting voices and impede the work of human rights defenders and activists speaking on behalf of minority groups. I reiterate my Office’s full commitment to supporting and protecting these defenders.

Restrictions of these rights extend into the online sphere through the adoption of laws and regulations with overly broad provisions, which negatively affect media reporting and people’s ability to raise legitimate and critical issues of public concern.

As the former Special Rapporteur on Freedom of Expression stated in his October 2019 report to the General Assembly: “…freedom of expression, the rights to equality and life and the obligation of non-discrimination are mutually reinforcing...” 

So, the critical questions become: how can States meet their obligations to uphold fundamental freedoms and democratic space while, at the same time, addressing hate speech?

And at what point does free speech become hate speech, and, if a response is required, what should the legal or judicial response be?

As I have mentioned, it is not always straightforward to qualify speech as hate speech. This is why the UN Human Rights Office organized regional and global workshops with experts to develop a framework to help assess statements, on a case-by-case basis, both online and offline. The framework, known as the Rabat Plan of Action, takes six criteria into account: context, speaker, intent, content, extent of the speech, and likelihood of harm. Any restriction should also meet another set of criteria: legality, legitimacy, necessity and proportionality.

One of its main purposes is to help identify incitement to hatred, while fully preserving freedom of expression.

On the role of social media companies, the UN Guiding Principles on Business and Human Rights state that companies have a responsibility to prevent, mitigate and remedy human rights violations that they may cause or contribute to. The former UN Special Rapporteur on freedom of expression recommended that companies adopt content policies that tie their hate speech rules directly to international human rights law, including UN treaties and interpretations such as the Rabat Plan of Action.

The six factors of the Plan can help companies both assess whether a restriction is warranted and determine what an adequate, rights-respecting response would look like.

Social media companies have alternatives to either taking down or leaving material online. They can also flag content, add countervailing material, warn the disseminator and suggest self-moderation. Take-downs would only be warranted in the most severe cases.

Any solution proposed to tackle hate speech on social media should work towards closing an enormous gap in transparency and democratic accountability in the decision-making of the platforms. Not only should we expect them to follow human rights guidance, but we also need mechanisms to monitor and assess their actions.

Friends,

Tackling hate speech is the responsibility of all.

States should effectively implement their obligations and responsibilities to protect the human rights of minorities, who are disproportionately targeted.

The engagement of political and religious leaders, the private sector and civil society is also crucial.  

Minorities themselves, and civil society at large, must be consulted and must participate in shaping laws, policies and programmes. Negotiations on content regulation must be conducted publicly and framed by truly democratic processes.

And we need to do more to address hate speech on social media and prevent a dangerous escalation into incitement to discrimination, hostility and violence against minorities.

Internet companies and social media platforms play a critical role.

The UN Guidance Note on Addressing and Countering COVID-19 related Hate Speech, adopted last May, sets out recommendations for various actors, including that social media and tech companies should ensure that their hate speech policies incorporate an evaluation reflecting the six-part threshold test contained in the Rabat Plan of Action.

My Office is also currently developing a guide for legislators on comprehensive anti-discrimination law and the protection of minorities, which will include a discussion of the line between freedom of expression and the prohibition of discrimination.

In my recent letter to European Commission President von der Leyen, regarding the consultations on the EU Digital Services Act, I called for rules and processes that enable everyone to participate in the digital world, promoting transparency on content moderation mechanisms. I also expressed my concern for members of at-risk and marginalized communities, calling for the new law to protect their rights to effective and accessible remedies. It is vital to ensure that any regulations involving expression be firmly rooted in international human rights law.

There are also many non-legal responses, like public awareness campaigns, human rights education, and the promotion of diversity and pluralism.

Influential figures in society should actively speak out against hate speech and express solidarity with those it targets. In this context, the “Faith for Rights” framework focuses specifically on the human rights responsibilities of faith actors; earlier this year, we launched the #Faith4Rights toolkit and related peer-to-peer learning.

Everyone, everywhere, can -- and must -- stand against hate and stand up for human rights.  

My Office will continue working to address all forms of hate speech, including through social media, while protecting freedom of speech.

In that spirit, I wish you a fruitful session and look forward to hearing about the outcome of your deliberations.

Thank you.
