
Testimony to the Standing Committee on Justice and Human Rights (JUST) 42nd Parliament, 1st Session (Regarding Online Hate), 2019


Geoffrey Cameron, Baha’i Community of Canada

Standing Committee on Justice and Human Rights (JUST) 42nd Parliament, 1st Session

Meeting 143

Thursday, April 11, 2019, 9:45 a.m. to 10:45 a.m., Room 420, Wellington Building, 197 Sparks Street

Online Hate

I would like to thank the committee for inviting my testimony today, as a representative of the Baha’i Community of Canada. I am also appearing as a member of the Executive Committee of the Canadian Interfaith Conversation, a national body that seeks to foster and promote religious harmony and dialogue.

Baha’is, as members of a religion that has been in Canada since the late 1800s, and which has established communities in most localities in the country, are not the targets of online hate in Canada.

However, this issue is of particular concern to our community, first and foremost, because of core teachings of the Baha’i Faith regarding the promotion of the fundamental oneness of humanity and the elimination of all forms of prejudice. Public or private expressions of hatred towards groups of people, whether online or offline, are inimical to these beliefs.

We have joined with many other faith and civil society groups to call for the study of the root causes and potential solutions to the rising incidence of online hate, which has been directly connected to violent attacks on particular groups. Women, Muslims, Jews, Sikhs, and racial minorities have been among the most recent targets of violence that was inspired by hatred spread online.

The recent attacks on Muslims at prayer at two mosques in Christchurch, New Zealand; the van attack in downtown Toronto; the attack on Jewish worshippers at the Tree of Life synagogue in Pittsburgh; and the shooting at the Islamic Cultural Centre of Quebec City were all carried out by killers who had spent extensive time in digital worlds of hatred.

As Professor Richard Moon has found, “hate crimes are committed most often… by individuals who have immersed themselves in the extremist subculture that operates at the margins of public discourse, and principally on the Internet.”

Sadly, this is also a problem with which Baha’is have first-hand experience in other countries. In the most egregious case of Iran, a government-supported media campaign of defamation and incitement to hatred has been directly tied to outbursts of violence and murder targeting Baha’is. A similar pattern has begun to proliferate in nearby Yemen.

It is clear from a growing body of experience that the spread of online hatred targeting a defined group can lead individuals, perhaps already inclined to bigoted thinking, to act with violence.

What should be done about this problem? Any lasting solution has to take into consideration the roles and responsibilities of individuals, groups, and corporations, as well as the institutions of government.

With regard to government, I will refrain from commenting on the question of whether Section 13 of the Canadian Human Rights Act should be reinstated, or whether the hate speech provisions in the Criminal Code are sufficient to prosecute cases of online hate. There is a delicate balance to be struck between guaranteeing the free exchange of ideas in the public sphere and sanctioning those whose aim is not to advance truth but to spread hatred. Clearly the government, and by extension the courts, has a role to play in prosecuting cases of hate speech.

It is also increasingly clear that policy intervention by government is needed to mitigate the impact of the more egregious misuses of online social networks. Despite recent steps taken by Facebook and Twitter to remove certain accounts, government also has a role to play in regulating these platforms. Any effective policy intervention must also ensure national and local community involvement in determining the standards for online platforms. As David Kaye, UN Special Rapporteur on the right to freedom of expression, has urged, such standards are better grounded in international human rights norms than in the arbitrary judgements of the commercial platforms themselves. This includes delineating the rights and responsibilities of users, as well as safeguards to ensure that freedom of expression is not unduly curtailed.

However, government action by itself is insufficient. There is also a role for civil society in pushing these companies further in the right direction, beyond the letter of the law. One organization, called Change The Terms, has called on tech companies like Facebook, Google and Twitter to take steps to curb the use of social media, payment processors, event-scheduling pages, chat rooms and other applications for hateful activities. There are concrete steps that these powerful companies, which are accountable both to government and to the wider society, can take to create a healthier public sphere for all of us.

Finally, there is an educational responsibility that falls to community leaders, teachers, families, and parents. Changes in the attitudes, values, and behaviours of individuals are a necessary part of the solution.

The online environment is ultimately a mirror of our society. We live in a world in which prejudice against certain groups is propagated by many people, even those who do not intend to provoke violent actions.

Religious leaders have a particular responsibility to educate people to promote fellowship and concord, and not to stoke the fires of fanaticism and prejudice.

Young people, especially, need access to education that teaches them, from the earliest years, that humankind is one family. They require education and mentorship that go beyond a simplistic condemnation of hatred or a set of dos and don’ts regarding their online activities. Youth need help to develop a strong moral framework within which they can make decisions about their online activities, such as which content they choose to share and consume, and how they use their powers of expression when communicating with friends and strangers.

Any long-term solution to online hatred has to give due consideration to the generation that is presently coming of age in an information environment that is confusing, polarizing, and indifferent to their moral and ethical development. From where do young people learn to express themselves, using language that is intended to educate rather than dismiss or denigrate? As they seek to learn about social issues, how will they know the difference between intelligent criticism and hateful propaganda? What ethical tools and social support are we giving to them as they navigate the online world?

Answering these questions is a responsibility that does not fall only to government; it is part of a response to online hate that we must all take up and carry forward.