"Empathy and love can make a difference against hate speech"
The ZHAW digital event revolved around hate speech, fake news and their effects on society. Three experts discussed what is being done to combat hate on the internet and how digital technologies can help.
Protecting the population from hate speech and disinformation on the internet calls for a broad discussion in Switzerland. That is why the Federal Council last week instructed DETEC to examine whether and how communication platforms could be regulated. According to the Federal Council, the Swiss population fears increasing exposure to false news on social networks and video portals.
Against this backdrop, ZHAW digital hosted the event "Hate via Messenger: Hate Speech, Political Polarization and Democracy", where the experts Sophie Achermann, Judith Möller and Céline Külling answered many of the participants' questions.
Empathy and algorithms against hate speech
Sophie Achermann is executive director of alliance F, the largest and oldest umbrella organisation of Swiss women's organisations. She is co-project leader of the Stop Hate Speech project, which aims to combat hate speech on the internet by combining technical and civil society approaches. The project is developing an algorithm to automatically classify hate speech. A project at the ZHAW School of Engineering shows that this is no simple task.
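The Stop Hate Speech project's actual classifier is not described in detail here. Purely as an illustration of why automatic classification is difficult, the sketch below shows a naive keyword-based flagger (the word list and the function `flag_comment` are hypothetical, not part of the project): it misses hateful messages that use no listed word and flags harmless sentences that happen to contain one, because it ignores context entirely.

```python
# Illustrative only: a naive keyword-based hate-speech flagger.
# HATE_KEYWORDS and flag_comment are hypothetical examples, not the
# Stop Hate Speech project's actual algorithm.

HATE_KEYWORDS = {"idiot", "vermin", "scum"}  # made-up word list

def flag_comment(text: str) -> bool:
    """Flag a comment if it contains any listed keyword."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & HATE_KEYWORDS)

print(flag_comment("You are all vermin!"))            # True
print(flag_comment("Vermin control in the garden"))   # True: false positive,
                                                      # context is ignored
print(flag_comment("Go back to where you came from")) # False: hateful intent
                                                      # with no listed keyword
```

The two failure cases in the example hint at why research projects turn to machine-learning models trained on annotated comments instead of fixed word lists.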
Achermann also reported on strategies for countering hate speech with targeted counter-speech. To do this, she and her team responded to hate comments on Twitter, each time using a different form of communication, such as humor or sarcasm. One strategy proved particularly successful: when the team responded to tweets in an empathetic and understanding way, the original hate tweets were more likely to be deleted afterwards. "It's nice to know, and also obvious, that empathy and love can make a difference against hate. Our language has an impact on other people," Achermann said.
Algorithms and the YouTube "rabbit hole"
The second expert of the evening, Judith Möller, is Associate Professor of Political Communication at the Department of Communication Studies at the University of Amsterdam and Adjunct Associate Professor at the Department of Sociology and Political Science at the University of Trondheim. She focuses on the impact of political communication, particularly on social media, and spoke during the event about how algorithms, such as those used by Facebook and YouTube, can help create communities that promote hate and false news.
On social media, access to information and its sources are far more diverse than in traditional journalism, but not all of them lead to democratic dialogue, Möller said. Online media are also optimised for performance, aiming for more clicks, longer dwell times and more interactions. The emotions that content triggers also matter, she said. "Anger is a very mobilising emotion. Content that makes people angry is clicked on more often. There is also a financial interest behind it. That's why anger is an important emotion for these platforms," Möller said.
On the video platform YouTube, this can trigger an information spiral. Once people start watching such videos, the recommendation algorithm steps in and points them to similar content. "It's known that this algorithm is fast and highly radicalising. If you follow the recommendations, you quickly end up with videos that are much more extreme," Möller explained.
Hate Speech vs. Cyberbullying?
What is the difference between hate speech and cyberbullying? Céline Külling from the ZHAW Department of Applied Psychology is co-author of the JAMESfocus report 2021 on hate speech among young people. "With cyberbullying, you usually know the perpetrator personally; with hate speech on the internet, it's more anonymous. Cyberbullying is often tied to bullying in everyday life and is directed at a specific person. Hate speech tends to be directed at entire groups," Külling explained.
Responsibility on the part of tech companies
The experts agreed that everyone in society must be involved in countering hate speech and fake news. "No one should sit back and do nothing. But the role of technology needs more transparency," Möller said. Achermann shares this view: "Tech companies should be transparent to the extent that research can be done with the data." But civil society and the federal government are also called upon, she added. Achermann's project shows that everyone can take action, and that with more empathy, a lot can be achieved online.