Much of the debate around the spread of misinformation and online harassment has been focused on the biggest social media platforms, including Facebook, Twitter, Instagram, YouTube and, more recently, TikTok. Messaging apps, like WhatsApp, Facebook Messenger, and WeChat, and the increasingly popular Telegram and Signal, have nearly as many users as these platforms and are also rife with disinformation, hate speech and even incitements to violence. At the same time, they are also vital tools for activists and journalists the world over who need to communicate safely and privately.
As Facebook and Twitter have started doing more to curb their problems, extremists, activists and conspiracy theorists are moving to messaging apps that are more secure, and less censored. Is this a positive for freedom of speech and the right to privacy? Or another blow to the fight for access to accurate information and freedom from discrimination? The complex nature and use of these apps deserves an equally complex response, and certainly more attention, from anyone seeking to protect human rights in a digital age.
From protecting free speech and providing civic space to hosting extremists and spreading disinformation
Messaging apps are not just used for one-to-one messaging (or calling). Groups and channels make it easy to quickly spread messages to a broader audience. In terms of total number of users, WhatsApp is almost as big as Facebook and YouTube. Telegram, WeChat, QQ, Snapchat, and Facebook Messenger are all bigger than Twitter. WhatsApp was the most downloaded app from 2016–19, pipped to the post in 2020 only by TikTok. WhatsApp is the most popular messaging platform worldwide, and in most individual countries, and the most used social media platform more generally in many markets, including in Nigeria, Germany, and Colombia.
In fact, the US stands out in its low usage of WhatsApp, especially compared to Latin America, Africa and Asia. This may help explain why less attention is paid to the role of messaging apps in disinformation campaigns, given that much of the debate around social media and disinformation focuses on the US.
WhatsApp, Telegram and Signal have become popular because of their use of end-to-end encryption (although on Telegram it is not enabled by default), meaning that, in theory, only the sender and intended recipient can read messages. This makes surveillance and interception of messages much more difficult. Unlike on other messaging platforms, no third party holds the encryption keys, so law enforcement and intelligence agencies cannot get access to messages through the technology companies.
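The reason companies cannot simply hand messages over is that the keys are derived on the two devices and never touch the server. As a rough intuition only, here is a deliberately toy Diffie-Hellman exchange with textbook-sized numbers; this is not the actual Signal or WhatsApp protocol (which uses elliptic-curve exchanges such as X25519 plus a ratcheting scheme), just the underlying idea:

```python
# Toy Diffie-Hellman key exchange: the relaying server only ever sees
# the public values, so only the two endpoints can derive the key.
# Hypothetical textbook-sized numbers; real apps use elliptic curves,
# not a 5-bit prime.
import hashlib

P, G = 23, 5                        # publicly known modulus and generator
alice_private, bob_private = 6, 15  # private keys; never leave each device

# Only these values cross the network; the server relays them but
# cannot feasibly work backwards to the private keys.
alice_public = pow(G, alice_private, P)  # 8
bob_public = pow(G, bob_private, P)      # 19

# Each side combines its own private key with the other's public value
# and arrives at the same shared secret.
alice_shared = pow(bob_public, alice_private, P)
bob_shared = pow(alice_public, bob_private, P)
assert alice_shared == bob_shared == 2

# The shared number is hashed into a symmetric key that encrypts the
# actual messages end to end.
session_key = hashlib.sha256(str(alice_shared).encode()).digest()
print(len(session_key))  # 32
```

Because the private values never leave the endpoints, there is no central keystore for a company to surrender; proposals for an 'encryption back door' therefore amount to weakening the scheme itself rather than unlocking a database.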
As a result, private messaging apps have long been the preferred tool of communication for democracy, human rights and environmental activists, and for journalists seeking secure lines of communication. Where Iran’s Green Revolution and the Arab Spring became known as Twitter and Facebook revolutions, the protests and revolutions of the past half decade are much more likely to have been organised over WhatsApp, Telegram or Signal. Telegram groups were used to organise civil disobedience in Hong Kong. Members of Extinction Rebellion primarily used Signal for internal communication, and turned to Telegram when they wanted to communicate with bigger groups. When Telegram was blocked in Russia in 2018, members of Pussy Riot and Aleksei Navalny were among the thousands who joined demonstrations.
But since their founding, encrypted messaging apps have also been popular with more nefarious actors. ISIS used Telegram to claim responsibility for terrorist attacks, and a range of other encrypted messaging apps to radicalise, recruit and organise. These apps have also had problems with the spread of child abuse imagery. While the debate around the spread of disinformation in the 2016 elections and referendum in the US and UK mostly focused on Facebook, WhatsApp has become a new frontier in disinformation, including in relation to democracy. It has been blamed in particular for the spread of ‘fake news’ and ‘incendiary messages’ in India and Brazil.
The events of the past year have brought these tensions to the fore – and to the US
In the first weeks of the pandemic, WhatsApp saw a 40% surge in usage. It also became a key vector for Covid-related misinformation around the world. Signal downloads spiked during Black Lives Matter protests over the summer. Outside of the US, protests in the latter half of the year in Belarus became known as the ‘Telegram Revolution’.
Should we be concerned that far-right extremists are moving to these apps with even less moderation than the platforms on which they have already flourished? On the one hand, it is certainly harder to monitor, and so to combat, extremism, hate speech and disinformation when more of it is occurring in end-to-end encrypted chats. On the other, these actors are likely to have less reach off mainstream platforms. Without the help of YouTube and Facebook algorithms, it becomes much harder to find extreme content, whether in groups or channels on messaging apps, without knowing what you are looking for. Extremists have long existed on messaging apps, and will continue to do so, but the hope is they will radicalise fewer people if they are limited to private messaging.
A balancing act
These apps have consistently served both benign and dangerous purposes. As their popularity has grown over the past year, this duality has only been reinforced. Could these apps be doing more to tackle their problems? They have started to take some steps to stem the flow of disinformation and limit the uses of their apps to incite violence and other abuses. WhatsApp introduced policies to limit forwarding and encourage users to fact check messages. Signal, too, has set limits on forwarding and group sizes, although its groups can be much larger than WhatsApp's. Telegram has blocked ISIS channels and American channels calling for violence, but its founder is also an ‘extreme libertarian’, according to the New York Times, and has refused to cooperate with authorities in several countries.
This duality has made encrypted messaging apps a cause for concern for governments around the world, both democratic and less so. In 2016, Freedom House reported that ‘In a new development, the most routinely targeted tools this year were instant messaging and calling platforms, with restrictions often imposed during times of protests or due to national security concern.’ Governments in at least a dozen countries, including Iran, Russia and Indonesia, have attempted to ban or censor Telegram, and WhatsApp has been blocked in a similar number. Others, including Australia, India, the US and UK, have called on tech companies to provide an ‘encryption back door’ to allow law enforcement access to encrypted messages.
Most experts seem to agree that these apps are still a net good for communication, and ultimately for human rights, in protecting free speech and giving a space for activists to organise. Any efforts by governments to regulate private messaging apps must acknowledge and balance these concerns carefully. Democratic governments must recognise that their calls for a ‘backdoor’ into encrypted messages risk legitimising the efforts of more authoritarian States to ban or censor these apps, in full knowledge of how powerful they are.
What does this mean for defending human rights?
Arguments about how to handle these apps go right to the heart of the challenges of balancing vital rights, such as freedom of expression and the right to privacy, in the digital era. They make it even more obvious that we do not yet have the answers. It is clear that WhatsApp, Telegram, Signal and their ilk cannot be left out of the conversation about the ills of social media, its influence and potential for harm – online and offline. But even more so than Facebook, Twitter and YouTube these days, messaging apps are also powerful connectors and havens for free speech. In places where this is all too lacking, they offer vital access to uncensored information and to a civic space in which people can organise.
The diversity in how these apps are used around the world means national legislators may end up taking too one-sided a view of these apps depending on their national context: seeing them either as dangerous havens for hate speech and extremism, or dangerous windows for free speech and organising. Regional and international organisations are well placed to balance these perspectives, especially if working in cooperation with global civil society. They should also play a role in facilitating cooperation between governments and the private companies that own these apps, given their often hostile relationships in the past. Any conversations must fully acknowledge the power of messaging apps for both good and bad, and think beyond simplistic views of national interest.