The Battle Inside Secure-Messaging App Signal

Monday 8 February 2021

Earlier this year, WhatsApp, the secure messaging app owned by Facebook, announced it was updating its terms of service. The change, allowing Facebook and other businesses to store messages, created a lot of alarm and set off a search for more secure messaging services. One of the biggest beneficiaries has been Signal. The encrypted messaging app gained an additional 2 million users in a 12-hour period after Facebook first announced these policy changes. Since then, Signal has jumped to #1 in the App Store in 70 countries, and reports suggest it has doubled its user base since the WhatsApp policy changes.

However, the rise of Signal and other encrypted messaging apps like Telegram (which reportedly added 90 million users in January) raises questions. There is a trade-off, one that predates the internet, between privacy and regulation. Err too far on the side of privacy and illicit and harmful behaviour goes unchecked. Err too far on the side of regulation and people lose their freedom. This is the same civil liberties debate that played out post-9/11, and the same debate that Edward Snowden sparked with his NSA leaks. A similar debate is now playing out around these encrypted, highly secure messaging apps.

Signal now finds itself right in the middle of this debate. Founded as a non-profit that actively rejects the data-hungry, ad-fuelled business models of Facebook and Google, Signal prioritises user privacy over all else, even profit: it relies on donations. Not even Signal can see the messages sent on its app. This has been a godsend for activists, dissidents, journalists and marginalised groups around the world. Signal has leant into these use cases, including adding a feature that blurs out the faces of protestors, allowing activists to share photos of protests without fear of government retribution.

The concern is that many of these features are open to abuse. Signal CEO Moxie Marlinspike has actively resisted creating a content moderation policy, which leaves the app exposed to abuse by criminal or extremist groups. WhatsApp co-founder and original Signal funder Brian Acton actively resists becoming a “nanny company”. He has said, “Insofar as people use a product in India or Myanmar or anywhere for hate crimes or terrorism or anything else, let’s stop looking at the technology and start asking questions about the people.”

While the sentiment may be noble, it does come off as a little naive. The implications of misuse can be catastrophic (see: Facebook’s role in the genocide in Myanmar). This story is still being written, and Signal and Telegram will be important case studies to watch.

One thing is clear at this point: remove the data-hungry, ad-fuelled business model from social media companies and they still face many of the same problems as their venture-capital-funded cousins around illicit content and dangerous misinformation. It seems likely that if Signal or Telegram achieve the scale of Facebook or Twitter, they will face similar issues, even with their different set of priorities.