Facebook Messenger’s New ‘Dangerous’ Update: Why You Should Worry


If you’re one of Facebook Messenger’s 1.3 billion users, Meta’s determination to radically change your app gained momentum this week. Despite multiple warnings that this update is a dangerous step in the wrong direction, Meta isn’t listening.

We’re talking end-to-end encryption, of course, and Meta/Facebook’s ongoing program to bring the same level of security that protects WhatsApp to Messenger and even Instagram. Those plans, first announced in 2019, have apparently been plagued by technical challenges, and a global rollout is not expected before the end of 2023.

Beyond the technical workload, Meta has been engaged in a tug of war over this plan. Lawmakers and security agencies already lament their inability to penetrate end-to-end encryption, describing it as “a gift to terrorists and pedophiles,” and they don’t want to see the problem get worse.

Meta’s latest foray was to commission a two-year report from Business for Social Responsibility (BSR) on the impact of such encryption on human rights. The report concluded that “the expansion of end-to-end encryption enables the realization of a wide range of human rights” and recommended a series of integrity and security measures to address conflicting human rights.

The report is right, and yet Meta is wrong. Yes, we should all use end-to-end encrypted messaging apps, but no, not all messaging apps need to be end-to-end encrypted. Meta’s WhatsApp is the biggest proponent of such security, with 2 billion users compared to Messenger’s 1.3 billion. WhatsApp’s penetration in developing and autocratic markets, where the need for such security is more acute than in the West, is near universal. Meta already meets the needs advocated by this report.

Further, the report’s conclusion that expanding encryption enhances “privacy, freedom of expression, protection against cybercrime threats, physical security, freedom of belief and religious practice, and protection against state-sponsored surveillance and espionage,” has been served by the growth of apps such as Signal and Telegram, although, ironically, the latter is not end-to-end encrypted by default.

So what’s the difference? Simply put, while full end-to-end encryption makes absolute sense on dedicated messaging apps such as WhatsApp and Signal, the same is not true when it is tied to social media platforms. I can’t browse WhatsApp, looking at profiles and photos and selecting people to contact. I cannot hide my identity behind a fake profile on WhatsApp. And, most importantly, WhatsApp isn’t a sticky platform for kids the way Facebook and Instagram (as well as TikTok) are.

As children’s charity the NSPCC has warned, this proposed Messenger update risks “failing to protect children from preventable harm.” The charity points out that “10% of sexual offenses against children on platforms owned by Facebook take place on WhatsApp, but they account for less than 2% of child abuse cases the company reports to police because they can’t see the content of messages,” making this new update very high risk. We saw this last year, when Facebook’s evidence was key to capturing “one of the most dangerous pedophiles on the web,” a capture investigators say would not have been possible with end-to-end encryption.

Facebook’s answer is to increase AI metadata analysis. “We build strong security measures designed to prevent harm from happening in the first place,” the company told me, “and give people the control to react if it does happen. Working together also gives us more information to identify abusive accounts and allows us to introduce security features… like restricting interactions between adults and minors.”

Unsurprisingly, the new report echoes this, recommending, in Meta’s words, that “we continue to invest in effective damage prevention strategies such as metadata and behavioral analytics, user education, and robust user reporting, among other tools.”

That said, the report criticizes client-side scanning, which it said would “undermine the integrity of E2EE and disproportionately restrict people’s privacy and a range of other human rights.” Bad news for Apple, which has embarked on the path of client-side scanning. Ironically, in this regard, Meta’s criticism of Apple on the privacy front is well justified. Client-side scanning is a bad idea. Period.

In response to child safety concerns, Meta now says “the impacts of E2EE go far beyond such a simplistic ‘privacy vs. safety’ framework.” Again, this is true when it comes to dedicated messaging apps, but wrong when it comes to social media platforms. Meta assures that “our machine learning technology will examine the unencrypted parts of our platforms, such as account information and photos uploaded to public areas, to help detect suspicious activity and abuse.” But that won’t be enough: such analysis has only marginal effectiveness when the content itself is “dark.”

As one WhatsApp insider told me, “Half my day is explaining to people that WhatsApp isn’t a social network,” and that’s the difference they’re referring to.

There is some acknowledgment of this in Meta’s indefinite hiatus on “Instagram Kids,” and in response to this report it says that while “we have not yet determined how and whether to implement E2EE in Messenger Kids, we’re committed to maintaining the same strong parental controls in Messenger Kids as we expand E2EE, including the ability for parents to control who their children can message and when.”

And this is the crucial point. Facebook Messenger’s end-to-end encryption is not required. Messenger is not a dedicated messaging app, but a communication tool linked to a social media platform. The trade-off between security and privacy suits WhatsApp, but not Facebook itself. And given that Facebook has admitted in the past to monitoring Messenger traffic, I see little controversy in a client-server/server-client encryption architecture, just as Telegram works by default, with some form of automated CSAM or child safety analysis in the middle.
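The architectural distinction drawn above can be sketched in a few lines of Python. This is a toy illustration only, with a deliberately insecure XOR “cipher” standing in for real cryptography; the function names, key, and blocklist are all invented for demonstration:

```python
# Toy contrast between client-server encryption (server holds the key,
# so it can decrypt and scan content in transit) and end-to-end
# encryption (server relays opaque bytes it cannot read).
# XOR is NOT real cryptography -- illustration only.

def xor_cipher(message: bytes, key: bytes) -> bytes:
    """Symmetric toy 'cipher': applying it twice recovers the message."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(message))

BLOCKLIST = [b"abuse"]  # hypothetical stand-in for automated CSAM analysis

def relay_client_server(ciphertext: bytes, server_key: bytes):
    """Client-server model (Telegram-style default chats): the server
    decrypts, scans the plaintext, then re-encrypts before forwarding."""
    plaintext = xor_cipher(ciphertext, server_key)
    flagged = any(term in plaintext for term in BLOCKLIST)
    return xor_cipher(plaintext, server_key), flagged

def relay_e2ee(ciphertext: bytes):
    """E2EE model (WhatsApp/Signal): only the endpoints hold the key;
    the server can inspect metadata, never content."""
    return ciphertext, False  # content is "dark" to the server

key = b"shared-secret"
msg = xor_cipher(b"message containing abuse material", key)

_, flagged_cs = relay_client_server(msg, key)
_, flagged_e2e = relay_e2ee(msg)
print(flagged_cs, flagged_e2e)  # prints: True False
```

The point of the sketch is the asymmetry: the same message is detectable in the client-server model and invisible in the E2EE model, which is exactly the trade-off the article argues should be resolved differently for social platforms than for dedicated messengers.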

As the BSR report says, we should promote access to end-to-end encryption for everyone, and its use has made users safer and more secure. But let’s not treat that as an absolute; it isn’t. “If content is shared and we don’t have access to that content,” a Meta official warned last year, “if it’s content we can’t see, then it’s content that we cannot report.”

For Meta, there is another risk inherent in its comprehensive Messenger and Instagram encryption plans. There is currently an uneasy standoff between technology companies and regulators over encryption. Barring EU overreach or something similar, WhatsApp or Signal encryption is unlikely to be broken. But if Meta insists on extending this encryption, it seems likely that regulators and lawmakers will demand a compromise, and that compromise will likely affect the existing encryption as well as any extension. It might serve Meta’s desire to build an unbreakable in-app messaging giant, but it won’t serve WhatsApp’s 2 billion users.

