Last month, WhatsApp added end-to-end encryption for chat backups stored on cloud-based services like Google Drive and iCloud. Now, it’s adding another element, with optional access restrictions via a password or encryption key.
As you can see in this image, when you encrypt your chat backups, you’ll also have the option to use a password or encryption key, further locking down your data and keeping your information private.
As explained by WhatsApp:
“No other global messaging service at this scale provides this level of security for their users’ messages, media, voice messages, video calls, and chat backups.”
Neither WhatsApp nor your backup service provider will be able to read your backups or access the key required to unlock them. That’s a key point for WhatsApp to underline, especially given the recent user backlash over its updated data-sharing processes, which now see more data about your interactions with businesses within WhatsApp shared with Facebook.
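That guarantee rests on the key never leaving the user’s device: the cloud provider stores only ciphertext, and the decryption key is derived from (or protected by) the user’s password on the client. WhatsApp’s actual scheme is more involved (its password option routes through hardware security modules on WhatsApp’s side), so the sketch below shows only the general principle of client-side password-based key derivation, using Python’s standard library. The function name and parameters are illustrative, not WhatsApp’s protocol.

```python
import hashlib
import hmac
import os


def derive_backup_key(password: str, salt: bytes) -> bytes:
    """Derive a 256-bit key from a password with PBKDF2-HMAC-SHA256.

    This runs on the client; only the salt (not the password or the
    derived key) would be stored alongside the encrypted backup.
    """
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)


salt = os.urandom(16)
key_a = derive_backup_key("correct horse battery staple", salt)
key_b = derive_backup_key("correct horse battery staple", salt)
key_c = derive_backup_key("wrong password", salt)

# The same password and salt deterministically reproduce the key...
assert hmac.compare_digest(key_a, key_b)
# ...while a wrong password yields an unrelated key, so a server that
# holds only salt + ciphertext cannot decrypt the backup.
assert not hmac.compare_digest(key_a, key_c)
print("derived 256-bit key:", key_a.hex()[:16] + "...")
```

The design choice this illustrates: because derivation is deterministic from the password, nothing secret needs to be stored server-side, which is why neither WhatsApp nor the storage provider can unlock the backup.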
But to be clear, that does not apply to your personal message interactions, a key feature of WhatsApp. Your person-to-person interactions in the app are still encrypted and fully protected, with this new process adding a further layer of security to ensure that no one, not even law enforcement or other authorities, can see what you’re sharing in the app.
WhatsApp’s original update to encrypt chat backups closed off a key loophole in this respect, with law enforcement agencies previously able to tap unencrypted chat backups on Google and Apple servers in order to access user information and interaction history. That access, according to some regulators and officials, serves an important purpose, with Facebook’s gradual move toward broader encryption offering more cover for criminals who use its messaging services to conduct illegal activity.
The National Society for the Prevention of Cruelty to Children, for example, has argued that any move to further restrict such access by law enforcement will increase the potential for use of these platforms among perpetrator groups.
As per NSPCC chief executive Peter Wanless:
“Private messaging is at the front line of child sexual abuse, but the current debate around end-to-end encryption risks leaving children unprotected where there is most harm.”
Government agencies agree. Back in October 2019, representatives from the US, UK and Australia co-signed an open letter to Facebook which called on the company to abandon its full messaging encryption plans, arguing that it would:
“…put our citizens and societies at risk by severely eroding capacity to detect and respond to illegal content and activity, such as child sexual exploitation and abuse, terrorism, and foreign adversaries’ attempts to undermine democratic values and institutions, preventing the prosecution of offenders and safeguarding of victims.”
The governments of each region have asked Facebook to provide, at the least, ‘backdoor access’ for official investigations, which Facebook has repeatedly refused.
Facebook reinforced its push on this front in August, by adding end-to-end encryption for voice and video calls in Messenger, while it also launched a new experiment that would see the expansion of its encryption options to Instagram Direct.
Which, given Facebook’s broader plan to integrate all of its messaging services, is inevitable – but that project continues to raise the ire of law enforcement advocates, who view the move as a danger to the public.
But again, Facebook continues to push ahead. With the added motivation of connecting its back-end systems to avoid a possible break-up of the company’s components, were it ever found to have breached anti-trust rules, it seems that Facebook will indeed encrypt all of its messaging services, despite opposition.
Is that a good thing? The broader consumer push toward data security and control is a factor here, and Facebook’s updates align with that shift. But, as noted, it may well offer additional protection for criminal networks, who’ll then be able to use more platforms, with more reach, without fear of detection.
That seems like a significant downside to increased privacy, but these are the impacts that need to be balanced in the evolving debate around data usage.