Yesterday, I ran WhatsApp on my phone, and this appeared:
It was only on my screen for a second or so, but it made me think about WhatsApp and its future. I do not use Facebook (I think I have an account, but I never use it). Having created my own sites, I find the whole software-as-a-service delivery just a bit lacking in soul. I also have major concerns about Facebook tracking my every move, in the same way that Google does. While I appreciate the services that Google provides me (maps and search), I do not think the trade-off is worth it for Facebook. Along with this, as a researcher, I find that Facebook is perhaps not a great platform for sharing research work.
But what is the future of WhatsApp? I feel secure and protected with it, but now that Facebook owns it, will they move to integrate tracking and the mining of messages? Will they break the end-to-end tunnel? And if they aren't going to do these things, why did they buy it in the first place?
The Increasing Dominance of Silicon Valley
Do you worry that our digital world is now being run by a few companies? Google owns YouTube and AdSense, Facebook owns WhatsApp and Instagram, and Microsoft owns LinkedIn and GitHub. And so when companies start to merge the companies that they own into a shared platform, some people start to worry about anti-trust practices.
And so it has happened… Facebook now wants to rewrite the code around WhatsApp and integrate it with Instagram and Facebook Messenger. This article is based on speculation, and there are currently no details of how Facebook will integrate the packages. What is known is that Facebook bought WhatsApp for its customer base, but is stuck with end-to-end encryption, which means it cannot mine user interactions.
It ends up as a Catch-22 situation. Users like the lack of adverts, but Facebook perhaps needs them in order to gain value from its investment, or at least wants to share the metadata on its customers across the companies it owns. At this point, there are no details on where in the tunnel Facebook might punch into (or whether it will at all).
The spin is that all three apps will come together onto a single messaging system, which makes sense, but this may give Facebook an opportunity to rewrite the code and break into the tunnel. At the present time, end-to-end encryption is disabled by default on Facebook Messenger (it must be enabled for each chat), and there is no such feature on Instagram. The tension with proper end-to-end encryption is that Facebook would obviously like to know what people are talking about, and thus their interests, and government agencies would like to be able to tap into communications. And so it could become a battle around user privacy.
The founders of both WhatsApp and Instagram have left Facebook, with Brian Acton — a co-founder of WhatsApp — being particularly averse to the business model of Facebook.
How Could WhatsApp be Broken?
A possible implementation is for WhatsApp to become less secure and the other two to improve their security, along with the metadata being shared across the applications. In this way, it would join all their users together and share the tunnelled encryption methods (thus improving the security of Instagram and Facebook Messenger), while supporting possible cross-sharing of users' data. This would allow Facebook to paint a better picture of who people actually communicate with, and cluster them into areas of interest.
So let's say that Bob and Victor are Instagram users, Carol is a WhatsApp user and Alice is a Facebook user. Facebook could then merge their accounts into a common shared infrastructure, where each person would have a single ID. Facebook would then be able to map users better across each of the platforms, but would not be able to mine their messages:
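A minimal sketch of this kind of account merging might look as follows. The field names and structure here are purely illustrative assumptions, not Facebook's actual schema:

```python
# Hypothetical sketch: per-app accounts merged under one shared identifier.
# The schema is an assumption for illustration only.
accounts = [
    {"app": "Instagram", "user": "Bob"},
    {"app": "Instagram", "user": "Victor"},
    {"app": "WhatsApp",  "user": "Carol"},
    {"app": "Facebook",  "user": "Alice"},
]

# Assign each distinct person a single cross-platform ID and
# record which apps they appear on.
merged = {}
for i, acct in enumerate(accounts, start=1):
    merged.setdefault(acct["user"], {"id": i, "apps": []})
    merged[acct["user"]]["apps"].append(acct["app"])

print(merged["Bob"])   # {'id': 1, 'apps': ['Instagram']}
```

With this shared index, the metadata (who talks to whom, and on which platform) can be correlated across apps even while the message contents stay encrypted.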
Currently, WhatsApp uses a pure end-to-end tunnel, where no one can sit in between the communications, and where the encryption key is negotiated directly between Bob and Alice using ECDH (Elliptic Curve Diffie-Hellman). A less pure system places a proxy before the tunnel, allowing the application to mine the messages before and after the tunnel:
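The key negotiation can be sketched with a toy Diffie-Hellman exchange. WhatsApp actually uses ECDH over Curve25519; this version uses small modular arithmetic purely for readability, and the parameters are deliberately insecure demo values:

```python
import secrets

# Toy Diffie-Hellman: Bob and Alice agree on a key that is never transmitted.
# Real systems use elliptic curves (e.g. Curve25519) or 2048-bit+ prime groups;
# this small prime is for illustration only and is NOT secure.
p, g = 2038074743, 2

# Each party picks a random private exponent and publishes g^x mod p
alice_priv = secrets.randbelow(p - 2) + 1
bob_priv = secrets.randbelow(p - 2) + 1
alice_pub = pow(g, alice_priv, p)
bob_pub = pow(g, bob_priv, p)

# Each combines their own private value with the other's public value
alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)

assert alice_shared == bob_shared   # same key on both sides
```

Only the public values cross the network; an eavesdropper who sees them cannot feasibly recover the shared key, which is why no proxy in a pure end-to-end design can read the messages.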
The proxy then allows messages to be mined (and law enforcement to tap into the communications). The best solution for Facebook, though an unlikely scenario, would be to mine the communications of its users for keywords, and then feed advertisements to them either on the platform or through its other social media channels:
While this would be good for Facebook's revenue, it is unlikely to be implemented, as it would decrease user trust in Facebook and could see many users moving to Telegram. There is no way that Facebook could even consider the following:
In this way, Facebook either keeps a copy of the key that is created for the tunnel, and is able to mine the conversation, or it creates a “Facebook-in-the-middle” and breaks the tunnel. Either approach would also allow law enforcement to snoop on the communications. This method would be seen as a core breach of privacy laws and has little chance of being implemented.
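The “provider-in-the-middle” variant can be sketched with the same toy Diffie-Hellman idea: the proxy runs a separate key exchange with each endpoint, so Alice and Bob each believe they have an end-to-end key, while the proxy holds both. This is an illustrative assumption about how such a break could work, not a description of any real deployment:

```python
import secrets

# Toy parameters, for illustration only — NOT secure.
p, g = 2038074743, 2

def dh_pair():
    """Generate a (private exponent, public value) Diffie-Hellman pair."""
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

alice_priv, alice_pub = dh_pair()
bob_priv, bob_pub = dh_pair()
proxy_priv, proxy_pub = dh_pair()

# The proxy substitutes its own public value during the exchange:
# Alice thinks proxy_pub is Bob's key, and Bob thinks it is Alice's.
key_alice_side = pow(proxy_pub, alice_priv, p)
key_bob_side = pow(proxy_pub, bob_priv, p)

# The proxy can compute both keys, so it can decrypt, read, and
# re-encrypt every message passing between the two tunnels.
assert pow(alice_pub, proxy_priv, p) == key_alice_side
assert pow(bob_pub, proxy_priv, p) == key_bob_side
```

This is exactly what safety-number (key fingerprint) verification in apps like WhatsApp is designed to detect: if the key changes, the fingerprints no longer match.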
Facebook and Trust
With the Cambridge Analytica scandal, trust in Facebook is at an all-time low. Last week, as part of a lecture, I asked students to rate their level of trust in cloud service providers, from 1 (no trust) to 10 (full trust). We can see that Apple and Google were the most trusted, with Facebook and Twitter trailing well behind:
This backs up nearly every poll we have ever run on trust in these companies, in which Facebook is always the least trusted company for gathering data and for providing federated identity checking.
Facebook and GDPR
In 2017, a court ruled that Facebook could not transfer any WhatsApp data from its German users. The case started in Germany after new T&Cs were pushed to users (August 2016), when the Hamburg-based data-protection commissioner Johannes Caspar demanded that Facebook stop transferring WhatsApp data from German users to Facebook. He also demanded that Facebook delete all the data it had already gathered from German citizens.
His argument was that users had not given their genuine consent to this data being harvested. After the ruling, Facebook cannot, at the present time, transfer any WhatsApp data for its more than 35 million German users.
As a result, Facebook paused the transfer of WhatsApp data to Facebook across Europe and is speaking with European regulators. But while Facebook has paused the data transfer, Caspar wanted the immediate deletion of all the data already gathered from WhatsApp for German citizens.
Do you trust Facebook? We increasingly need to build a world which respects the rights to privacy and consent, and Facebook is perhaps not the platform to start with.