Privacy by design – is there such a thing as too much?

Karishma Brahmbhatt

Privacy by design has been one of the buzzwords of the General Data Protection Regulation (GDPR), with the Regulation going so far as to say that “in order to be able to demonstrate compliance with this Regulation, the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of…enabling the controller to create and improve security features” (Recital 78, agreed and re-numbered draft of the GDPR). The concepts of privacy by design and default are by no means new and are currently referred to, albeit indirectly, in Directive 95/46/EC (Directive), which states that “the protection of the rights and freedoms of data subjects with regard to the processing of personal data requires that appropriate technical and organisational measures be taken, both at the time of the design of the processing system and at the time of processing itself” (Recital 46, Directive). What is novel in the GDPR is the reinvigorated and express emphasis on privacy by design and default as fundamental to achieving compliance with European data protection law. These concepts seem sensible and, until recently at least, have been fairly uncontroversial.

Ok, so WhatsApp with privacy by design then?

Well, that depends on your standpoint. WhatsApp has recently announced the launch of end-to-end encryption for all messages, phone calls, photos and videos transmitted on its network: not only does WhatsApp not store the data transmitted across its network, but that data cannot be read by WhatsApp or by third parties. European data protection authorities have previously intimated that minimising the processing of personal data is an effective way of implementing privacy by design and enhancing data security, the logic being that the less personal data you process, the smaller the commercial risk of non-compliance with data protection laws. Indeed, the WP29 Opinion on App Development states that “app design criteria should include the implementation of the principle of the least privilege by default, whereby apps are enabled to access only the data they really need to make a functionality available to the user”. This message has most recently been reiterated in Recital 78 and Article 25 of the agreed and re-numbered draft of the GDPR. On this basis, WhatsApp’s approach would seem to be wholly consistent with the letter and spirit of European data protection law.
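For readers curious about what “end-to-end” means in practice, the following is a deliberately simplified sketch (it is not WhatsApp’s actual implementation, which is based on the Signal protocol, and the toy cipher used here is for illustration only, not real cryptography). The point it demonstrates is structural: only the two endpoints hold the key, so an intermediary relaying or storing the ciphertext cannot recover the message.

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Expand a shared key into n pseudo-random bytes (toy construction, not secure)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the message with the keystream; applying the same operation
    # a second time recovers the original bytes
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# The two endpoints (sender and recipient) hold the shared key;
# the service provider relaying the message does not
shared_key = os.urandom(32)
ciphertext = encrypt(shared_key, b"meet at noon")

# The provider can store or forward `ciphertext`, but without `shared_key`
# it cannot recover the plaintext -- that is the "end-to-end" property
assert decrypt(shared_key, ciphertext) == b"meet at noon"
```

A government-mandated ‘back-door’, in these terms, would amount to giving a third party a copy of (or a way to reconstruct) `shared_key`, which is precisely what collapses the end-to-end guarantee.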

But whilst the GDPR is expected to significantly (although probably not entirely) iron out discrepancies in the application of data protection law across European member states, its effect in bridging the gap between European attitudes to data protection and those of law enforcement and security agencies across the world is likely to be somewhat less pronounced. The problem with end-to-end encryption, law enforcement authorities argue, is that it creates a type of ‘Dark Web’ where illegal and immoral activities can take place without detection, and so risks the wellbeing, livelihood and even lives of the population at large. WhatsApp, and other companies that seek to maximise privacy by design and default to minimise their data protection liabilities, could therefore find themselves in an Apple-esque situation, whereby law enforcement and security agencies (such as the FBI) insist on the creation of access points to the data.

Interestingly, whilst the European Union, and law enforcement authorities (both within and outside the EU) purport to share the common objective of securing the rights and freedoms of individuals, their means of achieving this objective are at polar ends of the spectrum – the former advocates minimising the processing of personal data to protect the rights and freedoms of individuals, whilst the latter arguably insists on accessing as much personal data as possible to achieve the same aim. These antagonistic approaches have, naturally, manifested themselves in proposed legislation; on the one hand, we have the GDPR. On the other hand, we have the national trend of introducing investigatory powers legislation, as exemplified by the UK Investigatory Powers Bill, and the recently-leaked anti-encryption bill proposed by Senators Dianne Feinstein and Richard Burr in the US, both of which could broadly require companies to install ‘back-doors’ in software to allow security agencies to access unencrypted data. It’s not a case of asking companies to collect new personal data per se – there is already a torrent of personal data flowing through WhatsApp’s network or, in Apple’s case, stored on iPhones. Instead, it’s a case of asking companies to make pre-existing data newly accessible. And once WhatsApp or Apple creates a tiny peephole in their respective networks or devices to allow governments to monitor communications, what’s to stop uninvited third parties from getting a look in? 
Arguably, technological back-doors could undo the benefits created by a privacy by design approach (as they would give companies access to more or less the same type and/or volume of data they would have accessed sans privacy by design) and compromise the extent to which companies could claim compliance with data protection laws (purpose limitation and data minimisation aside, could technical and organisational measures really be considered ‘appropriate’ where they have intentionally been designed with a vulnerability?).

Whilst there is no easy answer to this, WhatsApp’s triumphant end-to-end encryption, together with the now-abandoned Apple v FBI case, only serves to highlight the pressing need for cross-border regulatory dialogue and strategy; commercially, the inconsistency of cross-border regulation risks making data protection compliance a zero-sum game. In the interim, it would be advisable for organisations to continue preparing for the advent of the GDPR and to continue watching this space for test cases and updates as regulators look for practical ways of reconciling the fundamental right to privacy with the right to security.

Comments published on Digital Hub do not necessarily reflect the views of Allen & Overy.
