GDPR, data security and the law of unintended consequences

For the most part, well-drafted laws do not, and should not, suffer from unintended consequences. But occasionally they can have side-effects that legislators would struggle to have foreseen.

The GDPR is generally acknowledged by right-thinking guardians of privacy to be a good law that is having a material positive impact on the protection of personal data and the systems and processes connected with ensuring privacy. Should it now be changed to cover the following scenario?

Over the summer, The Register published a typical Register piece headlined "Talk about unintended consequences: GDPR is an identity thief's dream ticket to Europeans' data". Before and since then, I have encountered many client organisations, including technology and outsourcing providers, that are struggling with the administrative load caused by data subject access requests (DSARs), as well as with the need to create and implement technologies, processes and internal rules to enable them to respond appropriately to the deluge of requests. With the summer and our holidays now a distant memory, it seems the right time to remind ourselves of this sorry story and the issues it raises.

For reasons that will become clearer from this article, James Pavur, an Oxford University doctoral student who apparently "usually specialises in satellite hacking" (I wonder what his thesis is about?), agreed with his fiancée that he would submit a large number of DSARs in her name. Which he duly did.

And he was able to obtain her credit card details, social security number, passwords, and his prospective mother-in-law's maiden name. Over two months, Pavur made 150 DSARs in his fiancée's name. Overall, 72% of the companies approached responded, it seems with all the information requested, and 83 companies disclosed that they held information about her. 24% of the companies Pavur approached accepted no more than an email address and phone number as proof of identity before sending over the data requested. Another 16% requested ID information which, it seems, Pavur could easily have forged. Obviously, he did not need to, as his fiancée was complicit in the scam.

Pavur lectured on his experiences at a recent Black Hat security conference in Las Vegas. The Register quotes him as saying: "Privacy laws, like any other infosecurity control, have exploitable vulnerabilities. If we'd look at these vulnerabilities before the law was enacted, we could pick up on them".

The article further quotes Pavur as opining that this issue should be fixed by legislators and, obviously, data controllers and processors. He recommends that lawmakers should set ID standards for DSAR requests.

Amongst the expected side-effects of the GDPR must surely have been the exponential growth of DSARs. This tale discloses breathtaking naivety, as well as recklessness or negligence, on the part of the companies responding to Pavur's requests, leading to breaches of the very protections the GDPR was designed to provide. How far should EU legislators have taken such naivety, recklessness or negligence into account? Do EU lawmakers need to impose ID standards for DSAR requests, or simply remind data controllers and processors of their obligations and liabilities in disclosing personal data?

I encourage you to read this short article to reach your own conclusions.

But for me, the more serious and important point here is that, as advisers, we must remind our clients of this: in responding to DSARs, they must always have regard to, and apply, basic safe practices in identifying those who request personal information (ostensibly) held about themselves, and certainly before disclosing any of that information to them.
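As a concrete illustration of what such "basic safe practices" might look like inside a DSAR-handling workflow, here is a minimal sketch of an identity gate that refuses disclosure until the requester has passed more than one strong verification check. All names and categories here are hypothetical assumptions for illustration; neither the GDPR nor any regulator prescribes this particular scheme.

```python
from dataclasses import dataclass, field

# Hypothetical classification of proofs of identity. The key point from
# Pavur's experiment: a bare email address and phone number (what 24% of
# companies accepted) would count as WEAK here and never suffice alone.
STRONG_PROOFS = {
    "authenticated_account_login",   # requester logged in to their own account
    "verified_photo_id",             # ID document checked, not merely received
    "knowledge_of_data_on_file",     # matched a detail only the subject holds
}

@dataclass
class DSAR:
    """A pending data subject access request (illustrative, not a real API)."""
    requester_email: str
    proofs_presented: set = field(default_factory=set)

def may_disclose(request: DSAR, required: int = 2) -> bool:
    """Permit disclosure only when at least `required` strong proofs passed."""
    return len(request.proofs_presented & STRONG_PROOFS) >= required
```

Used this way, a request backed only by an email address and phone number is rejected, while one backed by an account login plus a checked photo ID passes: `may_disclose(DSAR("a@example.com", {"email_address", "phone_number"}))` returns `False`.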
