With GDPR's 'Right of Access,' Who Really Has Access?
How a security researcher learned organizations willingly hand over sensitive data with little to no identity verification.
The European Union's General Data Protection Regulation (GDPR) includes a "Right of Access" provision, which gives individuals the right to access their personal data. But what happens when the companies holding that data don't properly verify identities before handing it over?
This became the crux of a case study by James Pavur, a DPhil student at Oxford University, who sought to determine how organizations handle requests for highly sensitive information under the Right of Access. To do this, he used GDPR Subject Access Requests to obtain as much data as possible about his fiancée – with her permission, of course – from more than 150 companies.
Shortly after GDPR went into effect last May, Pavur became curious about how social engineers might exploit the Right of Access. "It seemed companies were in a panic over how to implement GDPR," he explains. Out of curiosity, he sent a few Subject Access Requests, which individuals can make verbally or in writing to ask for access to their information under GDPR.
In these early requests, Pavur asked only for his own data from about 20 companies. He found many didn't ask for sufficient ID before handing it over. Many asked for extensions – GDPR's one-month response deadline can be extended by up to two further months – before sending the data because they didn't yet have processes in place to handle requests. The initial survey took place in the fall of 2018, when GDPR was just getting into full swing, Pavur says.
Phase two came in January, when he decided to run a broader experiment requesting his fiancée's information. Over three to four months, Pavur submitted requests to businesses of different sizes and industries, seeking a range of sensitive data: from typical items like addresses and credit card numbers to more esoteric data like travel itineraries.
He went into the experiment with three pieces of data: his fiancée's full name, an old phone number of hers he found online, and a generic email address. All of these, he notes, are things social engineers could easily find. "The threshold for starting the attack was very low," he says. "Every success increases the credibility of your results in the future." Pavur requested her personal data using these initial pieces of information; as companies responded with the data he asked for, he tailored subsequent requests to be more convincing.
"I tried to pretend like I didn't know much about my fiancée," he continues. "I tried to make it as realistic as possible … tried to not allow my knowledge about her to bias me."
Compared with the early stage of his experiment, Pavur found businesses were better at handling the process when he requested his fiancée's information. Still, responses varied, and there wasn't a consistent way of responding to Subject Access Requests, he says.
"I sort of expected that companies would try to verify the identity by using something they already know," he says. For example, he thought they might only accept an email address linked to a registered account. "I thought that was the best mechanism for verifying accounts."
More than 20 of the 150 companies revealed some sort of sensitive information, he found. Pavur was able to get biographical information, her passport number, and a history of hotels where she had stayed; he was also able to verify whether she had accounts with certain businesses, he notes. The means of verifying his fiancée's identity varied by industry: retail companies asked what her last purchase was, while travel companies and airlines asked for passport information.
Interestingly, some companies started out strong with requests for identity verification, then caved when Pavur said he didn't feel comfortable providing it. One company asked for a passport number to verify identity; when he refused, it accepted a postmarked envelope instead. Some businesses improved their verification over time, he adds, but mistakes are still being made: a handful of organizations accidentally deleted his fiancée's account when asked for her data. He points to a need for businesses to feel comfortable denying suspicious GDPR requests.
Pavur will present the details of his case study this August at Black Hat USA in a briefing titled "GDPArrrrr: Using Privacy Laws to Steal Identities."