The Limitations of the EU's GDPR


GDPR falls short of serving as a standard for algorithmic accountability because of its fraud clause, lack of consent guidelines, and allowance of complex data chains.

The problem of algorithmic accountability calls for justification in algorithmic decisions [1]. The General Data Protection Regulation (GDPR), effective within the EU since 2018, attempts to achieve this through its Article 22, which highlights the data subject’s right to informed consent on data handling [2]. GDPR falls short of serving as a standard for algorithmic accountability because of its fraud clause, lack of consent guidelines, and allowance of complex data chains.

First, Article 22 does not apply if the decision “is necessary for entering into, or performance of, a contract between the data subject and a controller” [2]. Under this exemption, a company can profile users for fraud prevention without their consent, using individual data to identify malicious actors on its platform [3]. “Thousands of ‘non-traditional’ third party data sources” [4] feed the data pipelines of machine learning algorithms that assess the risk each user poses. This kind of automated fraud detection [5] has led to users being removed from platforms [6] with no recourse for disputing the allegations.
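The opacity at issue can be made concrete with a minimal sketch of such a risk pipeline. Everything below is hypothetical: the signal names, weights, and threshold are illustrative stand-ins, not drawn from any real platform's system.

```python
# Hypothetical sketch of an automated fraud-risk pipeline of the kind
# described above. All identifiers and numbers are illustrative.

def merge_signals(user_record, third_party_sources):
    """Combine first-party data with 'non-traditional' third-party signals,
    keyed by user_id. The user rarely sees or consents to each source."""
    signals = dict(user_record)
    for source in third_party_sources:
        signals.update(source.get(signals.get("user_id"), {}))
    return signals

def risk_score(signals, weights):
    """A simple weighted sum standing in for a trained risk model."""
    return sum(weights.get(name, 0.0) * float(value)
               for name, value in signals.items()
               if isinstance(value, (int, float)))

def decide(signals, weights, threshold=0.8):
    """Accounts scoring above the threshold are suspended automatically;
    no explanation or appeal path is returned to the user."""
    return "suspend" if risk_score(signals, weights) >= threshold else "allow"
```

The point of the sketch is that the decision depends on signals and weights the data subject never sees, so a suspension cannot be meaningfully disputed.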

Second, Article 22 offers no guidelines on what user consent should look like. Google, for example, has asked ad publishers to handle the user consent process on its behalf [7], which multiplies the number of user agreements, from different companies, that each user must navigate. When data subjects are offered misleading or convoluted user agreements, it becomes harder for third-party regulatory groups to assess how personal data is handled. There is an intentional gap between the technical literacy users have and the literacy needed to interpret the full data pipeline they are consenting to [8]. Automated decision-making systems incentivize non-transparent consent, since companies benefit from collecting more user data.

Third, GDPR targets companies offering goods and services. Even when an organization legally collects and processes data on EU citizens, the same personal data can be sold to another company for a purpose unrelated to offering goods and services [9]. The Facebook-Cambridge Analytica scandal is one example of such a harmful data chain [10]. Article 22 gives users no mechanism to illuminate data chains that begin with companies targeting EU citizens, so algorithmic decisions affecting EU citizens can ultimately be made in the dark.

• • •

One counterargument is that if companies disclosed their indicators of fraud, clarified user agreements, or gave up data chains, they would leave themselves vulnerable to exploitation and deliver a worse service. However, public scrutiny is necessary to prevent opaque, self-reinforcing decisions that reflect the latent discrimination embedded in society [11]. Future versions of GDPR should better support algorithmic accountability by more fully requiring informed user consent around the handling of user data.



  1. Binns, R., 2017. Algorithmic accountability and public reason. Philosophy & Technology, pp.1–14.

  2. Parliament and Council of the European Union (2016). General Data Protection Regulation.

  3. Burt, A. (2018, May 16). How will the GDPR impact machine learning? Retrieved February 18, 2019.

  4. Robinson, D., Yu, H., and Rieke, A. (2014). Civil Rights, Big Data, and Our Algorithmic Future. Social Justice and Technology.

  5. Detecting and Mitigating Fraud in Realtime at Airbnb, Eric Levine, CodeConf 2015. (2017, May 26). Retrieved February 28, 2019.

  6. Kampen, K. (2015, August 20). AirBNB why did you terminate my account? — An Open Letter to AirBNB. Retrieved February 18, 2019.

  7. To comply with GDPR, Google asks publishers to manage user-data consent for ad targeting in EU. (2018, March 26). Retrieved February 18, 2019.

  8. Youmans, W.L. and York, J.C., 2012. Social media and the activist toolkit: User agreements, corporate interests, and the information infrastructure of modern social movements. Journal of Communication, 62(2), pp.315–329.

  9. Loopholes in the General Data Protection Regulation. (2018, June 4). Retrieved February 28, 2019.

  10. Cadwalladr, C. and Graham-Harrison, E., 2018. The Cambridge Analytica Files. The Guardian, 21, pp.6–7.

  11. O’Neil, C., 2017. Weapons of math destruction: How big data increases inequality and threatens democracy. Broadway Books.
