Automated Individual Decision-Making And Profiling
ARTICLE 29 DATA PROTECTION WORKING PARTY
3 October 2017 17/EN WP 251
The General Data Protection Regulation (the GDPR) specifically addresses profiling and automated individual decision-making, including decision-making based on profiling.
Profiling and automated decision-making are used in an increasing number of sectors, both private and public. Banking and finance, healthcare, taxation, insurance, marketing and advertising are just a few examples of the fields where profiling is being carried out more regularly to aid decision-making.
Advances in technology and the capabilities of big data analytics, artificial intelligence and machine learning have made it easier to create profiles and make automated decisions with the potential to significantly impact individuals’ rights and freedoms.
The widespread availability of personal data on the internet and from Internet of Things (IoT) devices, and the ability to find correlations and create links, can allow aspects of an individual’s personality or behaviour, interests and habits to be determined, analysed and predicted.
Profiling and automated decision-making can be useful for individuals and organisations, as well as for the economy and society as a whole, delivering benefits such as increased efficiency and resource savings.
They have many commercial applications: for example, they can be used to segment markets more precisely and tailor services and products to individual needs. Medicine, education, healthcare and transportation can also all benefit from these processes.
However, profiling and automated decision-making can pose significant risks for individuals’ rights and freedoms which require appropriate safeguards.
These processes can be opaque. Individuals might not know that they are being profiled or understand what is involved.
Profiling can perpetuate existing stereotypes and social segregation. It can also lock a person into a specific category and restrict them to their suggested preferences. This can undermine their freedom to choose, for example, certain products or services such as books, music or newsfeeds. It can lead to inaccurate predictions, denial of services and goods and unjustified discrimination in some cases.
The GDPR introduces new provisions to address the risks that profiling and automated decision-making pose to individuals' rights, notably, but not limited to, privacy. The purpose of these guidelines is to clarify those provisions.
This document covers:
- Definitions of profiling and automated decision-making and the GDPR approach to these in general – Chapter II
- Specific provisions on automated decision-making as defined in Article 22 – Chapter III
- General provisions on profiling and automated decision-making – Chapter IV
- Children and profiling – Chapter V
- Data protection impact assessments – Chapter VI
The Annexes provide best practice recommendations, building on the experience gained in EU Member States.
The full guidelines are available at the link below: