Intricacies of elaborating a data protection impact assessment

Constantin Crețu

With the advent of new technologies and the increasing complexity of processing, controllers must address the resulting risks by examining the likely impact of the envisaged processing before the processing operation begins. This allows organizations to identify, address and mitigate risks in advance, significantly limiting the likelihood of a negative impact on individuals as a result of the processing.

In accordance with the General Data Protection Regulation (GDPR) and the applicable ICO Guidelines, DPIAs are required in a number of high-risk data processing situations, including the following (a minimal screening sketch appears after the list):

– Implementation of innovative technology;
– Biometric data processing;
– Processing of any special category of personal data on a large scale; or
– Any automated decision-making or profiling that significantly affects an individual, for example by granting or denying access to a service.
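Read operationally, this trigger list amounts to a simple screening rule: if any of the situations applies, a DPIA should be carried out before the processing starts. The following Python sketch illustrates that reading; the ProcessingActivity structure and its field names are illustrative assumptions, not terms defined by the GDPR or the ICO.

```python
from dataclasses import dataclass


@dataclass
class ProcessingActivity:
    """Illustrative flags mirroring the ICO trigger list above."""
    uses_innovative_technology: bool = False
    processes_biometric_data: bool = False
    large_scale_special_categories: bool = False
    automated_decision_with_significant_effect: bool = False


def dpia_required(activity: ProcessingActivity) -> bool:
    """A DPIA is needed if any of the listed high-risk situations applies."""
    return any((
        activity.uses_innovative_technology,
        activity.processes_biometric_data,
        activity.large_scale_special_categories,
        activity.automated_decision_with_significant_effect,
    ))
```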

The European concept

The GDPR does not formally define the DPIA as such, but its minimum content is specified in Article 35(7); the same requirements also appear in Law No. 133 of 2011 of the Republic of Moldova, as follows:

a) A systematic description of the intended processing operations and the purposes of the processing, including, where appropriate, the legitimate interest pursued by the controller;
b) An assessment of the necessity and proportionality of the processing operations in relation to these purposes;
c) An assessment of the risks to the rights and freedoms of the data subjects referred to in para. (1); and
d) The measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation, taking into account the rights and legitimate interests of data subjects and other persons concerned. As regards its meaning and role, Recital 84 [10] clarifies it as follows: "In order to enhance compliance with this Regulation where processing operations are likely to result in a high risk to the rights and freedoms of natural persons, the controller should be responsible for the carrying out of a data protection impact assessment to evaluate, in particular, the origin, nature, particularity and severity of that risk."
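Viewed as a record, the four minimum elements of Article 35(7) listed above map naturally onto a simple data structure, which can help keep a DPIA template complete. The sketch below is only an illustration of that mapping; the class and field names are assumptions with no legal significance.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class DPIAMinimumContent:
    """Illustrative container for the Article 35(7) minimum content of a DPIA."""
    # (a) systematic description of the envisaged processing and its purposes,
    #     including, where applicable, the legitimate interest pursued
    systematic_description: str = ""
    purposes: List[str] = field(default_factory=list)
    legitimate_interest: Optional[str] = None
    # (b) assessment of the necessity and proportionality of the processing
    necessity_and_proportionality: str = ""
    # (c) assessment of the risks to the rights and freedoms of data subjects
    risks_to_data_subjects: List[str] = field(default_factory=list)
    # (d) measures envisaged to address the risks, including safeguards,
    #     security measures and mechanisms to demonstrate compliance
    mitigation_measures: List[str] = field(default_factory=list)
```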

The English concept

In particular, the ICO confirmed that the DPIA should:

– Be completed or updated before each deployment of the technology;
– Demonstrate clearly and comprehensively why technology is strictly necessary and why a less intrusive option is not available;
– Include a clear assessment of the likelihood that the objectives of using the technology will be met and how effectiveness can be measured;
– Explain how effective mitigation measures have been implemented; and
– Be subject to continuous review, including any changes in the circumstances regarding the processing of the data or the nature of the risks that apply.

This leads to the following sequence of observations:

1. The GDPR, together with Law No. 133 of 2011, itself contains the "minimum" criteria for carrying out an assessment of the impact on personal data.

2. Subsequently, the Article 29 Working Party guidance (16/EN WP 243) states that, in the case of certain high-risk processing operations, the controller should be responsible for carrying out a data protection impact assessment to evaluate, in particular, the origin, nature, particularity and severity of that risk.

3. We note that the notions mentioned above are fairly general in character, and what is not forbidden is allowed. This leaves room for a broader assessment of personal data protection.

4. A personal data breach may, if not addressed in an appropriate and timely manner, result in physical, material or non-material damage to individuals, such as loss of control over their personal data or limitation of their rights, discrimination, identity theft or fraud, financial loss, unauthorised reversal of pseudonymisation, damage to reputation, loss of confidentiality of personal data protected by professional secrecy, or any other significant economic or social disadvantage to the individual concerned. Therefore, as soon as the controller becomes aware that a personal data breach has occurred, the controller should notify the supervisory authority of the breach without undue delay and, where feasible, not later than 72 hours after having become aware of it, unless the controller is able to demonstrate, in accordance with the accountability principle, that the breach is unlikely to result in a risk to the rights and freedoms of individuals.
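The 72-hour window in point 4 runs from the moment the controller becomes aware of the breach. A minimal sketch of computing that notification deadline, assuming the moment of awareness is recorded as a UTC timestamp:

```python
from datetime import datetime, timedelta, timezone

# Article 33(1) GDPR: notification "not later than 72 hours" after awareness
NOTIFICATION_WINDOW = timedelta(hours=72)


def notification_deadline(became_aware_at: datetime) -> datetime:
    """Latest moment to notify the supervisory authority, unless the controller
    can demonstrate that the breach is unlikely to result in a risk to the
    rights and freedoms of individuals (any delay must be justified)."""
    return became_aware_at + NOTIFICATION_WINDOW


# Example: awareness at 2024-01-15 09:30 UTC -> deadline 2024-01-18 09:30 UTC
deadline = notification_deadline(datetime(2024, 1, 15, 9, 30, tzinfo=timezone.utc))
```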

5. The DPIA under the GDPR is a tool for managing risks to the rights of data subjects and therefore takes their perspective, as is the case in certain fields (e.g. societal security). By contrast, risk management in other fields (e.g. information security) focuses on the organization.

The GDPR gives controllers the flexibility to determine the exact structure and form of the DPIA, so that it can fit existing working practices. There are a number of different processes in place in the EU and around the world that take into account the components described in Recital 90. However, regardless of its form, the DPIA must be a genuine risk assessment, enabling controllers to take measures to mitigate the risks identified.

6. In other words, the more complex the technology or system, the more extensive the impact assessment work required to demonstrate compliance with the GDPR principles and other international instruments. At this point, nine criteria must be considered [14] (a screening sketch follows the list):

1. Evaluation or scoring;
2. Automated decision-making with legal or similarly significant effects;
3. Systematic monitoring;
4. Sensitive data or data of a very personal nature;
5. Data processed on a large scale;
6. Matching or combining data sets;
7. Data on vulnerable data subjects;
8. Innovative use or implementation of new technological or organizational solutions;
9. Where the processing itself prevents data subjects from exercising a right or using a service or a contract.
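The cited Guidelines [14] treat these nine criteria as a screening list: as a rule of thumb, processing that meets two or more of them will in most cases be "likely to result in a high risk" and therefore require a DPIA, although a single criterion can be sufficient in some cases. A minimal Python sketch of that counting rule (the criterion identifiers are shortened for readability and are not official labels):

```python
CRITERIA = frozenset({
    "evaluation_or_scoring",
    "automated_decision_with_significant_effect",
    "systematic_monitoring",
    "sensitive_or_highly_personal_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_exercise_of_right_or_service",
})


def dpia_likely_required(met_criteria: set, threshold: int = 2) -> bool:
    """Rule of thumb from the DPIA Guidelines: meeting two or more criteria
    usually means the processing is likely to result in a high risk."""
    unknown = met_criteria - CRITERIA
    if unknown:
        raise ValueError(f"Unknown criteria: {sorted(unknown)}")
    return len(met_criteria) >= threshold


# Example: large-scale, systematic monitoring of a publicly accessible area
print(dpia_likely_required({"systematic_monitoring", "large_scale_processing"}))  # True
```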

7. A data protection impact assessment should also be carried out where personal data are processed for taking decisions regarding specific individuals following any systematic and extensive evaluation of personal aspects relating to individuals based on profiling those data, or following the processing of special categories of personal data, biometric data, or data on criminal convictions and offences or related security measures.

8. A data protection impact assessment is also required for the monitoring of publicly accessible areas, in particular when using optic-electronic devices, or for any other operations where the competent supervisory authority considers that the processing is likely to pose a high risk to the rights and freedoms of data subjects, in particular because it prevents data subjects from exercising a right or using a service or a contract, or because it is carried out systematically on a large scale.

9. There are circumstances in which it may be reasonable and economical for the subject of a data protection impact assessment to be broader than a single project, for example where public authorities or bodies intend to establish a common application or processing platform, or when several controllers intend to introduce a common application or processing environment in a sector or industry segment or for a widely used horizontal activity.

10. If a data protection impact assessment indicates that, in the absence of safeguards, security measures and risk mitigation mechanisms, the processing would result in a high risk to the rights and freedoms of individuals, and the controller is of the opinion that the risk cannot be mitigated by reasonable means in terms of available technologies and implementation costs, the supervisory authority should be consulted before the processing activities begin. Such a high risk is likely to result from certain types of processing and from the extent and frequency of processing, which may also lead to damage or interference with the rights and freedoms of the individual.

11. When a DPIA is carried out at the stage of drafting legislation providing a legal basis for processing, a review will likely be needed before operations start, as the legislation adopted may differ from the proposal in ways that affect privacy and data protection. In addition, sufficient technical detail about the actual processing may not be available at the time the legislation is adopted, even if it is accompanied by a DPIA. In such cases, it may be necessary to perform a specific DPIA before carrying out the actual processing activities.

12. Innovative use or implementation of new technological or organizational solutions, such as combining the use of fingerprints with facial recognition to improve physical access control, etc. The GDPR clarifies (Article 35(1) and Recitals 89 and 91) that the use of a new technology, defined in accordance with the "achieved state of technological knowledge" (Recital 91), may trigger the need for a DPIA. This is because the use of such technology may involve new forms of data collection and use, possibly with a high risk to the rights and freedoms of individuals. Indeed, the personal and social consequences of deploying a new technology may be unknown. A DPIA will help the controller to understand and address such risks. For example, certain Internet of Things applications could have a significant impact on the daily lives and privacy of individuals and therefore require a DPIA.

13. A systematic description of the processing is provided (Article 35(7)(a)) [1][2]:
– nature, scope, context and purposes of the processing are taken into account (recital 90);
– personal data, recipients and period for which the personal data will be stored are recorded;
– a functional description of the processing operation is provided;
– the assets on which personal data rely (hardware, software, networks, people, paper or paper transmission channels) are identified;
– compliance with approved codes of conduct is taken into account (Article 35(8)).

14. Necessity and proportionality are assessed (Article 35(7)(b)):
– controls envisaged to comply with the Regulation are determined (Article 35(7)(d) and recital 90), taking into account:
– controls contributing to the proportionality and the necessity of the processing on the basis of:
– specified, explicit and legitimate purpose(s) (Article 5(1)(b));
– lawfulness of processing (Article 6);
– adequate, relevant and limited to what is necessary data (Article 5(1)(c));
– limited storage duration (Article 5(1)(e));
– controls contributing to the rights of the data subjects:
– information provided to the data subject (Articles 12, 13 and 14);
– rights of access and portability (Articles 15 and 20);
– rights to rectify and erase (Articles 16 and 17);
– rights to object and to restriction of processing (Articles 18 and 21);
– processors (Article 28);
– safeguards surrounding international transfers (Chapter V).

15. Risks to the rights and freedoms of data subjects are managed (Article 35(7)(c)):
– origin, nature, particularity and severity of the risks are assessed (see recital 84) or, more specifically, for each risk (illegitimate access, unwanted change and disappearance of data), from the perspective of the data subjects:
– risk sources are taken into account (recital 90);
– potential impacts to the rights and freedoms of data subjects are identified in case of illegitimate access, unwanted change and disappearance of data;
– threats that could lead to illegitimate access, unwanted change and disappearance of data are identified;
– likelihood and severity are estimated (recital 90);
– controls envisaged to address those risks are determined (Article 35(7)(d) and recital 90), as illustrated in the sketch below.
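In practice, point 15 comes down to estimating, for each feared event (illegitimate access, unwanted change, disappearance of data), a likelihood and a severity, and then checking whether the controls envisaged bring the residual risk to an acceptable level. The sketch below assumes a four-level scale in the spirit of the CNIL PIA methodology cited in the bibliography; the scale labels and the acceptance rule are illustrative assumptions, not prescribed thresholds.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import List


class Level(IntEnum):
    NEGLIGIBLE = 1
    LIMITED = 2
    SIGNIFICANT = 3
    MAXIMUM = 4


@dataclass
class Risk:
    feared_event: str      # e.g. "illegitimate access to data"
    severity: Level        # residual impact on data subjects, controls included
    likelihood: Level      # residual likelihood, given threats and risk sources
    controls: List[str]    # measures envisaged to address the risk

    def is_high(self) -> bool:
        """Illustrative rule: residual risk is treated as high when both
        severity and likelihood remain at 'significant' or above."""
        return self.severity >= Level.SIGNIFICANT and self.likelihood >= Level.SIGNIFICANT


risks = [
    Risk("illegitimate access to data", Level.SIGNIFICANT, Level.NEGLIGIBLE,
         ["encryption at rest", "access control and logging"]),
    Risk("unwanted change of data", Level.LIMITED, Level.NEGLIGIBLE,
         ["integrity checks", "backups"]),
    Risk("disappearance of data", Level.LIMITED, Level.LIMITED,
         ["backups", "redundant storage"]),
]

# False here: no feared event stays high on both dimensions once controls apply
needs_prior_consultation = any(r.is_high() for r in risks)
```

If any residual risk remains high despite the controls envisaged, prior consultation of the supervisory authority (Article 36) should be considered, as noted in point 10 above.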

Case Law:

The United Kingdom case of R (Bridges) v Chief Constable of South Wales Police and others [2019] EWHC 2341 [2][14] (Bridges) reached a similar conclusion. In this case, the use of facial recognition technology (FRT) by police forces at public events to identify persons on police watch lists was held to be lawful. The basis for this decision was that there was a clear legal framework for the use of such technologies by the police, and the processing of personal data was considered proportionate. Although this decision was largely based on the specific role of the police as a public authority, some of the points discussed by the court are relevant to private sector organizations seeking to use such technologies.

In May 2019, the Danish DPA ("Datatilsynet") granted the private company Brøndby IF [14] permission to use FRT to identify individuals. The specific use case was to identify fans who had been denied access to a football stadium, to ensure that they did not enter it. The main reasoning behind this decision was that the technology provided an "essential public benefit".

Key points to remember when completing a DPIA

Taking these various cases and enforcement actions together, it would appear that, at this stage, there are four key areas of attention for courts and DPAs in Europe when examining FRT cases, namely:

– Transparency: clear notices about the use of FRT should be made available to the data subjects whose personal data will be collected and processed. In Bridges, a three-pronged approach of posting on social media, displaying large notices on police vehicles, and distributing cards to members of the public was considered sufficient.
– Proportionality: FRT should not be used if the goal can be achieved by less invasive means, as the French and Swedish DPAs considered when such technologies were used to control student access and enrollment.
– Data minimization: unnecessary personal data should not be stored and should be deleted as soon as possible. In both Bridges and the Danish DPA decision, the immediate deletion of images of data subjects who were not on the relevant watch list was essential to approving the use of FRT.
– Security: where biometric data are processed, security measures will need to be stricter, including features such as encryption, two-factor authentication and preventing access to the data over the Internet (see the sketch below).
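On the security point, encrypting stored biometric templates is one of the stricter measures referred to above. The sketch below uses the Python cryptography package; the choice of Fernet symmetric encryption and the in-memory key handling are illustrative assumptions, and a real deployment would also require proper key management, access controls and two-factor authentication.

```python
from cryptography.fernet import Fernet

# In practice the key is generated once and kept in a dedicated key-management
# system, never stored alongside the encrypted data.
key = Fernet.generate_key()
fernet = Fernet(key)

biometric_template = b"...raw facial-recognition template bytes..."
encrypted = fernet.encrypt(biometric_template)   # only this ciphertext is stored at rest
decrypted = fernet.decrypt(encrypted)            # possible only with access to the key
assert decrypted == biometric_template
```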

Bibliography
https://gdpr-info.eu/
https://www.cnil.fr/en/PIA-privacy-impact-assessment-en
https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/cipl_gdpr_project_risk_white_paper_21_december_2016.pdf
Guide on Data Protection Impact Assessment (DPIA) and determining whether processing is "likely to result in a high risk" within the meaning of Regulation 2016/679 (Romanian-language edition)
https://juridicemoldova.md/12583/lista-tipurilor-de-operatiuni-de-prelucrare-a-datelor-personale-care-fac-obiectul-dpia-supusa-spre-consultari-publice.html
Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679
Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679
https://www.simmons-simmons.com/en/publications/ck0bcc809o1jf0b85opoqi6l2/271118-french-data-protection-impact-assessment
https://www.huntonprivacyblog.com/2018/10/08/edpb-adopts-opinions-national-dpia-lists-eu/
Risk, High Risk, Risk Assessments and Data Protection Impact Assessments under the GDPR, CIPL GDPR Interpretation and Implementation Project, 21 December 2016
https://gdpr-info.eu/issues/privacy-impact-assessment/
ISO/IEC 29134:2017 – Guidelines for privacy impact assessment. Available at: https://www.iso.org/obp/ui/#iso:std:iso-iec:29134:ed-1:v1:en
IAPP – A Process for Data Protection Impact Assessment under the European General Data Protection Regulation. Available at: https://iapp.org/media/pdf/resource_center/Springer-DPIA-whitepaper.pdf
Bristows – Lessons learned: completing a DPIA for an AI use case. Available at: https://www.bristows.com/news/lessons-learned-completing-a-dpia-for-an-ai-use-case/#_ftn3