Union law on the protection of personal data, privacy and the confidentiality of communications applies to personal data processed in connection with the rights and obligations laid down in this Regulation. This Regulation shall not affect Regulations (EU) 2016/679 and (EU) 2018/1725 and Directives 2002/58/EC and (EU) 2016/680, without prejudice to arrangements provided for in Article 10(5) and Article 54 of this Regulation. (Article 2)
This provision indicates that the AI Act recognizes existing personal data protection laws, such as the GDPR, and leaves them unaffected. Because the two regimes apply in parallel, the AI Act may impose considerations and requirements beyond GDPR compliance.
High-risk AI systems which make use of techniques involving the training of models with data shall be developed on the basis of training, validation and testing data sets that meet the quality criteria referred to in paragraphs 2 to 5 as far as this is technically feasible according to the specific market segment or scope of application. (Article 10)
This section of the AI Act puts forward specific requirements for high-risk AI systems, especially related to data handling and model training. It implies that general GDPR compliance might not be sufficient to meet all the requirements under the AI Act.
To the extent that it is strictly necessary for the purposes of ensuring negative bias detection and correction in relation to the high-risk AI systems, the providers of such systems may exceptionally process special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679, Article 10 of Directive (EU) 2016/680 and Article 10(1) of Regulation (EU) 2018/1725, subject to appropriate safeguards for the fundamental rights and freedoms of natural persons, including technical limitations on the re-use and use of state-of-the-art security and privacy-preserving measures. (Article 10)
This part of Article 10 permits providers to process special categories of personal data under narrowly defined conditions, subject to additional safeguards. Meeting those conditions and safeguards involves obligations that go beyond the scope of the GDPR.
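To illustrate what such safeguarded processing might look like in practice, the following is a minimal sketch, not prescribed by the AI Act, in which a sensitive attribute is used only to compute a disparity metric for bias detection and is kept out of the model's feature set. The column names, the binary-prediction setup, and the deletion step are illustrative assumptions.

```python
import pandas as pd

def selection_rate_disparity(predictions: pd.Series,
                             sensitive_attribute: pd.Series) -> dict:
    """Compute per-group positive-prediction rates for bias detection only.

    The sensitive attribute (special-category personal data) is used solely
    inside this check and is never merged into the model's feature set.
    """
    rates = predictions.groupby(sensitive_attribute).mean()
    return {
        "per_group_rate": rates.to_dict(),
        # Ratio of the least- to most-favoured group (1.0 means parity).
        "disparity_ratio": float(rates.min() / rates.max()),
    }

# Hypothetical usage: y_pred holds 0/1 model outputs; df["ethnicity"] is the
# special-category column, processed only for this check and then deleted.
# report = selection_rate_disparity(y_pred, df["ethnicity"])
# df = df.drop(columns=["ethnicity"])  # limit re-use after the check
```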
The fundamental right to the protection of personal data is safeguarded in particular by Regulations (EU) 2016/679 and (EU) 2018/1725 and Directive 2016/680. Directive 2002/58/EC additionally protects private life and the confidentiality of communications, including providing conditions for any personal and non-personal data storing in and access from terminal equipment. (Recital 2b)
This part of the AI Act indicates that it does not intend to replace existing data protection regulations, but it does not state that GDPR compliance alone can fulfill all requirements under the AI Act.
In the AI regulatory sandbox personal data lawfully collected for other purposes may be processed solely for the purposes of developing and testing certain AI systems in the sandbox when all of the following conditions are met… (Article 54)
This article sets out additional safeguards and obligations, such as specific conditions for processing personal data in a regulatory sandbox environment. While these requirements intersect with the GDPR, the AI Act introduces additional measures beyond it.
(aa) the nature of data likely or intended to be processed by the system and, in the case of personal data, the categories of natural persons and groups likely or intended to be affected; (Annex IV)
The AI Act requires documenting the nature and categories of data an AI system processes and, for personal data, the natural persons and groups likely to be affected. This system-level documentation obligation is not directly imposed by the GDPR.
- A summary of the data protection impact assessment carried out in accordance with Article 35 of Regulation (EU) 2016/679 or Article 27 of Directive (EU) 2016/680 as specified in paragraph 6 of Article 29 of this Regulation, where applicable. (Annex VIII, Section B)
While the GDPR requires data protection impact assessments, the AI Act additionally requires a summary of these assessments as part of the registration information for high-risk AI systems. This adds obligations beyond the original GDPR requirements.
In conclusion, while GDPR compliance is an important requirement for handling personal data for AI systems under the AI Act, additional requirements exist that need to be met. The AI Act establishes unique rules, particularly for high-risk AI systems. As such, general GDPR compliance is not enough to fulfill all obligations under the AI Act. Entities that fall under the AI Act must ensure adherence to provisions from both GDPR and the AI Act.
Union law on the protection of personal data, privacy and the confidentiality of communications applies to personal data processed in connection with the rights and obligations laid down in this Regulation. This Regulation shall not affect Regulations (EU) 2016/679 and (EU) 2018/1725 and Directives 2002/58/EC and (EU) 2016/680, without prejudice to arrangements provided for in Article 10(5) and Article 54 of this Regulation. (Article 2)
Under this article, the General Data Protection Regulation (GDPR) is specifically referenced, affirming that the AI Act does not supersede it. Instead, the AI Act supplements the GDPR, meaning that compliance with the GDPR alone is not enough to fulfill all requirements of the AI Act.
Training datasets, and where they are used, validation and testing datasets, including the labels, shall be relevant, sufficiently representative, appropriately vetted for errors and be as complete as possible in view of the intended purpose. They shall have the appropriate statistical properties, including, where applicable, as regards the persons or groups of persons in relation to whom the high-risk AI system is intended to be used. (Article 10)
Article 10 of the AI Act builds on general GDPR concepts such as data minimization, accuracy, and relevance by specifying more detailed duties for high-risk AI systems. In this context it is clear that GDPR compliance alone would not meet these AI-specific data governance requirements.
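As an illustration of the kind of checks a provider might run against these criteria, here is a minimal sketch under our own assumptions (a pandas DataFrame, a binary label column, and a single column identifying the groups the system is intended to be used on); the AI Act itself does not prescribe any particular tooling.

```python
import pandas as pd

def dataset_quality_report(df: pd.DataFrame,
                           label_col: str,
                           group_col: str,
                           expected_labels=(0, 1)) -> dict:
    """Illustrative checks loosely mapped to Article 10(3): completeness,
    error vetting of labels, and representation of affected groups."""
    return {
        # Completeness: share of missing values per column.
        "missing_share": df.isna().mean().to_dict(),
        # Error vetting: labels outside the expected set are suspect.
        "unexpected_labels": sorted(set(df[label_col].dropna().unique())
                                    - set(expected_labels)),
        # Representativeness: share of each group in the training data,
        # to be compared against the population the system will be used on.
        "group_shares": df[group_col].value_counts(normalize=True).to_dict(),
    }
```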
Providers of high-risk AI systems shall ensure that their high-risk AI systems are compliant with the requirements set out in Chapter 2 of this Title before placing them on the market or putting them into service; ensure that natural persons to whom human oversight of high-risk AI systems is assigned are specifically made aware of the risk of automation or confirmation bias; ensure that the high-risk AI system undergoes the relevant conformity assessment procedure, prior to its placing on the market or putting into service, in accordance with Article 43; comply with the registration obligations referred to in Article 51; (Article 16)
Article 16 outlines that providers of high-risk AI systems have responsibilities that extend beyond the scope of the GDPR. If a provider solely complies with GDPR, they would not automatically meet the obligations in this Article, demonstrating again that GDPR compliance isn’t sufficient to fulfill all obligations under the AI Act.
In conclusion, while a solid GDPR compliance strategy is a crucial foundation for achieving compliance with the AI Act, the AI Act imposes additional obligations specific to AI systems that would not be covered solely by adherence to the GDPR. Consequently, general GDPR compliance is not enough to fulfill obligations under the AI Act.
Union law on the protection of personal data, privacy and the confidentiality of communications applies to personal data processed in connection with the rights and obligations laid down in this Regulation. This Regulation shall not affect Regulations (EU) 2016/679 and (EU) 2018/1725 and Directives 2002/58/EC and (EU) 2016/680, without prejudice to arrangements provided for in Article 10(5) and Article 54 of this Regulation. (Article 2)
This passage from Article 2 highlights that while the AI Act regulates artificial intelligence systems, GDPR compliance (Regulation (EU) 2016/679) is still expected wherever personal data is processed. Strong alignment with GDPR principles therefore remains obligatory, yet the AI Act carries additional obligations that the GDPR does not cover.
This Regulation lays down: (a) harmonised rules for the placing on the market, the putting into service and the use of artificial intelligence systems (‘AI systems’) in the Union; (b) prohibitions of certain artificial intelligence practices; (c) specific requirements for high-risk AI systems and obligations for operators of such systems; (d) harmonised transparency rules for certain AI systems; (e) rules on market monitoring, market surveillance governance and enforcement; (Article 1)
Article 1 shows that the AI Act provides a set of specific rules on the use, marketing, and putting into service of AI systems. It prohibits certain AI practices, lays down distinct requirements for high-risk AI systems, establishes harmonized transparency rules for certain AI systems, and sets rules for market monitoring and surveillance. These are responsibilities and obligations that go beyond what the GDPR requires.
The fundamental right to the protection of personal data is safeguarded in particular by Regulations (EU) 2016/679 and (EU) 2018/1725 and Directive 2016/680. (Recital 2b)
This part of the recital makes clear that the protection of personal data in the context of AI is already addressed by existing Union data protection law, namely Regulation (EU) 2016/679 (the "General Data Protection Regulation" or "GDPR"), Regulation (EU) 2018/1725, and Directive (EU) 2016/680.
This Regulation does not seek to affect the application of existing Union law governing the processing of personal data, including the tasks and powers of the independent supervisory authorities competent to monitor compliance with those instruments. (Recital 2b)
Here the recital implies that the AI Act does not interfere with existing GDPR obligations. The AI Act and the GDPR apply in parallel, and adhering to the GDPR alone does not automatically translate into compliance with the AI Act.
A general description of the AI system including: (aa) the nature of data likely or intended to be processed by the system and, in the case of personal data, the categories of natural persons and groups likely or intended to be affected; (Annex IV, point 1(aa))
This point shows that the AI Act has specific requirements for documenting the types of data an AI system will process, including the categories of data and, where personal data is involved, the persons and groups affected. This goes beyond the GDPR by requiring documentation of how data is used by the system rather than merely how it is protected.
where relevant, the data requirements in terms of datasheets describing the training methodologies and techniques and the training data sets used, including information about the provenance of those data sets, their scope and main characteristics; how the data was obtained and selected; labelling procedures (e.g. for supervised learning), data cleaning methodologies (e.g. outliers detection); (Annex IV, point 2(d))
This point addresses the development process, specifying that information about the data used to train the AI system, including its provenance, selection, and preparation procedures, must be documented. This falls outside general GDPR compliance, as the GDPR primarily concerns the protection and privacy of personal data rather than the documentation of how datasets are built and used.
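By way of illustration only, a provider might capture this information in a structured "datasheet" record alongside the technical documentation. The field names below are our own shorthand for the items listed in Annex IV, point 2(d), not terms from the Act, and the example values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class TrainingDatasetSheet:
    """Illustrative record of the dataset information referred to in
    Annex IV, point 2(d); field names are our own, not the Act's."""
    name: str
    provenance: str            # where the data set comes from
    scope: str                 # what it covers and its main characteristics
    collection_method: str     # how the data was obtained and selected
    labelling_procedure: str   # e.g. annotation guidelines for supervised learning
    cleaning_methodology: str  # e.g. outlier detection and removal rules
    known_gaps: list = field(default_factory=list)

# Hypothetical example entry for a provider's technical documentation:
sheet = TrainingDatasetSheet(
    name="loan-applications-2022",
    provenance="internal CRM export covering 2019-2022",
    scope="retail credit applications from EU customers",
    collection_method="random sample of closed applications",
    labelling_procedure="repayment outcome taken from accounting records",
    cleaning_methodology="IQR-based outlier removal on income fields",
    known_gaps=["applicants under 25 are underrepresented"],
)
```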
In conclusion, while GDPR compliance is necessary, it does not cover the entirety of the obligations under the AI Act due to the specific rules and standards that govern the deployment and usage of AI systems. The AI Act therefore creates additional layers of obligations that must be complied with in conjunction with GDPR.
The purpose of this Regulation is to promote the uptake of human-centric and trustworthy artificial intelligence and to ensure a high level of protection of health, safety, fundamental rights, democracy and the rule of law, and the environment from harmful effects of artificial intelligence systems in the Union while supporting innovation; (Article 1)
This statement from Article 1 makes clear that the AI Act aims for more than data protection. It covers health, safety, fundamental rights, democracy, the rule of law, and the environment, which implies that GDPR compliance alone might not fulfill all requirements of the AI Act.
Union law on the protection of personal data, privacy and the confidentiality of communications applies to personal data processed in connection with the rights and obligations laid down in this Regulation. This Regulation shall not affect Regulations (EU) 2016/679 and (EU) 2018/1725 and Directives 2002/58/EC and (EU) 2016/680, without prejudice to arrangements provided for in Article 10(5) and Article 54 of this Regulation. (Article 2)
Article 2 reinforces that while the AI Act leaves existing data protection law, including the GDPR, fully applicable, its own scope is not limited to those provisions. Entities compliant with the GDPR must therefore look beyond it and fulfill the AI Act's additional obligations.
High-risk AI systems shall be designed and developed following the principle of security by design and by default. In the light of their intended purpose, they should achieve an appropriate level of accuracy, robustness, safety, and cybersecurity, and perform consistently in those respects throughout their lifecycle. (Article 15)
By emphasizing robustness, safety, accuracy, and cybersecurity requirements for high-risk AI systems, Article 15 identifies areas that the GDPR does not typically cover. This further illustrates that general GDPR compliance does not address the full set of obligations in the AI Act.
Prior to putting a high-risk AI system as defined in Article 6(2) into use, deployers shall conduct an assessment of the systems’ impact in the specific context of use. (Article 29a)
Article 29a obliges deployers to assess a high-risk AI system's impact in its specific context of use before putting the system into use. This assessment goes beyond the data protection impact assessment required by the GDPR, confirming that the AI Act imposes requirements that GDPR compliance does not address.
This Regulation does not seek to affect the application of existing Union law governing the processing of personal data, including the tasks and powers of the independent supervisory authorities competent to monitor compliance with those instruments. (Recital 2b)
Recital 2b demonstrates that the AI Act does not intend to substitute or override existing data protection laws such as the GDPR. Instead, it is meant to operate alongside them, which implies that compliance with the GDPR alone might not fulfill all obligations under the AI Act.
This Regulation is also without prejudice to the rules laid down by other Union legal acts related to consumer protection and product safety. (Recital 2c)
Recital 2c indicates that the AI Act is intended to coexist with other laws on consumer protection and product safety, not to replace or undermine them. This further signals that obligations under the AI Act extend beyond general GDPR compliance.
(aa) the nature of data likely or intended to be processed by the system and, in the case of personal data, the categories of natural persons and groups likely or intended to be affected; (Annex IV)
Annex IV sets documentation obligations concerning the nature of the data processed by the AI system and the individuals or groups it could affect. Although personal data is referenced, the intention goes beyond protecting that data, requiring detailed documentation that the GDPR does not typically demand.
(d) where relevant, the data requirements in terms of datasheets describing the training methodologies and techniques and the training data sets used, including information about the provenance of those data sets, their scope and main characteristics; how the data was obtained and selected; labelling procedures (e.g. for supervised learning), data cleaning methodologies (e.g. outliers detection); (Annex IV)
Annex IV's emphasis on detailing the origin, nature, selection, and processing of data makes clear that these obligations surpass typical GDPR data protection requirements. It reiterates that GDPR compliance alone is not sufficient in the context of the AI Act, which mandates broader, AI-specific responsibilities.
In conclusion, while general GDPR compliance forms a crucial part of AI regulation, it does not encompass all obligations under the AI Act. Compliance with the AI Act requires additional measures and documentation extending beyond the GDPR, especially for high-risk AI systems and their specific requirements.