This Regulation applies to: (a) providers placing on the market or putting into service AI systems in the Union, irrespective of whether those providers are established within the Union or in a third country; (b) deployers of AI systems that have their place of establishment or who are located within the Union; (Article 2)
As a shop owner who may be using an AI system to analyze shoppers’ data, you fall under the category of ‘deployers’. The EU AI Act covers providers irrespective of their place of establishment, and deployers that are established or located within the Union, so it clearly applies to you as long as your shop operates in the Union.
‘deployer’ means any natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity; (Article 3)
Using the AI system in your shop in a professional capacity matches the AI Act’s definition of a deployer. This means that every provision the Act addresses to ‘deployers’ applies to you.
The following artificial intelligence practices shall be prohibited: (a) the placing on the market, putting into service or use of an AI system that deploys subliminal techniques beyond a person’s consciousness or purposefully manipulative or deceptive techniques, with the objective to or the effect of materially distorting a person’s or a group of persons’ behaviour by appreciably impairing the person’s ability to make an informed decision, thereby causing the person to take a decision that that person would not have otherwise taken in a manner that causes or is likely to cause that person, another person or group of persons significant harm; (Article 5)
The Act prohibits certain AI practices, particularly those that manipulate individuals’ behavior below the level of conscious awareness or impair their informed decision-making. If your shop’s AI system covertly influences shoppers’ behavior or distorts their decisions, it could infringe this prohibition. To stay aligned, be transparent about your data collection and analysis and ensure you comply with EU data protection law.
Biometric and biometrics-based systems (a) AI systems intended to be used for biometric identification of natural persons, with the exception of those mentioned in Article 5; (aa) AI systems intended to be used to make inferences about personal characteristics of natural persons on the basis of biometric or biometrics-based data, including emotion recognition systems, with the exception of those mentioned in Article 5; (Annex III 1)
If your shop’s AI systems perform biometric identification or infer personal characteristics from biometric data, they fall within the scope of the Act as per Annex III. Biometric verification systems that merely confirm a person’s identity, however, are not subject to these provisions.
Access to and enjoyment of essential private services and public services and benefits: (b) AI systems intended to be used to evaluate the creditworthiness of natural persons or establish their credit score, with the exception of AI systems used for the purpose of detecting financial fraud; (Annex III 5)
If your AI system is also used to evaluate customers’ creditworthiness or establish their credit scores, it falls within the scope of the Act under Annex III, point 5, unless it is used for the purpose of detecting financial fraud.
Remember, Article 6(2) of the AI Act is the gateway for these high-risk classifications: the Annex III rules apply where the specific use of your AI system is likely to cause significant harm without sufficient safeguards in place. It is therefore paramount to ensure that your AI application does not produce such harmful impacts.
I was not able to find specific information on the transparency obligations (Title IV) or on voluntary codes of conduct in relation to your legal question. As a general matter, however, it is advisable that AI systems adhere to principles of transparency and avoid breaching privacy rules.
This Regulation applies to: (a) providers placing on the market or putting into service AI systems in the Union, irrespective of whether those providers are established within the Union or in a third country; (b) deployers of AI systems that have their place of establishment or who are located within the Union; (Article 2)
As a physical shop owner analyzing shoppers’ data with an AI system, the EU AI Act applies to you: you are deemed a deployer, and deployers established or located within the Union fall within the Act’s scope.
‘deployer’ means any natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity; (Article 3)
Building on the above point, as a deployer (any individual or company using an AI system under its authority), the Act applies to you. The concrete implications, however, will depend on the type of AI system you are employing and how it is used.
The following artificial intelligence practices shall be prohibited: (a) the placing on the market, putting into service or use of an AI system that deploys subliminal techniques beyond a person’s consciousness or purposefully manipulative or deceptive techniques, with the objective to or the effect of materially distorting a person’s or a group of persons’ behaviour by appreciably impairing the person’s ability to make an informed decision, thereby causing the person to take a decision that that person would not have otherwise taken in a manner that causes or is likely to cause that person, another person or group of persons significant harm; (Article 5)
While an AI system may examine shopper data, the EU AI Act restricts the kinds of practices you can implement. Article 5, for example, bars AI systems that use manipulative techniques to distort a person’s behavior or impair their capacity to make informed decisions. Therefore, ensure that your system does not manipulate shoppers’ decision-making or cause significant harm.
AI systems specifically referred to under points 1 to 8a stand for critical use cases and are each considered to be high-risk AI systems pursuant to Article 6(2), provided that they fulfil the criteria set out in that Article (Annex III)
The EU AI Act mainly regulates ‘high-risk’ AI systems. Hence, if the AI systems you use to analyze shopper data meet the criteria in Article 6(2) and correspond to a use case in Annex III, they are deemed high-risk, and additional regulatory requirements apply; the sketch below shows one way to keep track of that screening.
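To make that screening step concrete, here is a minimal, purely illustrative Python sketch of how a shop owner might record a self-assessment against the Annex III categories discussed in this analysis. The `AnnexIIIAssessment` class, its flag names, and the example system are assumptions invented for this sketch; high-risk status is defined legally, through Article 6(2) and Annex III, not by any code.

```python
from dataclasses import dataclass, field

# Illustrative self-assessment checklist only; a positive result is a
# trigger for proper legal analysis under Article 6(2), not a ruling.

@dataclass
class AnnexIIIAssessment:
    """Records which Annex III use cases an AI system may touch."""
    system_name: str
    biometric_identification: bool = False   # Annex III, point 1(a)
    biometric_inference: bool = False        # Annex III, point 1(aa), e.g. emotion recognition
    creditworthiness_scoring: bool = False   # Annex III, point 5(b)
    employment_decisions: bool = False       # Annex III, point 4
    notes: list = field(default_factory=list)

    def is_potentially_high_risk(self) -> bool:
        """True if any listed Annex III category applies to this system."""
        return any([
            self.biometric_identification,
            self.biometric_inference,
            self.creditworthiness_scoring,
            self.employment_decisions,
        ])

# Hypothetical example: shop analytics that infers shoppers' emotions.
assessment = AnnexIIIAssessment(
    system_name="shop-footfall-analytics",
    biometric_inference=True,
)
assessment.notes.append("Emotion recognition on camera frames; review Article 6(2).")
print(assessment.is_potentially_high_risk())  # True -> seek legal advice
```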
AI systems intended to be used for biometric identification of natural persons…; AI systems intended to be used to make inferences about personal characteristics of natural persons on the basis of biometric or biometrics-based data, including emotion recognition systems… (Annex III, Point 1)
If the AI system in use processes shoppers’ biometric data, for example for identification, emotion recognition, or inferences about personal traits, it falls within the high-risk category set out in point 1 of Annex III.
AI systems intended to be used for recruitment or selection of natural persons, notably for placing targeted job advertisements, screening or filtering applications, evaluating candidates in the course of interviews or tests; AI systems intended to be used to make or materially influence decisions affecting the initiation, promotion and termination of work-related contractual relationships, task allocation based on individual behaviour or personal traits or characteristics, or for monitoring and evaluating performance and behaviour of persons in such relationships (Annex III, Point 4)
If you use AI to recruit staff, manage workers, or make decisions based on individual behavior, those systems may fall under point 4 of Annex III, and the corresponding obligations of the EU AI Act will apply to your operations.
In conclusion, whether and how the AI Act applies to you boils down to how you use AI technologies, particularly whether they fall within the high-risk categories outlined in Annex III. This analysis is based on a partial reading of Annex III; if AI is not used as delineated in that Annex, the Act might not cover such uses. Nonetheless, you may want to consider other aspects of the legislation, ideally with legal support.
(a) providers placing on the market or putting into service AI systems in the Union, irrespective of whether those providers are established within the Union or in a third country;
(b) deployers of AI systems that have their place of establishment or who are located within the Union;
(4) ‘deployer’ means any natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity; (Article 2, Article 3)
In the case of your physical shop, if you use AI systems to analyze customers’ data, you fall within the definition of a ‘deployer’ under the EU AI Act. The Act applies to any natural or legal person using an AI system in a professional capacity within the Union.
Deployers of high-risk AI systems shall take appropriate technical and organisational measures to ensure they use such systems in accordance with the instructions of use accompanying the systems, pursuant to paragraphs 2 and 5 of this Article.
The obligations in paragraph 1 and 1a are without prejudice to other deployer obligations under Union or national law and to the deployer’s discretion in organising its own resources and activities for the purpose of implementing the human oversight measures indicated by the provider. (Article 29)
But being a deployer also brings obligations. These include taking appropriate technical and organizational measures to ensure that the AI system is used in accordance with the instructions provided. Let’s look a bit deeper into what these obligations might mean for you: do they apply regardless of the system’s risk category, or do they differ based on risk?
Moreover, if you are using a ‘high-risk AI system’, additional requirements apply: you would have to ensure that input data is relevant and representative, log the system’s operation, perform ongoing monitoring, and cooperate with oversight authorities. The sketch below gives a flavor of what such record-keeping could look like; determining whether your AI system is high-risk in the first place, though, requires us to delve further into the AI Act.
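As a purely illustrative sketch of the logging and monitoring duties just described, the snippet below wraps an AI system’s predictions so that every input and output is recorded for later review. The `DummyShopperModel`, the `predict_with_audit` wrapper, and the log format are hypothetical assumptions; the Act states the obligations but does not prescribe any particular implementation.

```python
import json
import logging
from datetime import datetime, timezone

# Hypothetical deployer-side record-keeping, loosely inspired by the
# Article 29 duties described above; not an official compliance tool.

logging.basicConfig(filename="ai_system_audit.log", level=logging.INFO)
logger = logging.getLogger("deployer-audit")

class DummyShopperModel:
    """Stand-in for a real shop-analytics model (assumption for this sketch)."""
    def predict(self, features: dict) -> dict:
        return {"segment": "frequent-visitor"}  # placeholder output

def predict_with_audit(model, shopper_features: dict) -> dict:
    """Run the model and keep an auditable record of input and output."""
    result = model.predict(shopper_features)
    logger.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input": shopper_features,
        "output": result,
    }))
    return result

print(predict_with_audit(DummyShopperModel(), {"visits_last_month": 7}))
```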
AI systems intended to be used for biometric identification of natural persons, with the exception of those mentioned in Article 5; (Annex III, 1a)
AI systems intended to be used to make inferences about personal characteristics of natural persons on the basis of biometric or biometrics-based data, including emotion recognition systems, with the exception of those mentioned in Article 5; (Annex III, 1aa)
If the AI system used in your shop analyzes shoppers’ biometric data or draws inferences about their personal characteristics from that data, it might be classified as high-risk under the AI Act.
Operators of the AI systems which are components of the large-scale IT systems established by the legal acts listed in Annex IX that have been placed on the market or put into service prior to … [the date of entry into force of this Regulation] shall take the necessary steps to comply with the requirements laid down in this Regulation by … (Article 83)
Finally, even if you were already using the AI system before the AI Act entered into force, you are still required to bring it into compliance within a specified transition period.
In conclusion, given your situation as a physical shop owner using an AI system to analyze shoppers’ data, the AI Act does apply to you. To what extent, though, depends primarily on whether your AI system is classified as high-risk, so a more detailed legal analysis is recommended.
From the analysis of the EU AI Act, it is clear that the legislation applies to you as the owner of a physical shop using AI to analyze shoppers’ data, assuming that your store is located within the European Union.
“This Regulation applies to: (b) deployers of AI systems that have their place of establishment or who are located within the Union;” (Article 2)
As you are deploying AI systems within your store in a professional capacity, you are indeed a “deployer” as per the Act. The following points will help you understand how the Act applies to your situation.
“For the purposes of this Regulation, the following definitions apply: (4) ‘deployer’ means any natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity;” (Article 3)
Further, you must be mindful of the type of AI system you are using and the effects it might have on your shoppers. The Act prohibits AI practices that impair shoppers’ ability to make an informed decision or cause significant harm.
“The following artificial intelligence practices shall be prohibited: (a) the placing on the market, putting into service or use of an AI system that deploys subliminal techniques beyond a person’s consciousness or purposefully manipulative or deceptive techniques, with the objective to or the effect of materially distorting a person’s behaviour by appreciably impairing the person’s ability to make an informed decision, thereby causing the person to take a decision that that person would not have otherwise taken in a manner that causes or is likely to cause that person, another person or group of persons significant harm;” (Article 5)
Again, you need to be especially careful if you are using AI systems to analyze biometric data or deploying AI in HR processes, as these come under the high-risk classification, which brings additional requirements:
“Biometric and biometrics-based systems: (a) AI systems intended to be used for biometric identification of natural persons, with the exception of those mentioned in Article 5; (aa) AI systems intended to be used to make inferences about personal characteristics of natural persons on the basis of biometric or biometrics-based data, including emotion recognition systems, with the exception of those mentioned in Article 5;” (Annex III, Points 1(a), 1(aa))
“Employment, workers management and access to self-employment: (b) AI systems intended to be used to make or materially influence decisions affecting the initiation, promotion and termination of work-related contractual relationships, task allocation based on individual behaviour or personal traits or characteristics, or for monitoring and evaluating performance and behaviour of persons in such relationships;” (Annex III, Point 4(b))
Please note that it is always recommended to consult a legal expert to fully understand the potential impact of the Act on your operations.