Desafíos actuales de la Inteligencia Artificial

Enforcing AI regulation in France: a legal framework beyond the AI Act

It does so by categorizing AI systems by risk level and imposing obligations accordingly.25 In practice, this means that most AI systems will not be heavily regulated.

The AI Act interacts with the GDPR in a number of ways. First of all, the AI Act specifies that it does not alter existing EU laws on personal data processing or the duties of supervisory authorities overseeing compliance. It thus maintains the obligations of AI providers and deployers as data controllers or processors under EU or national data protection laws. Similarly, data subjects retain all rights under these laws, including those related to automated decision-making and profiling. However, the AI Act complements the GDPR with regard to (i) data quality; and (ii) the competence of national data protection authorities.26

First, with regard to data quality, the Act mandates that high-risk AI systems be trained on high-quality data to ensure that AI systems are safe, effective, unbiased and non-discriminatory, adhering to data protection laws such as the GDPR.27 The AI Act complements the GDPR here because, unlike the latter, its requirements are not limited to personal data. Yet the AI Act foresees an exception to the strict GDPR rules on special categories of personal data.28 Indeed, the GDPR in principle prohibits the processing of such data and allows it only in limited circumstances.29 But the AI Act adds a new exception by giving providers of high-risk AI systems the possibility exceptionally to process these sensitive data to

25 The regulation requires CE marking for high-risk uses, achieved through compliance with significant obligations for providers, deployers, importers, and distributors of AI systems, controlled by rigorous audits (Articles 6-49). General-purpose AI models are generally subject to limited obligations, for instance related to documenting their training sources and respecting copyright (Articles 51-54), except those involving systemic risks based on their increased capabilities, which are also subject to significant obligations (Article 55). Low-risk AI systems face lighter requirements, for example in terms of transparency (Article 50). Yet, in order to protect fundamental rights and European values, some AI systems involving unacceptable risks are prohibited outright (Article 5).

26 Another one relates to regulatory sandboxes. The AI Act allows AI providers to use personal data lawfully collected for another purpose to develop, train, or test certain AI systems in those sandboxes, but only if they serve public interests (e.g., public safety, public health, the environment, energy sustainability, transport, or public administration). In order to avoid inconsistencies with the GDPR (EDPB-EDPS, “Joint Opinion 5/2021 on the proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act)”, 18 June 2021, https://www.edpb.europa.eu/system/files/2021-06/edpb-edps_joint_opinion_ai_regulation_en.pdf), for example the principle of purpose limitation, the AI Act outlines strict conditions under which this personal data can be used within regulatory sandboxes. For instance, it foresees safeguards for sensitive data and explicitly provides that this possibility should not constitute an exemption from the right not to be subject to a decision based solely on automated processing, including profiling (Article 22 GDPR).

27 Article 10 AI Act. Data used for AI training and testing must be relevant, representative, complete, and error-free, with clear transparency on data collection purposes. Privacy-preserving techniques should be used, and data should reflect specific usage contexts.
28 That is, personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data for unique identification, data concerning health, and data regarding a natural person’s sex life or sexual orientation (Article 9(1) GDPR).

29 Article 9(1) GDPR.
