Desafíos actuales de la Inteligencia Artificial

Cyando’s involvement too high to remain neutral and passive. Additionally, it is important to note that while Generative AI applications appear to actively produce content, the process could be viewed as automated and passive 41, entirely dependent on user prompts. Although the content usually reflects the training of the AI model and the design choices of the developers, these choices are abstract compared to the specific outputs, which rely heavily on user input 42. Thus, Generative AI applications might also warrant an individual and independent assessment similar to what the CJEU did in YouTube and Cyando for content-storing and content-sharing applications 43.

Fourth, the policy argument 44 that Generative AI applications might not deserve liability immunities similar to those for traditional platforms, due to their different nature, merits attention. However, it may not be decisive in this context, as the issue of fake news, misinformation, disinformation, and content misuse is not exclusively about liability. Even critics of applying the DSA to Generative AI tools acknowledge the usefulness of the DSA’s content moderation mechanisms against misleading, fake, or harmful AI-generated content. The DSA offers a potentially elegant solution, as Recital 41 suggests that the immunity regime and the content moderation rules should be assessed independently 45. Thus, irrespective of whether

perform a ‘communication to the public’ is related to their intermediary status under art. 6 of the DSA (the case was decided under art. 14 of the Electronic Commerce Directive, equivalent to today’s art. 6 of the DSA). Therefore, even though content-sharing platforms index content, provide search tools, and suggest content based on user preferences, the Court affirmed they neither perform a ‘communication to the public’ nor lose their intermediary status. Thus, the argument of Hacker et al. (supra note 21) overlooks the nuances of CJEU intermediary case law.

41 As neatly put by BOTERO ARCILA, supra note 30, p. 484: “... ChatGPT’s answers are assembled using algorithms that predict what makes sense as the next word, based on a user’s prompt. OpenAI, thus, does not have any knowledge or an active role in controlling the content ChatGPT generates. It could thus be argued that its role is neutral in a similar way to how YouTube is neutral in hosting user-generated content …”.

42 On the decisive nature of user prompts in the context of Generative AI applications, see PATEL, David, “Artificial Intelligence & Generative AI for Beginners”, Kindle Edition, 2024, pp. 118-139.

43 For an overall assessment of Generative AI in the context of US law, specifically § 230 of 47 U.S.C., a provision not too dissimilar from the discussion in this paper under EU law, see BAMBAUER, Derek; SURDEANU, Mihai, “Authorbots”, Journal of Free Speech Law, 2023, 3(2), pp. 375-388. Their analysis demonstrates the nuanced reality of whether Generative AI applications can be considered content creators or mere tools serving the users who dictate the parameters of the output through their prompts.

44 As put forward by EDWARDS et al., supra note 26.

45 Recital 41 of the DSA reads as follows: “In that regard, it is important that the due diligence obligations are adapted to the type, size and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, providers of online platforms and of very large online platforms and of very large online search engines. To the extent that providers of intermediary services fall within a number of different categories in view of the nature of their services and their size, they should comply with all the corresponding obligations of this Regulation in relation to those services. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to address the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting the fundamental rights enshrined in the Charter. The due diligence obligations
