Desafíos actuales de la Inteligencia Artificial
though in the data space will still be available (Article 10(5)(d)). In this respect, the relevant financial data sharing scheme and data use perimeter, as provided for under Articles 7 and 10 of the FiDAR proposal respectively, would lay out the applicable confidentiality obligations required by the AI Act. Furthermore, Article 10(5) requires the erasure of the data once it has served its purpose. If the data was obtained via a data space, the question arises: does complying with Article 10(5)(e) imply deleting the data only from the dataset the AI developer used, or also from the data space itself? In this respect, we propose that Article 10(5) AI Act only requires deletion on the AI developer's side once the task is completed. We must also acknowledge the risk that the data included in these spaces and used to train algorithmic credit scoring systems may lead to biased results if there is no process of bias detection or correction (Balayn and Gürses, 2021; Van Bekkum and Zuiderveen, 2023).

5. CONCLUDING REMARKS AND POSSIBLE RECOMMENDATIONS

As analysed in our contribution, the role of data spaces as sources of the data necessary to de-bias an AI system might not be as straightforward as envisaged by the AI Act. This is because some of these necessary data categories might not be available under the specific sectoral data space regulation; and, even when available, they might be subject to further requirements. Recital 70 of the AI Act justifies the processing of special categories of data to detect and correct bias as a matter of substantial public interest. In this respect, we may ask whether the answer to balancing these different legal instruments lies elsewhere. Given that we are dealing with personal data, the GDPR inevitably comes to the foreground in this process through its principles. Here, the principle of data minimisation could offer some answers.
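To make the bias detection step discussed above concrete, the following is a minimal, purely illustrative sketch of how a developer might screen a credit scoring system's outputs for group-level disparity. The data, the group labels and the 0.8 ("four-fifths") threshold are all assumptions for the example, not requirements of the AI Act.

```python
# Hypothetical sketch: a minimal group-disparity check on credit decisions,
# in the spirit of the bias detection contemplated by Article 10(5) AI Act.
# Decisions, group labels and the 0.8 screen are illustrative assumptions.

def approval_rates(decisions, groups):
    """Approval rate per protected group (demographic parity by group)."""
    totals, approved = {}, {}
    for d, g in zip(decisions, groups):
        totals[g] = totals.get(g, 0) + 1
        approved[g] = approved.get(g, 0) + d
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(rates):
    """Ratio of the lowest to the highest group approval rate."""
    return min(rates.values()) / max(rates.values())

decisions = [1, 1, 0, 1, 0, 0, 1, 0]   # 1 = credit approved
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
rates = approval_rates(decisions, groups)
print(rates)                            # → {'A': 0.75, 'B': 0.25}
print(disparate_impact(rates) >= 0.8)   # common "four-fifths" screen → False
```

A ratio below the chosen threshold would flag the system for the kind of correction process the literature cited above calls for; the threshold itself is a policy choice, not something the AI Act prescribes.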
Under the FiDAR proposal, data use perimeters should be established (Article 7 FiDAR proposal), meaning that personal data shall be limited to what is necessary for the purposes for which they are processed, despite the intention of ensuring a free flow of data. At the same time, Article 10(5) AI Act also intends to limit the data used for bias detection and correction to the minimum possible. However, one may question whether, to effectively perform a mandated task, the more relevant data available the better, even if that means involving data categories originally excluded. From our perspective, data spaces can be used to gather a quantity and diversity of data, only available there, that can be very useful to detect and correct biases in AI systems, such as creditworthiness systems. In this sense, for example, the FiDAR proposal has the necessary tools to ensure that special categories of personal data are adequately used to detect and correct biases. In this regard, we suggest that, when drafting the guidelines for the data use perimeters as well as the financial data sharing schemes, the objectives behind provisions such as Article 10(5) AI Act are
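One way to picture how a data use perimeter could operationalise data minimisation is as a purpose-bound whitelist of attributes that may leave the data space. The following sketch is entirely hypothetical: the field names, purposes and the registry structure are assumptions made for illustration, not anything defined by the FiDAR proposal itself.

```python
# Hypothetical sketch of a "data use perimeter" as a purpose-bound
# attribute whitelist (illustrating Article 7 FiDAR proposal in spirit).
# Purposes, field names and the registry below are all assumptions.

PERIMETERS = {
    # special categories permitted only for the bias detection purpose
    "bias_detection": {"age", "gender", "postcode", "credit_decision"},
    # strictly financial attributes for creditworthiness assessment
    "creditworthiness": {"income", "outstanding_debt", "credit_history"},
}

def within_perimeter(record: dict, purpose: str) -> dict:
    """Return only the attributes permitted for the stated purpose;
    an unknown purpose yields nothing (minimisation by default)."""
    allowed = PERIMETERS.get(purpose, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {"age": 41, "income": 30000, "gender": "F", "credit_decision": 0}
print(within_perimeter(record, "bias_detection"))
# keeps age, gender and credit_decision; income never leaves the space
```

The design point is that minimisation is enforced by construction: attributes outside the declared purpose are filtered before any data reaches the AI developer, which is one reading of how the perimeter and Article 10(5) AI Act could be reconciled.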