Automated Decision Making and right to explanation. The right of access as ex post information

Emiliano Troisi
2022-01-01

Abstract

The data sets, the processes that determine an algorithmic decision, and the rationale behind an automated decision affecting the legal sphere of a natural person should be traceable, transparent, and explained, not least to enable the individual affected to challenge an unfair decision. In practice, they rarely are: either by choice, for reasons of competition and protection of know-how, or because of technological limitations. The latter is the case of algorithms aptly described as 'black boxes': systems whose inferential mechanisms are not (fully) predictable ex ante or which, in any event, do not always make it possible to explain why an automated decision-making model produced a particular outcome (and what combination of factors contributed to it). Having affirmed the existence of an ethical duty of algorithmic transparency and of explanation of the (individual) decision reached by automated means, this paper asks whether a corresponding right exists in positive law, and what its limits are, both legal and technological in nature. Critically drawing on the leading scholarly opinions on the subject, the right to explanation of automated decision-making is identified, within the GDPR, with the right of access under Article 15 of the Regulation.
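
To make the abstract's notion of a 'black-box' system concrete, the following is a minimal, hypothetical Python sketch (not drawn from the paper; the credit-scoring framing, the synthetic data, and the model choice are illustrative assumptions). It shows how an ensemble model readily produces an automated decision while exposing only aggregate, global clues about its reasoning, rather than a rationale for the individual outcome that a right to explanation would demand.

```python
# Minimal, hypothetical sketch: why ensemble models are often called
# "black boxes". The decision is easy to obtain, but the per-decision
# rationale is not directly readable from the model.
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Synthetic "credit scoring" data: 4 made-up features, binary outcome.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

applicant = X[:1]
decision = model.predict(applicant)[0]
print(f"Automated decision: {'approve' if decision else 'deny'}")

# The model offers only global, aggregate clues (feature importances),
# not a reason for *this* decision: the gap that an ex post right to
# explanation, via access under Article 15 GDPR, is meant to address.
print("Global feature importances:", model.feature_importances_.round(3))
```
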
Keywords: Automated Decision Making, algorithm, opacity, explanation, right of access, GDPR

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12570/32575