Intransparent Discrimination through Machine Learning
Keywords: discrimination, machine learning, algorithms, explainable artificial intelligence, statistical discrimination, intransparency
Machine learning can lead to discrimination based on new features that are intransparent. To show this, I will argue that discrimination can occur not only on the basis of features on exhaustive lists but on the basis of any feature, and that algorithms can discriminate despite lacking mental attributes. I will point out problems raised by new forms of discrimination that are not easily detectable because of the intransparency of the algorithms. Three forms of intransparency, which differ in their effects on the possibility of detecting discrimination, have to be distinguished: (i) the features of differential treatment can be unknown; (ii) the features of differential treatment can be unimaginable (too complex or “chaotic” to be grasped by humans); (iii) explanations for the use of certain features for differential treatment can be lacking. Especially the combination of intransparency and new features of discrimination is challenging for the philosophical debate about discrimination. Previous solutions were restricted to identifying or creating imaginable features that are explicitly protected by anti-discrimination law; new forms of discrimination that could have considerable impact in the future are neglected. In order to identify intransparent discrimination, unknown features have to be identified, unimaginable features have to be made sufficiently imaginable, and correlations and training methods have to be explainable. It must then be possible to remove the discriminating elements. Insofar as no suitable solutions can be found, we must consider whether, in specific contexts, to allow only those machine learning methods that can sufficiently be examined for discriminating consequences.
Copyright (c) 2020 Heiner Koch
This work is licensed under the Creative Commons Attribution 4.0 International license.