Please use this identifier to cite or link to this item: https://repositorio.consejodecomunicacion.gob.ec//handle/CONSEJO_REP/7956
Full metadata record
DC Field    Value    Language
dc.contributor.author    Utz, Sonja    -
dc.date.accessioned    2024-09-30T21:11:37Z    -
dc.date.available    2024-09-30T21:11:37Z    -
dc.date.issued    2024    -
dc.identifier.citation    Utz, S. (2023). Rethinking Artificial Intelligence: Algorithmic Bias and Ethical Issues | How Gender and Type of Algorithmic Group Discrimination Influence Ratings of Algorithmic Decision Making. International Journal of Communication, 18, 20. https://ijoc.org/index.php/ijoc/article/view/20806/4452    es_ES
dc.identifier.issn    1932-8036    -
dc.identifier.uri    https://repositorio.consejodecomunicacion.gob.ec//handle/CONSEJO_REP/7956    -
dc.description.abstract    Algorithms frequently discriminate against certain groups, and people generally reject such unfairness. However, people sometimes display an egocentric bias when choosing between fairness rules. Two online experiments were conducted to explore whether egocentric biases influence the judgment of biased algorithms. In Experiment 1, an unbiased algorithm was compared with an algorithm favoring males and an algorithm favoring married people. Experiment 2 focused only on the first two conditions. Instead of the expected gender difference in the condition in which the algorithm favored males, a gender difference in the unbiased condition was found in both experiments. Women perceived the unfair algorithm as less fair than men did. Women also perceived the algorithm favoring married people as the least fair. Fairness ratings, however, did not directly translate into permissibility ratings. The results show that egocentric biases are subtle and that women take the social context more into account than men do.    es_ES
dc.language.iso    en    es_ES
dc.publisher    International Journal of Communication    es_ES
dc.subject    acceptance    es_ES
dc.subject    fairness    es_ES
dc.subject    egocentric    es_ES
dc.title    Rethinking Artificial Intelligence: Algorithmic Bias and Ethical Issues | How Gender and Type of Algorithmic Group Discrimination Influence Ratings of Algorithmic Decision Making    es_ES
dc.title.alternative    International Journal of Communication    es_ES
dc.type    Article    es_ES
Appears in collections: International documents on freedom of expression and related rights

Files in this item:
File    Description    Size    Format
How gender and type of.pdf    How gender and type    256,51 kB    Adobe PDF    View/Open


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.