Please use this identifier to cite or link to this item: https://repositorio.consejodecomunicacion.gob.ec//handle/CONSEJO_REP/7959
Full metadata record
DC Field  Value  Language
dc.contributor.author  Geiger, R.  -
dc.contributor.author  Tandon, Udayan  -
dc.contributor.author  Gakhokidze, Anoolia  -
dc.contributor.author  Song, Lian  -
dc.contributor.author  Irani, Lilly  -
dc.date.accessioned  2024-09-30T21:15:56Z  -
dc.date.available  2024-09-30T21:15:56Z  -
dc.date.issued  2024  -
dc.identifier.citation  Geiger, R., Tandon, U., Gakhokidze, A., Song, L., and Irani, L. (2023). Rethinking Artificial Intelligence: Algorithmic Bias and Ethical Issues | Making Algorithms Public: Reimagining Auditing From Matters of Fact to Matters of Concern. International Journal of Communication, 18, 22. https://ijoc.org/index.php/ijoc/article/view/20811/4455  es_ES
dc.identifier.issn  1932-8036  -
dc.identifier.uri  https://repositorio.consejodecomunicacion.gob.ec//handle/CONSEJO_REP/7959  -
dc.description.abstract  Stakeholders concerned with bias, discrimination, and fairness in algorithmic systems are increasingly turning to audits, which typically apply generalizable methods and formal standards to investigate opaque systems. We discuss four attempts to audit algorithmic systems with varying levels of success—depending on the scope of both the system to be audited and the audit’s success criteria. Such scoping is contestable, negotiable, and political, linked to dominant institutions and movements to change them. Algorithmic auditing is typically envisioned as settling “matters-of-fact” about how opaque algorithmic systems behave: definitive declarations that (de)certify a system. However, there is little consensus about the decisions to be automated or about the institutions automating them. We reposition algorithmic auditing as an ongoing and ever-changing practice around “matters-of-concern.” This involves building infrastructures for the public to engage in open-ended democratic understanding, contestation, and problem solving—not just about algorithms in themselves, but the institutions and power structures deploying them. Auditors must recognize their privilege in scoping to “relevant” institutional standards and concerns, especially when stakeholders seek to reform or reimagine them.  es_ES
dc.language.iso  en  es_ES
dc.publisher  International Journal of Communication  es_ES
dc.subject  activism  es_ES
dc.subject  artificial  es_ES
dc.subject  auditing  es_ES
dc.title  Rethinking Artificial Intelligence: Algorithmic Bias and Ethical Issues | Making Algorithms Public: Reimagining Auditing From Matters of Fact to Matters of Concern  es_ES
dc.title.alternative  International Journal of Communication  es_ES
dc.type  Article  es_ES
Appears in collections: Documentos internacionales sobre libertad de expresión y derechos conexos

Files in this item:
File  Description  Size  Format
Making algorithms public.pdf  Making algorithms  951,59 kB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.