Citation suggestion

Please always refer to the following Persistent Identifier (PID) when citing this document:
https://nbn-resolving.org/urn:nbn:de:0168-ssoar-75528-7

Detecting Race and Gender Bias in Visual Representation of AI on Web Search Engines

[Conference paper]

Makhortykh, Mykola
Urman, Aleksandra
Ulloa, Roberto

Abstract

Web search engines influence the perception of social reality by filtering and ranking information. However, their outputs are often subject to bias that can lead to a skewed representation of subjects such as professional occupations or gender. In this paper, we use a mixed-method approach to investigate the presence of race and gender bias in the representation of artificial intelligence (AI) in image search results from six different search engines. Our findings show that search engines prioritize anthropomorphic images of AI that portray it as white, whereas non-white images of AI appear only in non-Western search engines. By contrast, the gender representation of AI is more diverse and less skewed towards a specific gender, which can be attributed to higher awareness of gender bias in search outputs. Our observations indicate both the need for and the possibility of addressing bias in the representation of societally relevant subjects, such as technological innovation, and emphasize the importance of designing new approaches for detecting bias in information retrieval systems.
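
As a concrete illustration of the kind of tally such an analysis implies, the minimal Python sketch below aggregates manually coded image search results into per-engine shares of perceived race and gender. All records, field names, and category labels here are hypothetical placeholders; they do not reproduce the authors' actual data or coding scheme.

from collections import Counter, defaultdict

# Hypothetical coded records, one per retrieved image. The real study's
# coding scheme and data are described in the paper, not reproduced here.
coded_images = [
    {"engine": "EngineA", "anthropomorphic": True,  "race": "white",     "gender": "female"},
    {"engine": "EngineA", "anthropomorphic": True,  "race": "white",     "gender": "none"},
    {"engine": "EngineB", "anthropomorphic": True,  "race": "non-white", "gender": "male"},
    {"engine": "EngineB", "anthropomorphic": False, "race": None,        "gender": None},
]

race_counts = defaultdict(Counter)
gender_counts = defaultdict(Counter)
for img in coded_images:
    if img["anthropomorphic"]:  # only human-like depictions carry race/gender codes
        race_counts[img["engine"]][img["race"]] += 1
        gender_counts[img["engine"]][img["gender"]] += 1

for engine in sorted(race_counts):
    total = sum(race_counts[engine].values())
    race_shares = {r: round(n / total, 2) for r, n in race_counts[engine].items()}
    print(engine, "race shares:", race_shares, "| gender counts:", dict(gender_counts[engine]))

Comparing such per-engine shares is one simple way to make the skew described in the abstract quantifiable.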

Thesaurus keywords
representation; artificial intelligence; algorithm; online service; trend; search engine; information retrieval

Classification
Interactive, Electronic Media

Free keywords
web search; bias; artificial intelligence

Collection title (edited volume or conference proceedings)
Advances in Bias and Fairness in Information Retrieval

Editors
Boratto, Ludovico; Faralli, Stefano; Marras, Mirko; Stilo, Giovanni

Conference
Second International Workshop on Algorithmic Bias in Search and Recommendation, BIAS 2021. Lucca, Italy

Document language
English

Publication year
2021

Publisher
Springer

Pages
pp. 1-16

Series
Communications in Computer and Information Science, 1418

DOI
https://doi.org/10.1007/978-3-030-78818-6_5

ISBN
978-3-030-78818-6

Status
Preprint; not peer reviewed

Licence
Deposit Licence - No Redistribution, No Modifications

