

Citation Suggestion

Please use the following Persistent Identifier (PID) to cite this document:
https://doi.org/10.17645/si.7543


Intersectionality in Artificial Intelligence: Framing Concerns and Recommendations for Action

[journal article]

Ulnicane, Inga

Abstract

While artificial intelligence (AI) is often presented as a neutral tool, growing evidence suggests that it exacerbates gender, racial, and other biases, leading to discrimination and marginalization. This study analyzes the emerging agenda on intersectionality in AI. It examines four high-profile reports dedicated to this topic to interrogate how they frame problems and outline recommendations to address inequalities. These four reports play an important role in putting problematic intersectionality issues on the political agenda of AI, which is typically dominated by questions about AI's potential social and economic benefits. The documents highlight the systemic nature of problems that operate like a negative feedback loop or vicious cycle, with the diversity crisis in the AI workforce leading to the development of biased AI tools as a largely homogeneous group of white male developers and tech founders build their own biases into AI systems. Typical examples include gender and racial biases embedded in voice assistants, humanoid robots, and hiring tools. The reports frame the diversity situation in AI as alarming, highlight that previous diversity initiatives have not worked, emphasize urgency, and call for a holistic approach that focuses not just on numbers but on culture, power, and opportunities to exert influence. While dedicated reports on intersectionality in AI provide depth, detail, and nuance on the topic, in a patriarchal system they are in danger of being pigeonholed as issues of relevance mainly for women and minorities rather than as part of the core agenda.

Keywords
artificial intelligence; politics; power; intersectionality; gender; feminism; diversity; data; technology; discrimination

Classification
Sociology of Science, Sociology of Technology, Research on Science and Technology

Free Keywords
framing

Document language
English

Publication Year
2024

Journal
Social Inclusion, 12 (2024)

Issue topic
Artificial Intelligence and Ethnic, Religious, and Gender-Based Discrimination

ISSN
2183-2803

Status
Published Version; peer reviewed

Licence
Creative Commons - Attribution 4.0


Home  |  Legal notices  |  Operational concept  |  Privacy policy
© 2007 - 2025 Social Science Open Access Repository (SSOAR).
Based on DSpace, Copyright (c) 2002-2022, DuraSpace. All rights reserved.