Citation note

When citing this document, please always refer to the following Persistent Identifier (PID):
https://doi.org/10.17645/si.7471

The Artificial Recruiter: Risks of Discrimination in Employers' Use of AI and Automated Decision‐Making

[Journal article]

Larsson, Stefan
White, James Merricks
Ingram Bogusz, Claire

Abstract

Extant literature points to how the risk of discrimination is intrinsic to AI systems owing to the dependence on training data and the difficulty of post hoc algorithmic auditing. Transparency and auditability limitations are problematic both for companies' prevention efforts and for government oversight, both in terms of how artificial intelligence (AI) systems function and how large‐scale digital platforms support recruitment processes. This article explores the risks and users' understandings of discrimination when using AI and automated decision‐making (ADM) in worker recruitment. We rely on data in the form of 110 completed questionnaires with representatives from 10 of the 50 largest recruitment agencies in Sweden and representatives from 100 Swedish companies with more than 100 employees ("major employers"). In this study, we made use of an open definition of AI to accommodate differences in knowledge and opinion around how AI and ADM are understood by the respondents. The study shows a significant difference between direct and indirect AI and ADM use, which has implications for recruiters' awareness of the potential for bias or discrimination in recruitment. All of those surveyed made use of large digital platforms like Facebook and LinkedIn for their recruitment, leading to concerns around transparency and accountability - not least because most respondents did not explicitly consider this to be AI or ADM use. We discuss the implications of direct and indirect use in recruitment in Sweden, primarily in terms of transparency and the allocation of accountability for bias and discrimination during recruitment processes.

Thesaurus keywords
artificial intelligence; discrimination; transparency; decision making; Sweden; algorithm; new technology; labor force; recruitment; responsibility

Classification
Human Resources Management
Sociology of Science, Sociology of Technology, Research on Science and Technology

Free keywords
ADM and risks of discrimination; AI and accountability; AI and risks of discrimination; AI and transparency; automated decision‐making; discrimination in recruitment; indirect AI use; platforms and discrimination

Document language
English

Publication year
2024

Journal title
Social Inclusion, 12 (2024)

Issue topic
Artificial Intelligence and Ethnic, Religious, and Gender-Based Discrimination

ISSN
2183-2803

Status
Published version; peer reviewed

License
Creative Commons - Attribution 4.0

