Citation note
Please always refer to the following Persistent Identifier (PID) when citing this document:
https://nbn-resolving.org/urn:nbn:de:0168-ssoar-99897-8
Delegation of Moral Tasks to Automated Agents - The Impact of Risk and Context on Trusting a Machine to Perform a Task
[Journal article]
Abstract
The rapid development of automation has led to machines increasingly taking over tasks previously reserved for human operators, especially those involving high-risk settings and moral decision making. To best benefit from the advantages of automation, these systems must be integrated into work environments, and into society as a whole. Successful integration requires understanding how users come to accept technology by learning to trust in its reliability. It is therefore essential to examine factors that influence the integration, acceptance, and use of automated technologies. Accordingly, this study investigated the conditions under which human operators were willing to relinquish control and delegate tasks to automated agents by manipulating risk and context factors experimentally. In a decision task, participants (N = 43, 27 female) were placed in different situations in which they could choose to delegate a task to an automated agent or execute it manually. The results of our experiment indicated that both context and risk significantly influenced people's decisions. While it was unsurprising that the reliability of an automated agent seemed to strongly influence trust in automation, the different types of decision support systems did not appear to affect participant compliance. Our findings suggest that contextual factors should be considered when designing automated systems that navigate moral norms and individual preferences.
Thesaurus keywords
artificial intelligence; automation; decision making; ethics; machine; morality; risk assessment; trust
Classification
Technology Assessment
Free keywords
context; decision support systems (DSSs); decision making; machine ethics; moral decisions; moral machines; Interpersonal Trust (KUSIV3) (ZIS 37)
Document language
English
Publication year
2022
Pages
pp. 46-57
Journal title
IEEE Transactions on Technology and Society, 3 (2022) 1
DOI
https://doi.org/10.1109/TTS.2021.3118355
ISSN
2637-6415
Status
Published version; peer reviewed