Citation Suggestion
Please use the following Persistent Identifier (PID) to cite this document:
https://nbn-resolving.org/urn:nbn:de:0168-ssoar-99897-8
Delegation of Moral Tasks to Automated Agents - The Impact of Risk and Context on Trusting a Machine to Perform a Task
[journal article]
Abstract
The rapid development of automation has led to machines increasingly taking over tasks previously reserved for human operators, especially those involving high-risk settings and moral decision making. To best benefit from the advantages of automation, these systems must be integrated into work environments and into society as a whole. Successful integration requires understanding how users come to accept technology by learning to trust in its reliability. It is therefore essential to examine factors that influence the integration, acceptance, and use of automated technologies. Accordingly, this study investigated the conditions under which human operators were willing to relinquish control and delegate tasks to automated agents by manipulating risk and context factors experimentally. In a decision task, participants (N = 43, 27 female) were placed in different situations in which they could choose between delegating a task to an automated agent and executing it manually. The results of our experiment indicated that both context and risk significantly influenced people's decisions. While it was unsurprising that the reliability of an automated agent strongly influenced trust in automation, the different types of decision support systems did not appear to affect participant compliance. Our findings suggest that contextual factors should be considered when designing automated systems that navigate moral norms and individual preferences.
Keywords
artificial intelligence; automation; decision making; ethics; machine; morality; risk assessment; confidence
Classification
Technology Assessment
Free Keywords
context; decision support systems (DSSs); decision making; machine ethics; moral decisions; moral machines; Interpersonales Vertrauen (KUSIV3) (ZIS 37)
Document language
English
Publication Year
2022
Page/Pages
p. 46-57
Journal
IEEE Transactions on Technology and Society, 3 (2022) 1
DOI
https://doi.org/10.1109/TTS.2021.3118355
ISSN
2637-6415
Status
Published Version; peer reviewed