UNI-FIND
Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents

Academic Article
Publication Date:
2022
Short description:
Cassioli, F., Angioletti, L., Balconi, M., Automation is not a moral deus ex machina: electrophysiology of moral reasoning toward machine and human agents, <<Medicina e Morale>>, 2022; 71 (4): 391-411. [doi:10.4081/mem.2022.1217] [https://hdl.handle.net/10807/227835]
Abstract:
The diffusion of automated decision-making systems could represent a critical crossroads for future society. Automated technology could feasibly be involved in morally charged decisions, with major ethical consequences. In the present study, participants (n=34) took part in a task composed of moral dilemmas in which the agent (human vs. machine) and the type of behavior (action vs. inaction) were randomized. Responses in terms of the evaluation of the morality, consciousness, responsibility, intentionality, and emotional impact of the agent’s behavior, reaction times (RTs), and EEG (delta, theta, beta, alpha, gamma power) data were collected. The data showed that participants apply different moral rules depending on the agent: humans are considered more moral, responsible, intentional, and conscious than machines. Interestingly, the emotional impact derived from the moral behavior was judged more severe for humans, with decreased RTs. For the EEG data, increased gamma power was detected when subjects evaluated the intentionality and the emotional impact of machines compared to humans. Higher beta power in the frontal and fronto-central regions was detected during the evaluation of the emotional impact derived from the machine’s behavior. Moreover, right temporal activation was found when judging the emotional impact caused by humans. Lastly, a generalized alpha desynchronization occurred in the left occipital area when subjects evaluated the responsibility derived from inaction behaviors. The present results provide evidence for the existence of different norms when judging the moral behavior of machine and human agents, pointing to a possible asymmetry in moral judgment at the cognitive and emotional level.
Iris type:
Articolo in rivista (journal article), Nota a sentenza (case note)
Keywords:
automation; human-robot ethics; moral dilemma
List of contributors:
Cassioli, Federico; Angioletti, Laura; Balconi, Michela
Handle:
https://publicatt.unicatt.it/handle/10807/227835
Published in:
MEDICINA E MORALE (Journal)
Research Fields

Concepts (2)
LS5_9 - Neural basis of cognition - (2022)

Settore M-PSI/02 - PSICOBIOLOGIA E PSICOLOGIA FISIOLOGICA (Psychobiology and Physiological Psychology)