• DocumentCode
    229596
  • Title
    Mechanizing modal psychology
  • Author
    Bello, Paul F.
  • Author_Institution
    Human & Bioengineered Systems, Office of Naval Research, Arlington, VA, USA
  • fYear
    2014
  • fDate
    23-24 May 2014
  • Firstpage
    1
  • Lastpage
    8
  • Abstract
    Machines are becoming more capable of substantively interacting with human beings, both in simple dyads and within the confines of our complex social structures. Thought must be given to how their behavior might be regulated with respect to the norms and conventions by which we live. This is certainly true for the military domain [1], but it is no less true for eldercare, health care, disaster relief, and law enforcement, all areas where robotic systems are poised to make a tremendous impact in the near future. But how should we inculcate sensitivity to normative considerations in the next generation of intelligent systems? I argue here for an approach to building moral machines grounded in cognitive architectural considerations, and specifically in the dynamics of how alternatives are represented and reasoned over. After examining some recent results in the empirical literature on human moral judgment, I suggest some desiderata for knowledge representation and reasoning tools that may offer the means to capture some of the foundations of human moral cognition.
  • Keywords
    cognition; knowledge representation; psychology; cognitive architectural considerations; complex social structures; human moral cognition; human moral judgment; intelligent systems; modal psychology; moral machines; reasoning tools; communities; computer architecture; ethics; medical services; robots; cognitive architecture; counterfactual reasoning; moral intuitions
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2014 IEEE International Symposium on Ethics in Science, Technology and Engineering
  • Conference_Location
    Chicago, IL
  • Type
    conf
  • DOI
    10.1109/ETHICS.2014.6893425
  • Filename
    6893425