• DocumentCode
    122912
  • Title
    Accessible interfaces for robot assistants
  • Author
    Lazewatsky, Daniel A. ; Smart, William D.
  • Author_Institution
    Sch. of Mech., Ind., & Manuf. Eng., Oregon State Univ., Corvallis, OR, USA
  • fYear
    2014
  • fDate
    25-29 Aug. 2014
  • Firstpage
    106
  • Lastpage
    111
  • Abstract
    High-level task control of robots is currently performed through a graphical interface on a desktop or laptop computer. This kind of mediated interaction is unnatural, and it can be problematic and cumbersome both for people with certain motor disabilities and for people interacting with a robot where no computer display is present. In this work, we present a framework that removes these intermediary devices, allowing users to assign tasks to robots through interfaces embedded directly in the world: the interfaces are projected directly onto surfaces and objects. We describe the implementation of the projected-interface framework and give several examples of tasks that can be performed with such an interface.
  • Keywords
    graphical user interfaces; human-robot interaction; service robots; accessible interfaces; graphical interface; high-level robot task control; intermediary device removal; mediated interaction; motor disabled persons; robot assistants; robot task assignment; Context; Robot kinematics; Robot sensing systems; TV; Three-dimensional displays
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    The 23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2014)
  • Conference_Location
    Edinburgh
  • Print_ISBN
    978-1-4799-6763-6
  • Type
    conf
  • DOI
    10.1109/ROMAN.2014.6926238
  • Filename
    6926238