• DocumentCode
    28552
  • Title

    An Adaptable Robot Vision System Performing Manipulation Actions With Flexible Objects

  • Author

    Bodenhagen, Leon ; Fugl, Andreas R. ; Jordt, Andreas ; Willatzen, M. ; Andersen, Knud A. ; Olsen, Martin M. ; Koch, Robert ; Petersen, Henrik Gordon ; Krüger, Norbert

  • Author_Institution
    Mærsk Mc-Kinney Møller Inst., Univ. of Southern Denmark, Odense, Denmark
  • Volume
    11
  • Issue
    3
  • fYear
    2014
  • fDate
    July 2014
  • Firstpage
    749
  • Lastpage
    765
  • Abstract
    This paper describes an adaptable system which is able to perform manipulation operations (such as Peg-in-Hole or Laying-Down actions) with flexible objects. As such objects easily change their shape significantly during the execution of an action, traditional strategies, e.g., for solving path-planning problems, are often not applicable. It is therefore required to integrate visual tracking and shape reconstruction with a physical modeling of the materials and their deformations, as well as action learning techniques. All these different submodules have been integrated into a demonstration platform, operating in real time. Simulations have been used to bootstrap the learning of optimal actions, which are subsequently improved through real-world executions. To achieve reproducible results, we demonstrate this for cast silicone test objects of regular shape. Note to Practitioners - The aim of this work was to facilitate the setup of robot-based automation of delicate handling of flexible objects consisting of a uniform material. As examples, we have considered how to optimally maneuver flexible objects through a hole without colliding and how to place flexible objects on a flat surface with minimal introduction of internal stresses in the object. Given the material properties of the object, we have demonstrated in these two applications how the system can be programmed with minimal human intervention. Rather than being an integrated system, with the associated drawback of lacking flexibility, our system should be viewed as a library of new technologies that have been proven to work in close to industrial conditions. As a rather basic, but necessary, part, we provide a technology for determining the shape of the object when passing, e.g., on a conveyor belt prior to being handled.
The main technologies applicable to the manipulated objects are: a method for real-time tracking of the flexible objects during manipulation, a method for model-based offline prediction of the static deformation of grasped, flexible objects, and, finally, a method for optimizing specific tasks based on both simulated and real-world executions.
  • Keywords
    control engineering computing; deformation; flexible manipulators; image reconstruction; learning (artificial intelligence); object tracking; robot vision; shape recognition; action execution; action learning techniques; adaptable robot vision system; bootstrap learning; casted silicone test objects; demonstration platform; flexible object handling; flexible object maneuver; internal stresses; laying-down actions; manipulation actions; manipulation operations; material properties; model-based offline prediction; object shape; optimal actions; peg-in-hole actions; physical modeling; real-time tracking; real-world executions; robot-based automation; shape reconstruction; static deformation; visual tracking; Deformable models; Mathematical model; Robots; Shape; Splines (mathematics); Surface reconstruction; Surface topography; 3D-modeling; Action learning; deformation modeling; flexible objects; shape tracking;
  • fLanguage
    English
  • Journal_Title
    Automation Science and Engineering, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    1545-5955
  • Type
    jour
  • DOI
    10.1109/TASE.2014.2320157
  • Filename
    6823725