A User Study on Kinesthetic Teaching of Redundant Robots in Task and Configuration Space

Sebastian Wrede, Christian Emmerich, Ricarda Grünberg, Arne Nordmann, Agnes Swadzba, Jochen Steil


The recent advent of compliant and kinematically redundant robots poses new research challenges for human-robot interaction. While these robots provide a great degree of flexibility for the realization of complex applications, the flexibility gained generates the need for additional modeling steps and the definition of criteria for redundancy resolution that constrain the robot's movement generation. The explicit modeling of such criteria usually requires experts to adapt the robot's movement generation subsystem. A typical way of dealing with this configuration challenge is to utilize kinesthetic teaching, guiding the robot by hand to implicitly model the specific constraints in task and configuration space. We argue that current programming-by-demonstration approaches are not efficient for kinesthetic teaching of redundant robots and show that typical teach-in procedures are too complex for novice users. In order to enable non-experts to master the configuration and programming of a redundant robot in the presence of non-trivial constraints such as confined spaces, we propose a new interaction scheme combining kinesthetic teaching and learning within an integrated system architecture. We evaluated this approach in a user study with 49 industrial workers at HARTING, a medium-sized manufacturing company. The results show that the interaction concepts implemented on a KUKA Lightweight Robot IV are easy to handle for novice users, demonstrate the feasibility of kinesthetic teaching for implicit constraint modeling in configuration space, and yield significantly improved performance for the teach-in of trajectories in task space.


Compliant robots, redundant robots, kinesthetic teaching, learning, assistance systems


This work is licensed under a Creative Commons Attribution 3.0 License.