Teleoperation with Intelligent and Customizable Interfaces
In this paper, we explore a class of teleoperation problems where a user controls a sophisticated device (e.g., a robot) via an interface to perform a complex task. Teleoperation interfaces are fundamentally limited by the indirectness of the process: the user is not physically executing the task. In this work, we study intelligent and customizable interfaces: interfaces that mediate the consequences of indirectness and make teleoperation more seamless. They are intelligent in that they take advantage of the robot's autonomous capabilities and assist in accomplishing the task. They are customizable in that they enable users to adapt the retargetting function that maps their input onto the robot. Our studies support the advantages of such interfaces, but also point out the challenges they bring. We make three key observations. First, although assistance can greatly improve teleoperation, the decision of how to provide assistance must be contextual; it must depend, for example, on the robot's confidence in its prediction of the user's intent. Second, although users do have the ability to provide intent-expressive input that simplifies the robot's prediction task, this ability can be hindered by kinematic differences between themselves and the robot. Third, although interface customization is important, it must be robust to poor examples from the user.
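The idea of contextual, confidence-dependent assistance can be illustrated with a minimal sketch. This is not the paper's method, only a common shared-autonomy arbitration scheme: the robot blends its assistive command with the user's raw command, deferring entirely to the user when its intent prediction is uncertain. The function name `arbitrate` and the confidence threshold are hypothetical choices for illustration.

```python
def arbitrate(user_cmd, robot_cmd, confidence, threshold=0.5):
    """Blend user and robot commands, weighting assistance by the robot's
    confidence in its intent prediction; below the threshold, defer to the user.
    """
    if confidence < threshold:
        # Low confidence: provide no assistance, pass the user's input through.
        return list(user_cmd)
    # Above the threshold: linearly blend, giving the robot more control
    # as its confidence grows.
    alpha = confidence
    return [(1 - alpha) * u + alpha * r for u, r in zip(user_cmd, robot_cmd)]
```

With a confident prediction (e.g., `confidence=0.8`), the output moves most of the way toward the robot's command; with an uncertain one (e.g., `confidence=0.3`), the user's input is passed through unchanged.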
Teleoperation, Motion Retargetting, Shared Autonomy
This work is licensed under a Creative Commons Attribution 3.0 License.