Intuitive Gesture-based Robot Teach-Mode
Bachelor's or Master's Thesis at the TAMS group
Motivation
Despite a lot of progress in robot motion planning algorithms,
automatically generated trajectories are often far from optimal,
and most industrial robots are still programmed to follow
human-designed trajectories.
This is done either by following hard-coded waypoints
or by using a teach mode, where the user drags the real robot
to the desired positions.
The first approach yields mathematically precise coordinates
but is highly unintuitive, while the second approach requires
considerable force, and precise waypoints are hard to reach
due to friction and the high inertia of the robot.
With typical current software, the recorded trajectories are fixed
and cannot easily be modified.
However, there has recently been a lot of progress
in visual object tracking, and in particular in markerless
tracking of human hands and fingers using deep networks.
In the proposed thesis, we will develop a human hand-tracking
and gesture-detection interface to create robot trajectories
interactively, simply by pointing at the desired waypoints
and by moving between them.
While the teaching is in progress, a 3D visualization
of the trajectory will be generated (rviz, HoloLens VR)
to provide visual feedback to the user.
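On the rviz side, the feedback loop can be as simple as publishing
the current waypoint list as a line-strip marker; the node, topic,
and frame names below are placeholders:

    import rospy
    from geometry_msgs.msg import Point
    from visualization_msgs.msg import Marker

    rospy.init_node("teach_mode_viz")
    pub = rospy.Publisher("teach_mode/trajectory", Marker,
                          queue_size=1, latch=True)

    def publish_trajectory(waypoints, frame_id="world"):
        """Publish the taught waypoints as a line strip for rviz."""
        marker = Marker()
        marker.header.frame_id = frame_id
        marker.header.stamp = rospy.Time.now()
        marker.ns, marker.id = "teach_mode", 0
        marker.type = Marker.LINE_STRIP
        marker.action = Marker.ADD
        marker.scale.x = 0.01                  # line width in meters
        marker.color.g = marker.color.a = 1.0  # opaque green
        marker.points = [Point(*wp) for wp in waypoints]
        pub.publish(marker)

Republishing the marker whenever the waypoint list changes keeps
the rviz display in sync with the ongoing teaching session.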
Gestures will be defined to select waypoints, to
move, copy, or delete them, and to adjust and fine-tune
the trajectory.
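Which gesture triggers which editing operation is itself a design
question of the thesis; the sketch below only illustrates the
intended dispatch structure, with an entirely hypothetical
gesture vocabulary:

    def apply_gesture(gesture, waypoints, index, point=None):
        """Apply one recognized editing gesture to the waypoint list."""
        if gesture == "pinch":         # add a new waypoint at the pointed pose
            waypoints.insert(index, point)
        elif gesture == "drag":        # move an existing waypoint
            waypoints[index] = point
        elif gesture == "double_tap":  # duplicate a waypoint for later editing
            waypoints.insert(index, waypoints[index])
        elif gesture == "swipe":       # delete the selected waypoint
            waypoints.pop(index)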
Other gestures will be provided to define collision objects
and to draw regions that should be avoided (or preferred)
by the robot motions.
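Assuming MoveIt is used for motion planning (a natural fit in the
ROS environment), a gesture-defined keep-out region could be added
to the planning scene as sketched below; "preferred" regions would
instead require a cost-aware planner and are not covered here:

    from geometry_msgs.msg import PoseStamped
    from moveit_commander import PlanningSceneInterface

    def add_keepout_box(scene, name, center, size, frame_id="world"):
        """Insert an axis-aligned box that the motion planner must avoid."""
        pose = PoseStamped()
        pose.header.frame_id = frame_id
        (pose.pose.position.x,
         pose.pose.position.y,
         pose.pose.position.z) = center
        pose.pose.orientation.w = 1.0         # identity orientation
        scene.add_box(name, pose, size=size)  # size = (dx, dy, dz) in meters

    # usage (after rospy.init_node and moveit_commander.roscpp_initialize):
    # scene = PlanningSceneInterface()
    # add_keepout_box(scene, "keepout_1", (0.5, 0.0, 0.3), (0.2, 0.2, 0.2))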
In addition to the visualization of the tool center point (TCP)
trajectory, we will also make it possible to visualize the motion
of other parts of the robot (elbow, shoulder, knees) and
to select and adjust those motions as well.
The precise scope of the software implementation is open for
discussion: basic functionality is achievable in a BSc thesis,
while a truly useful feature set will probably require a full
MSc thesis.
Given the current Corona restrictions,
large parts of the software can be designed (and tested) from home,
using a setup consisting of an RGB or RGB-D camera and the
underlying hand/finger-tracking software.
However, actual evaluation of the software will probably require at
least some tests and experiments on the real robot in our lab.
Thesis Goals:
- development of a hand-tracking and gesture-based robot teaching tool,
- to define (add, adjust, move, copy, delete) waypoints,
- for the robot end-effector or other parts of the robot (wrist, elbow, shoulder, knees),
- integrated into a 3D graphical user interface,
- optionally integrated into a virtual/augmented-reality display (HoloLens 2),
- using the ROS open-source framework and environment,
- etc.
- collect ideas and experimental data leading to a publication (MSc thesis only)
Requirements
- as always, interest in the topic area, in this case camera-based hand and gesture tracking
- basic knowledge of deep-learning based visual tracking
- basic knowledge of the ROS framework
- programming skills in Python or C/C++ and Qt
Contact