Consider the problem of robustly
tracking a desired workspace trajectory with a humanoid robot. In [1],
we propose a solution based on the definition of a suitable controlled
output, which represents an averaged motion of the torso after
cancellation of the sway oscillation. The torso motion is reconstructed
using the vision-based odometric localization method previously presented in [2]. For control design purposes, a unicycle-like model is associated with the evolution of the output signal. The following block scheme summarizes the
developed control paradigm.
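For reference, the unicycle-like model mentioned above relates a driving velocity v and a steering velocity omega to the planar motion of the output point. A minimal simulation sketch of these kinematics (the function names and the Euler integration scheme are our own illustration, not taken from [1]):

```python
import math

def unicycle_step(x, y, theta, v, omega, dt):
    """One Euler step of the unicycle kinematic model:
    x' = v*cos(theta), y' = v*sin(theta), theta' = omega."""
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

# Drive straight for 1 s at 0.1 m/s with a 10 ms step:
# the unicycle advances 0.1 m along the x axis.
x, y, theta = 0.0, 0.0, 0.0
for _ in range(100):
    x, y, theta = unicycle_step(x, y, theta, v=0.1, omega=0.0, dt=0.01)
```

In the actual controller, v and omega are of course computed by the tracking law rather than held constant.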
To validate the proposed trajectory
control scheme, we performed some experiments on the humanoid
robot NAO (version 4.0) by Aldebaran Robotics. In our implementation,
the controller updates the robot's driving and steering velocity inputs
at 100 Hz.
These commands are then sent to the robot using the NAO APIs, in
particular the built-in move
function. Since the most recent command overrides all previous
commands, this function can be called at an arbitrary rate, thus
providing a convenient mechanism for the real-time implementation of a
high-level control loop.
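A high-level loop of this kind can be sketched as follows. Here `compute_velocity_command` and `send_move_command` are placeholders of our own (on the real robot the latter would wrap the NAOqi move call), and the 100 Hz period is kept with a simple sleep:

```python
import time

DT = 0.01  # control period: 100 Hz

def compute_velocity_command(t):
    # Placeholder controller: returns (driving velocity v, steering velocity omega).
    return 0.05, 0.0

def send_move_command(v, omega):
    # Placeholder for the NAOqi call; each new command overrides the previous one.
    pass

def control_loop(duration):
    """Run the high-level loop for `duration` seconds, one command per period."""
    n_updates = 0
    t0 = time.monotonic()
    while time.monotonic() - t0 < duration:
        t = time.monotonic() - t0
        v, omega = compute_velocity_command(t)
        send_move_command(v, omega)
        n_updates += 1
        time.sleep(DT)
    return n_updates

updates = control_loop(0.1)  # ~10 command updates in 0.1 s
```

Because the latest command always wins, missed or jittered iterations degrade gracefully: the robot simply keeps executing the most recent velocity until the next update arrives.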
Desired trajectory: line
In the first tracking experiment, the
desired trajectory is a line.
These are the results obtained using low-pass filtering for sway motion cancellation.
In particular, the left plot shows the desired trajectory vs. the actual trajectory of the torso, as estimated by our odometric localization algorithm, whereas the right plot shows the controlled variable vs. the reference signal. The root mean square (RMS) of the Cartesian error is 0.0330 m in this case.
For comparison, here are the corresponding results obtained using geometric projection for
sway motion cancellation. The RMS error in this case is larger (0.0808 m).
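The RMS figures reported here are the root mean square of the Euclidean distance between the desired and the estimated torso positions over the experiment. A minimal computation sketch, with toy data invented purely for illustration:

```python
import math

def rms_cartesian_error(desired, actual):
    """RMS of the Euclidean distance between paired 2-D points."""
    sq = [(xd - xa) ** 2 + (yd - ya) ** 2
          for (xd, yd), (xa, ya) in zip(desired, actual)]
    return math.sqrt(sum(sq) / len(sq))

# Toy example: a constant 0.03 m offset along x yields an RMS error of 0.03 m.
desired = [(0.1 * k, 0.0) for k in range(5)]
actual = [(x + 0.03, y) for x, y in desired]
err = rms_cartesian_error(desired, actual)
```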
Desired trajectory: sigmoid
In the second experiment, the
desired trajectory is sigmoidal.
As shown below, results are
satisfactory for both sway cancellation methods, again with a slight
advantage for low-pass filtering (first row, RMS error
0.0186 m), which achieves a slightly
smoother motion than geometric projection (second row, RMS error 0.0191 m).
Video clip
The following clip illustrates the experiments.
[1] G. Oriolo, A. Paolillo, L. Rosa and M. Vendittelli, Vision-Based Trajectory Control for Humanoid Navigation. 2013 IEEE-RAS Int. Conf. on Humanoid Robots, Atlanta, GA, Oct 2013 (pdf).
[2] G. Oriolo, A. Paolillo, L. Rosa and M. Vendittelli, Vision-Based Odometric Localization for Humanoids using a Kinematic EKF. 2012 IEEE-RAS Int. Conf. on Humanoid Robots, Osaka, Japan, Nov-Dec 2012 (pdf).