Sensor-Based Task-Constrained Motion Planning using Model Predictive Control
Consider a robotic system that must perform a task, e.g., move its
end-effector along an assigned trajectory. Assume that the workspace is
populated by moving obstacles whose geometry and motion are not known in
advance; there is, however, a sensory system that provides a current view
of the obstacles surrounding the robot. The problem consists in using the
available sensory information to generate real-time motion commands that
allow the robot to execute the assigned task while avoiding all
collisions. This can be regarded as a sensor-based version of the
Task-Constrained Motion Planning with Moving Obstacles (TCMP_MO) problem,
in turn an extension of the basic TCMP problem.
To solve this problem, we propose in [1] a real-time planner based on
model predictive control (MPC). In our approach, MPC computes short-term
joint velocities that guarantee task execution while optimizing an
objective function that combines two integral terms along the planning
horizon: the first keeps the robot away from the obstacles, while the
second penalizes control effort. In particular, the first term is
computed on the basis of the closest obstacle points O_1,...,O_p to a set
of p control points distributed along the robot structure. Constraints
are added to guarantee the kinematic feasibility of the generated motion.
As customary in MPC, only the first control action is actually sent out
for execution, and a new optimization problem is set up and solved at the
next time instant.
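For concreteness, one possible instance of the optimization problem set
up at time t_k is sketched below in LaTeX notation; the exponential
clearance term, the weights w_o and w_u, and the velocity bounds are
illustrative assumptions, not necessarily the exact choices of [1].

    \min_{u(\cdot)} \int_{t_k}^{t_k+T} \Big( w_o \sum_{i=1}^{p} e^{-\alpha\, d_i(t)}
        + w_u\, \|u(t)\|^2 \Big)\, dt
    \quad \text{s.t.} \quad \dot{q} = u, \qquad
    J_{\mathrm{task}}(q)\, u = \dot{x}_d, \qquad
    u_{\min} \le u \le u_{\max}

Here T is the planning horizon, q the joint configuration, u the joint
velocity input, and d_i the distance between the i-th control point and
its closest obstacle point O_i; the equality constraint enforces
execution of the desired task trajectory x_d through the task Jacobian
J_task.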
The distances d_1,...,d_p computed by the sensory system are also used to
trigger an emergency stop of the robot if any of them falls below a
certain threshold, indicating an imminent risk of collision.
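In code, this check is a simple scan of the current distances; the
following C++ sketch is minimal, and the function name and threshold
argument are illustrative, not taken from [1].

    // Minimal sketch of the emergency-stop check: stop as soon as any
    // control-point/obstacle distance d_i falls below the safety threshold.
    #include <algorithm>
    #include <vector>

    bool emergencyStopRequired(const std::vector<double>& distances,
                               double threshold)
    {
        return std::any_of(distances.begin(), distances.end(),
                           [threshold](double d) { return d < threshold; });
    }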
Results

Below we report some simulation and experimental results obtained by
applying the proposed approach to a UR10 manipulator coexisting with
moving humans. Two Kinect sensors are used to monitor the robot workspace
and to compute human-robot distances in real time, discriminating actual
collisions from false positives due to camera occlusions; to this end, we
use an extension of the parallel algorithm presented in [2].
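To illustrate the quantities involved, the brute-force C++ sketch below
computes the distance from one control point to its closest obstacle
point in a sensed point cloud; the actual system relies on the parallel
algorithm of [2], which also handles occlusion-induced false positives.
Names and types here are our own.

    #include <algorithm>
    #include <cmath>
    #include <limits>
    #include <vector>

    struct Point3 { double x, y, z; };

    // Distance d_i between a control point and the closest point O_i of
    // the sensed obstacle cloud (naive O(n) scan, for illustration only).
    double distanceToClosestObstacle(const Point3& controlPoint,
                                     const std::vector<Point3>& cloud)
    {
        double dMin = std::numeric_limits<double>::infinity();
        for (const Point3& o : cloud) {
            const double dx = o.x - controlPoint.x;
            const double dy = o.y - controlPoint.y;
            const double dz = o.z - controlPoint.z;
            dMin = std::min(dMin, std::sqrt(dx * dx + dy * dy + dz * dz));
        }
        return dMin;
    }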
Simulations
We simulated the proposed sensor-based planner in C++ as an add-on for
V-REP, a robot simulation framework. The end-effector is assigned an
orientation task: in particular, the robot must keep its terminal tool
vertical. We use two control points along the robot structure, one placed
at the elbow and one at the wrist; this is an appropriate choice for the
mechanical structure of the UR10.
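As an illustration of how the verticality requirement can be encoded, the
C++ sketch below computes a task error that vanishes when the tool axis
is parallel to the world vertical; the axis convention and function name
are our own assumptions, not the exact formulation of [1].

    #include <array>

    using Vec3 = std::array<double, 3>;

    // Cross product between the tool axis (expressed in the world frame,
    // e.g. from the UR10 forward kinematics) and the world vertical: it
    // vanishes when the two axes are parallel (a sign check on the dot
    // product distinguishes up from down).
    Vec3 verticalityError(const Vec3& zTool)
    {
        const Vec3 zWorld = {0.0, 0.0, 1.0};
        return { zTool[1] * zWorld[2] - zTool[2] * zWorld[1],
                 zTool[2] * zWorld[0] - zTool[0] * zWorld[2],
                 zTool[0] * zWorld[1] - zTool[1] * zWorld[0] };
    }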
The sample rate of the robot controller is 8 msec. The planner requires
only 3 msec to compute velocity inputs over a planning horizon of 10
control intervals, while the collision detection module takes about
2.5 msec per cycle. The algorithm therefore runs in real time with an
adequate margin with respect to the 8 msec sampling time.
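Putting the pieces together, the overall receding-horizon loop can be
sketched as follows; all function names are placeholders for the modules
described above, and the stub bodies only make the sketch self-contained.

    #include <algorithm>
    #include <chrono>
    #include <thread>
    #include <vector>

    // Stubs standing in for the modules described above (illustrative).
    std::vector<double> updateDistances() { return {1.0, 1.0}; }
    std::vector<double> solveMPC(const std::vector<double>&) { return {}; }
    void sendFirstControlAction(const std::vector<double>&) {}
    void stopRobot() {}

    void controlLoop()
    {
        using clock = std::chrono::steady_clock;
        const auto period = std::chrono::milliseconds(8);  // controller rate
        const double dSafe = 0.1;  // assumed safety threshold [m]

        while (true) {
            const auto start = clock::now();

            // ~2.5 msec: sensor-based distance update / collision check.
            const std::vector<double> d = updateDistances();
            if (std::any_of(d.begin(), d.end(),
                            [dSafe](double di) { return di < dSafe; })) {
                stopRobot();  // imminent risk of collision: emergency stop
                break;
            }

            // ~3 msec: MPC over 10 control intervals; only the first
            // control action of the computed plan is sent for execution.
            sendFirstControlAction(solveMPC(d));

            std::this_thread::sleep_until(start + period);  // hold 8 msec rate
        }
    }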
The clip below shows a simulation where a human acts as a moving obstacle.
Experiments
The experiment shown in the following clip replicates the
simulation scenario and confirms the reactivity and effectiveness of
the proposed approach.
Documents
[1] M. Cefalo, E. Magrini, G. Oriolo, "Sensor-Based Task-Constrained
Motion Planning using Model Predictive Control," submitted to the 12th
IFAC Symposium on Robot Control (SYROCO 2018). [pdf]

[2] M. Cefalo, E. Magrini, G. Oriolo, "Parallel Collision Check for
Sensor Based Real-Time Motion Planning," 2017 IEEE International
Conference on Robotics and Automation (ICRA 2017). [pdf]