Rachel Holladay Research     Publications     CV

Planning Forceful Manipulation

Forceful Manipulation Graphic
Our goal is to enable robots to perform multi-stage forceful manipulation tasks, where the robot must reason over task, motion and force constraints. While all tasks that involve contact are technically forceful, we use the term forceful manipulation for tasks where the ability to generate and transmit the necessary forces to objects and the environment is an active limiting factor that must be considered. We began by focusing on the constraints, formulating them in the context of a constrained manipulation planning problem. In this setting, the sequence of actions was given and the robot needed to reason over choices such as the grasps, poses and paths that satisfy the constraints. We then extended this to the more general problem of planning both the continuous choices and the sequence of actions, or strategy, by leveraging an existing task and motion planning framework.

Relevant Publications: IROS 2019, Master's Thesis 2019, ICRA 2021
Relevant Videos: Tool Use and further experiments; Planning Forceful Manipulation and further experiments
Collaborators: Tomás Lozano-Pérez, Alberto Rodriguez


In-Hand Manipulation

In-Hand Manipulation Graphic
We explore in-hand regrasping via prehensile pushes, where a robot pushes a grasped object against the environment to autonomously reorient the object within the hand. We introduce the motion cone, an abstraction of the algebra of frictional pushing that bounds the set of feasible object motions and characterizes which pushes will stick or slip. We demonstrate the use of motion cones as the dynamic propagation step in a sampling-based planning algorithm for in-hand manipulation.

Relevant Publications: RSS 2018, IJRR 2019
Relevant Videos: Motion Cones
Collaborators: Nikhil Chavan-Dafle, Alberto Rodriguez


Constrained Motion Planning in Task Space

Constrained Planning Graphic
This work focuses on the constraint that the robot’s end effector (hand) trace out a given shape. Formally, our goal is to produce a configuration space path that closely follows a desired task space path despite the presence of obstacles. Adapting metrics from computational geometry, we show that the discrete Fréchet distance is an effective and natural tool for capturing the notion of closeness between two paths in task space. We then introduce two algorithmic approaches for efficiently planning with this metric. The first is a trajectory optimization approach that directly minimizes the Fréchet distance between the produced path and the desired task space path. The second approach searches along a configuration space embedding graph of the task space path.
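As a rough illustration of the metric itself (not of the planners in the papers), the discrete Fréchet distance between two sampled paths can be computed with the standard Eiter and Mannila dynamic program. This is a minimal sketch, assuming paths are given as lists of coordinate tuples; the function name and representation are illustrative, not taken from the papers:

```python
from functools import lru_cache
from math import dist

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between polylines P and Q,
    each a list of coordinate tuples (any shared dimension)."""
    @lru_cache(maxsize=None)
    def coupling(i, j):
        # Distance between the current pair of waypoints.
        d = dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:            # can only advance along Q
            return max(coupling(0, j - 1), d)
        if j == 0:            # can only advance along P
            return max(coupling(i - 1, 0), d)
        # Advance along P, along Q, or along both; keep the best option.
        return max(min(coupling(i - 1, j),
                       coupling(i - 1, j - 1),
                       coupling(i, j - 1)), d)
    return coupling(len(P) - 1, len(Q) - 1)
```

For two parallel straight-line paths offset by a constant distance, the metric recovers that offset, which matches the intuition of a leash length needed to walk both paths monotonically.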

Relevant Publications: IROS 2016, Undergraduate Thesis 2017, RA-L 2019
Relevant Videos: Task Space Path Optimization, CURI demo
Collaborators: Oren Salzman, Siddhartha Srinivasa


Communicating Robotic Intent

Intent Graphic
Through several projects, we have explored how robots can express or hide their intent through motion. The first project investigated how to generate robotic motion that is deceptive: motion that communicates false information or hides information altogether. We present an analysis of deceptive motion, with a mathematical model that enables the robot to autonomously generate deceptive motion and a study on the implications of deceptive motion for human-robot interaction. Following this, we focused on enabling robots to use deictic gestures. We presented a mathematical model for legible pointing and discussed how the robot will sometimes need to trade off efficiency for the sake of clarity. To generalize gestures to the robot's configuration space, we developed RoGuE (Robot Gesture Engine), a motion planning approach to generating gestures that parameterizes gestures as task-space constraints on robot trajectories and goals.

Relevant Publications: RSS 2014, RO-MAN 2014, Autonomous Robots 2015, AAAI Spring Symposium 2016
Relevant Videos: Deceptive Robotic Motion, RoGuE
Collaborators: Anca Dragan, Siddhartha Srinivasa



All Rights Reserved 2022.