Better Vision through Experimental Manipulation
Paul Fitzpatrick, Giorgio Metta · MIT AI Lab, Humanoid Robotics Group · {paulfitz,pasa}@ai.mit.edu
Acknowledgments:
This work is funded by DARPA as part of the "Natural Tasking of Robots Based on Human Interaction Cues" project under contract number DABT 63-00-C-10102.
Our Goal
To investigate the development of the association between visual information and motor commands in the learning, representation, and understanding of manipulative gestures.
A practical problem
For manipulation, we need to know which parts of the environment form physically coherent ensembles. This is a difficult judgement to make from purely visual information, as illustrated in the figure below.
Locate arm from motion
Use the arm's motion signature to detect it and to filter out distractors.
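A minimal sketch in Python/NumPy of one way this step could work (illustrative only, not the authors' implementation; the function name, the correlation measure, and the threshold are assumptions): pixels whose motion energy over time tracks the commanded arm motion are labeled as the arm, while independently moving distractors are rejected.

import numpy as np

def locate_arm(motion_energy, motor_activity, threshold=0.5):
    # motion_energy: (T, H, W) per-pixel motion magnitude over T frames.
    # motor_activity: (T,) commanded arm speed over the same frames.
    T, H, W = motion_energy.shape
    pixels = motion_energy.reshape(T, -1)
    # Normalized correlation between each pixel's motion trace and the command.
    p = (pixels - pixels.mean(0)) / (pixels.std(0) + 1e-6)
    m = (motor_activity - motor_activity.mean()) / (motor_activity.std() + 1e-6)
    corr = (p * m[:, None]).mean(0)
    # Distractors move independently of the command, so their correlation stays low.
    return (corr > threshold).reshape(H, W)   # boolean arm mask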
Learn to predict arm location
Relate the arm's location in the image to proprioceptive feedback.
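A minimal sketch of this step, again in Python/NumPy and purely illustrative: fit a crude linear map from joint angles (proprioception) to the arm's image position, using positions found by the motion-based detector. The true mapping is nonlinear; the linear least-squares fit here is an assumption made for brevity.

import numpy as np

def fit_arm_predictor(joint_angles, arm_positions):
    # joint_angles: (N, J) proprioceptive samples; arm_positions: (N, 2) image
    # coordinates of the arm found by the motion-based detector at the same instants.
    X = np.hstack([joint_angles, np.ones((len(joint_angles), 1))])  # add a bias column
    W, *_ = np.linalg.lstsq(X, arm_positions, rcond=None)
    return W

def predict_arm(W, joint_angles):
    # Predicted (x, y) arm position for one proprioceptive reading.
    return np.append(joint_angles, 1.0) @ W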
Detect contact events
At the moment of impact there is a characteristic, discontinuous spread of perceived motion.
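An illustrative Python sketch of contact detection based on that observation (the area heuristic and both thresholds are assumptions): flag frames where the moving region suddenly grows, i.e. where motion spreads discontinuously from the arm into the struck object.

def detect_contact(motion_masks, spread_ratio=1.5, min_new_pixels=200):
    # motion_masks: sequence of boolean (H, W) masks of moving pixels, one per frame.
    contacts = []
    for t in range(1, len(motion_masks)):
        prev_area = int(motion_masks[t - 1].sum())
        area = int(motion_masks[t].sum())
        # A contact shows up as a sudden, large jump in the moving area.
        if area > spread_ratio * max(prev_area, 1) and area - prev_area > min_new_pixels:
            contacts.append(t)
    return contacts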
Typical results
63 consecutive proddings of the cube, illustrating the frequency and types of error encountered.
[Figure: numbered examples (1-6), showing maximum optical flow and the segmented regions.]
Segment impacted objects
Differentiate the motion of the arm from that of the object to reveal the object's boundary.
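An illustrative Python/NumPy sketch of this differentiation (the circular arm model and its radius are simplifying assumptions): keep the pixels that move just after impact but are not explained by the predicted arm location, leaving a rough mask of the poked object.

import numpy as np

def segment_object(motion_mask, arm_xy, arm_radius=40):
    # motion_mask: boolean (H, W) mask of motion just after impact.
    # arm_xy: (x, y) arm position predicted from proprioception.
    H, W = motion_mask.shape
    ys, xs = np.mgrid[0:H, 0:W]
    arm_region = (xs - arm_xy[0]) ** 2 + (ys - arm_xy[1]) ** 2 < arm_radius ** 2
    # Object pixels are those moving at impact but not attributable to the arm.
    return motion_mask & ~arm_region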
1. Locate arm from motion
2. Learn to predict arm location
3. Detect contact events
4. Segment impacted objects
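A hypothetical Python driver showing how the four steps might fit together in a single poking trial; it reuses the sketch functions given with the individual steps, and the frame-differencing threshold is an arbitrary assumption.

def poking_trial(frames, joint_angle_log, W, threshold=15):
    # frames: list of grayscale images; joint_angle_log: one joint-angle vector per frame;
    # W: arm predictor fitted offline (steps 1-2).
    motion_masks = [abs(b.astype(float) - a.astype(float)) > threshold   # crude frame differencing
                    for a, b in zip(frames[:-1], frames[1:])]
    contacts = detect_contact(motion_masks)            # step 3: did we hit anything?
    if not contacts:
        return None                                    # the poke missed the object
    t = contacts[0]
    arm_xy = predict_arm(W, joint_angle_log[t])        # steps 1-2: where is the arm?
    return segment_object(motion_masks[t], arm_xy)     # step 4: mask of the poked object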
Figure annotations, illustrating why purely visual segmentation is hard:
- Edges of table and cube overlap
- Colors of the cube and table are poorly separated
- The cube has a misleading surface pattern
- Maybe some cruel grad student glued the cube to the table
Problem / Solution (figure): the robot fixates the area of interest, then sweeps it to discover object boundaries.
[Four histograms, one per object: estimated probability of occurrence vs. difference between angle of motion and principal axis of object [degrees], from 0 to 90 degrees.]
Exploring an affordance: objects that roll
Experimentation by the robot reveals that certain objects (a bottle, a toy car) have a preferred direction of motion relative to the principal axis of their shape. The objects are clustered online (the angle measurement is sketched after the annotations below).
Figure annotations:
- Bottle, "pointiness" = 0.13 (rolls at right angles to its principal axis)
- Car, "pointiness" = 0.07 (rolls along its principal axis)
- Ball, "pointiness" = 0.02
- Cube, "pointiness" = 0.03
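An illustrative Python/NumPy sketch of the quantity histogrammed above: the angle between an object's observed direction of motion and the principal axis of its segmented silhouette. The poster's "pointiness" is assumed here to be an anisotropy measure of the silhouette; its exact definition is not given, so the formula below is a guess.

import numpy as np

def principal_axis(mask):
    # Principal axis and an (assumed) pointiness measure of a boolean silhouette mask.
    ys, xs = np.nonzero(mask)
    cov = np.cov(np.vstack([xs - xs.mean(), ys - ys.mean()]))
    evals, evecs = np.linalg.eigh(cov)
    axis = evecs[:, np.argmax(evals)]             # direction of largest spread
    pointiness = 1.0 - evals.min() / evals.max()  # 0 = round, near 1 = elongated (assumption)
    return axis, pointiness

def motion_axis_angle(mask, displacement):
    # Angle (0..90 degrees) between the object's displacement and its principal axis,
    # i.e. the x-axis of the histograms above.
    axis, _ = principal_axis(mask)
    d = np.asarray(displacement, dtype=float)
    c = abs(axis @ d) / (np.linalg.norm(axis) * np.linalg.norm(d) + 1e-9)
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))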
Our solution
Use poking and prodding to solve the figure/ground problem experimentally, following the four numbered steps above.
[Figure: training phase and active segmentation. For two poking gestures, a "side tap" and a "back slap", the impact, the detected motion, and the resulting segmentation are shown.]