Wenzhen Yuan

Active clothing perception 

The goal of this project is to build a robot system that autonomously explores the properties of everyday clothing. An external Kinect sensor guides the robot to suitable positions on the clothing for tactile exploration, and the robot then squeezes the clothing with a GelSight finger. We applied a CNN to learn multiple clothing properties from the tactile data, and the tactile output was also used to improve the robotic exploration.
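As a concrete illustration, below is a minimal PyTorch sketch of a multi-task CNN that classifies several clothing properties from a single GelSight tactile image. The backbone, head sizes, and the example property list are illustrative assumptions, not the network used in this project.

    import torch
    import torch.nn as nn

    class TactilePropertyNet(nn.Module):
        """Multi-task CNN: one classification head per clothing property.

        The property names and class counts below are hypothetical examples."""
        def __init__(self, properties=None):
            super().__init__()
            properties = properties or {"softness": 4, "thickness": 4, "fuzziness": 4}
            # Small convolutional backbone over a 3x256x256 tactile image.
            self.backbone = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            # One linear head per property, trained jointly (multi-task learning).
            self.heads = nn.ModuleDict(
                {name: nn.Linear(128, n) for name, n in properties.items()}
            )

        def forward(self, x):
            feat = self.backbone(x)
            return {name: head(feat) for name, head in self.heads.items()}

    if __name__ == "__main__":
        net = TactilePropertyNet()
        logits = net(torch.randn(2, 3, 256, 256))  # a batch of 2 tactile images
        print({k: v.shape for k, v in logits.items()})

Sharing one backbone across all property heads lets the properties regularize each other, which is the usual motivation for the multi-task setup.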

Deep grasping with vision and touch 

We predict grasping success from both vision and tactile sensing using a deep neural network architecture. We built a dataset of over 9,000 grasping trials on 106 different objects. The experimental results show that incorporating tactile sensing substantially improves grasping performance.
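A minimal sketch of the two-branch idea, assuming simple feature concatenation as the fusion scheme; the branch layouts and layer sizes are placeholders, not the architecture from the paper.

    import torch
    import torch.nn as nn

    def conv_branch():
        # Same branch layout for both modalities; sizes are arbitrary.
        return nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    class GraspSuccessNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.vision = conv_branch()  # RGB camera image
            self.touch = conv_branch()   # GelSight tactile image
            self.classifier = nn.Sequential(
                nn.Linear(64 + 64, 64), nn.ReLU(),
                nn.Linear(64, 1),  # logit for P(grasp succeeds)
            )

        def forward(self, rgb, tactile):
            # Concatenate visual and tactile features before classification.
            fused = torch.cat([self.vision(rgb), self.touch(tactile)], dim=1)
            return self.classifier(fused)

Dropping the touch branch turns this into a vision-only baseline, which is the natural comparison for measuring how much tactile sensing helps.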

Associating visual and tactile properties of fabrics 

We try to answer a question: if a robot knows what a fabric feels like, and is given several candidate pictures of what it may look like, can it pick out the correct picture? We believe that both tactile and visual images reveal physical properties of the fabrics, which can be represented by an embedding. We designed a joint CNN to learn these embeddings and thus connect a fabric's visual appearance with its tactile images.
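The matching step can be pictured as nearest-neighbor search in the shared embedding space. The sketch below assumes unit-normalized embeddings and cosine similarity; the encoder details are placeholders rather than the paper's network.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Encoder(nn.Module):
        def __init__(self, embed_dim=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(64, embed_dim),
            )

        def forward(self, x):
            return F.normalize(self.net(x), dim=1)  # unit-norm embedding

    touch_enc, vision_enc = Encoder(), Encoder()  # one encoder per modality

    def pick_matching_photo(tactile_img, candidate_photos):
        """Return the index of the candidate photo nearest to the touch embedding."""
        t = touch_enc(tactile_img.unsqueeze(0))        # (1, D)
        v = vision_enc(torch.stack(candidate_photos))  # (N, D)
        return torch.argmax(v @ t.squeeze(0)).item()   # cosine similarity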

Estimating object hardness from touch

We estimate object hardness under loosely controlled conditions: either a human tester presses the GelSight sensor on an object, or an open-loop robot gripper squeezes the object. During this dynamic process the soft object deforms, and GelSight measures its geometry and contact force to infer the hardness.
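One way to frame this is regression over the GelSight video of the press: per-frame CNN features are aggregated over time and mapped to a scalar hardness value. The recurrent aggregation and layer sizes below are illustrative assumptions.

    import torch
    import torch.nn as nn

    class HardnessNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.frame_cnn = nn.Sequential(
                nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
            self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
            self.regressor = nn.Linear(64, 1)  # scalar hardness estimate

        def forward(self, frames):  # frames: (B, T, 3, H, W) press sequence
            b, t = frames.shape[:2]
            feats = self.frame_cnn(frames.flatten(0, 1)).view(b, t, -1)
            _, (h, _) = self.lstm(feats)
            return self.regressor(h[-1])  # regress from the last hidden state

Using the whole sequence rather than a single frame matters here: hardness shows up in how the contact geometry evolves as force increases, not in any one snapshot.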

Slip detection for general objects

Slip detection is important for robot grasping: it helps the robot detect grasp failure and secure the grasp. In this project, we study general cues for the occurrence of slip, and have the robot re-grasp the object when slip is detected. The method proved effective for a wide range of objects.
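As a toy illustration of one possible cue, assuming the gel markers and the object's surface texture are both tracked between frames: during a stable grasp the two move together, while under slip the texture translates relative to the markers. The threshold and tracking details are hypothetical, not the project's exact method.

    import numpy as np

    def detect_slip(marker_disp, texture_disp, threshold_px=0.5):
        """marker_disp, texture_disp: (N, 2) per-point pixel displacements
        between consecutive GelSight frames."""
        relative = np.linalg.norm(
            texture_disp.mean(axis=0) - marker_disp.mean(axis=0)
        )
        return relative > threshold_px  # large relative motion => slip

    # On a detected slip, the controller would re-grasp, e.g. by closing
    # the gripper with a higher force before lifting again.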

Estimating force and torque from GelSight markers

In this project, we painted markers on the GelSight surface and tracked their movement in the GelSight images. We found that the displacement field of the markers is closely related to the contact force. To explicitly measure the force and torque when contacting objects of different shapes, we used convolutional neural networks trained on a set of collected data.
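A sketch of the regression step, assuming the tracked marker displacements are rasterized into a two-channel (dx, dy) map and fed to a small CNN that outputs a 6-D wrench; the input encoding and network size are assumptions, not the paper's specification.

    import torch
    import torch.nn as nn

    class WrenchNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(2, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(64, 6),  # (Fx, Fy, Fz, Tx, Ty, Tz)
            )

        def forward(self, disp_field):  # (B, 2, H, W) marker displacement map
            return self.net(disp_field)

Rasterizing the displacement field keeps the spatial layout of the markers, which a plain list of displacement vectors would lose; that layout is what lets the network disentangle torque from shear.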