Hi! I support a computer vision and machine perception team at Reality Labs / Oculus / Meta that develops and ships egocentric hand-tracking, body-tracking, object-tracking and human understanding technology for augmented and virtual reality. Before that, I co-founded a small company, Nimble VR (acquired by Facebook), which built skeletal hand-tracking software. Drop me a line if you're interested in computer vision for input and interaction.
I received my Ph.D. in EECS while working in CSAIL with Jovan Popović. My research interests are in computer graphics, computer vision and human-computer interaction.
I attended Carnegie Mellon University as an undergraduate, where I worked with Doug James and Jessica Hodgins.
Research
UmeTrack: Unified multi-view end-to-end hand tracking for VR
Shangchen Han, Po-chen Wu, Yubo Zhang, Beibei Liu, Linguang Zhang, Zheng Wang, Weiguang Si, Peizhao Zhang, Yujun Cai, Tomas Hodan, Randi Cabezas, Luan Tran, Muzaffer Akbay, Tsz-Ho Yu, Cem Keskin, Robert Wang
to appear in ACM Transactions on Graphics (Proc. SIGGRAPH Asia 2022)
[website / data / code]
[paper]
Neural Correspondence Field for Object Pose Estimation
Lin Huang, Tomas Hodan, Lingni Ma, Linguang Zhang, Luan Tran, Christopher Twigg, Po-Chen Wu, Junsong Yuan, Cem Keskin, Robert Wang
in European Conference on Computer Vision (ECCV), 2022
[code]
[paper]
Assembly101: A Large-Scale Multi-View Video Dataset for Understanding Procedural Activities
Fadime Sener, Dibyadip Chatterjee, Daniel Shelepov, Kun He, Dipika Singhania, Robert Wang, Angela Yao
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022
[website / data / code]
[paper]
DeltaCNN: End-to-End CNN Inference of Sparse Frame Differences in Videos
Mathias Parger, Chengcheng Tang, Christopher D. Twigg, Cem Keskin, Robert Wang, Markus Steinberger
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022
[website / code]
[paper]
EM-POSE: 3D Human Pose Estimation from Sparse Electromagnetic Trackers
Manuel Kaufmann, Yi Zhao, Chengcheng Tang, Lingling Tao, Christopher Twigg, Jie Song, Robert Wang, Otmar Hilliges
in International Conference on Computer Vision (ICCV), 2021
[website / data / code]
[paper]
UNOC: Understanding Occlusion for Embodied Presence in Virtual Reality
Mathias Parger, Chengcheng Tang, Yuanlu Xu, Christopher D. Twigg, Lingling Tao, Yijing Li, Robert Wang, Markus Steinberger
in IEEE Transactions on Visualization and Computer Graphics (TVCG), 2021
[paper]
[data]
Decoding Surface Touch Typing from Hand-Tracking
Mark Richardson, Matt Durasoff, Robert Wang
in ACM Symposium on User Interface Software and Technology (UIST), 2020
[paper and video]
MEgATrack: Monochrome Egocentric Articulated Hand-Tracking for Virtual Reality
Shangchen Han, Beibei Liu, Randi Cabezas, Christopher D. Twigg, Peizhao Zhang, Jeff Petkau, Tsz-Ho Yu, Chun-Jung Tai, Muzaffer Akbay, Zheng Wang, Asaf Nitzan, Gang Dong, Yuting Ye, Lingling Tao, Chengde Wan, Robert Wang
in ACM Transactions on Graphics (Proc. SIGGRAPH 2020), 39(4).
[paper]
[video]
Lightweight Multi-View 3D Pose Estimation Through Camera-Disentangled Representation
Edoardo Remelli, Shangchen Han, Sina Honari, Pascal Fua, Robert Wang
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020
[paper]
[video]
Going Deeper in Spiking Neural Networks: VGG and Residual Architectures
Abhronil Sengupta, Yuting Ye, Robert Wang, Chiao Liu, Kaushik Roy
in Frontiers in Neuroscience, 2019
[paper]
Online Optical Marker-based Hand Tracking with Deep Labels
DodecaPen: Accurate 6DoF Tracking of a Passive Stylus
Po-Chen Wu, Robert Wang, Kenrick Kin, Christopher Twigg, Shangchen Han, Ming-Hsuan Yang, Shao-Yi Chien
in ACM Symposium on User Interface Software and Technology (UIST), 2017
[paper]
[main video]
Structural Optimization of 3D Masonry Buildings
Emily Whiting, Hijung Shin, Robert Wang, John Ochsendorf and Frédo Durand
in ACM Transactions on Graphics (Proc. SIGGRAPH Asia 2012), 31(6).
[paper]
[main video]
6D Hands: Markerless Hand Tracking for Computer Aided Design
Practical Color-Based Motion Capture
Robert Y. Wang, Sylvain Paris, and Jovan Popović
in ACM/Eurographics Symposium on Computer Animation (SCA), 2011
also CSAIL Technical Reports, MIT-CSAIL-TR-2010-044
[paper]
[main video]
[supplementary video]
Seafloor Image Compression with Large Tilesize Vector Quantization
Chris Murphy, Robert Y. Wang, Hanumant Singh
in Proceedings of the IEEE AUV Conference, 2010
[paper]
Real-Time Hand-Tracking as a User Input Device
Robert Y. Wang
Doctoral Symposium, ACM Symposium on User Interface Software and Technology (UIST), 2008.
[paper]
Real-Time Enveloping with Rotational Regression
Mesh Ensemble Motion Graphs: Data-driven Mesh Animation with Constraints
Doug L. James, Christopher D. Twigg, Andrew A. Cove, and Robert Y. Wang
in ACM Transactions on Graphics, 26(4), 2007.
[paper]
[project]
6D Hands: Markerless Hand-Tracking for Computer Aided Design
For more information and videos, check out the project page.
Side Projects
While in grad school, I also built a few websites using Ruby on Rails.
I wrote RentMonkey in my fourth year, while I was serving on the Graduate Student Council. At the time, there was no efficient way for MIT students to find suitable off-campus housing in Cambridge / Boston.
RentMonkey was designed to track listings provided exclusively by MIT students, for MIT students. The website authenticates each student who visits the site, and the listings suit MIT students because they have typically been occupied by the students who posted them. Furthermore, students can record the rent they historically paid at a residence, so that future tenants can negotiate with the landlord with confidence.
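As a rough illustration of that design, here is a minimal Rails-style sketch of the underlying data model; the class and field names are hypothetical and not the actual RentMonkey schema:

class Student < ActiveRecord::Base
  # Every visitor is authenticated against MIT credentials before browsing or posting.
  has_many :listings
  has_many :rent_reports
end

class Residence < ActiveRecord::Base
  # A physical apartment or house; it accumulates history across tenants.
  has_many :listings
  has_many :rent_reports
end

class Listing < ActiveRecord::Base
  # Posted by the MIT student who typically occupied the residence.
  belongs_to :student
  belongs_to :residence
end

class RentReport < ActiveRecord::Base
  # The rent a student paid at a residence in a past year, so future
  # tenants can negotiate with the landlord with confidence.
  belongs_to :student
  belongs_to :residence
  validates :monthly_rent, :year, presence: true
end

With associations like these, a residence page can aggregate past rents across tenants while listings stay tied to authenticated student accounts.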
Since its launch in Spring 2008, RentMonkey has been used by 4,700 unique MIT students, who have posted over 500 listings and information about 1,600 residences. RentMonkey has become a clearing house for rental listings in the MIT community. (December 2009)
In Spring 2009, I worked with Feng Zhang to create EveryVector, a high-quality expression cloning tool for molecular biology. We optimized the site for both private collaboration and publishing. Because EveryVector runs entirely on the web, users do not need to install or upgrade software, or worry about data backups. This model helps biologists focus on biology rather than IT.
We launched EveryVector in September 2009. Today, over 400 users have created more than 6,000 vectors on our site. The site is gaining users steadily, and we hope that EveryVector will become a platform for molecular biology analysis. (December 2009)