Model-based Object Pose Estimation and Tracking Using 2D and 3D Visual Information
Date: October 24, 2013, 11:00 AM
Location: 301 - 117
Speaker: Changhyun Choi
Changhyun Choi is currently completing his PhD in Computer Science at the Georgia Institute of Technology (expected February 2014). He is affiliated with the Center for Robotics and Intelligent Machines at Georgia Tech, and has also worked in the Imaging Group of Mitsubishi Electric Research Labs in Cambridge, Massachusetts, at the Imaging Media Research Center at KIST, and at the Intelligent Systems Research Center at Sungkyunkwan University. He holds a BS in information and communication engineering from Sungkyunkwan University. His broad research interests are in visual perception for robotics, with a focus on object recognition and pose estimation, visual tracking, and 3D registration.
As robotic systems move from well-controlled environments to unstructured ones, they are required to operate in highly dynamic and cluttered scenes. Finding an object, estimating its 6-DOF pose, and tracking that pose over time in such scenes are challenging problems. Although various approaches have tackled these problems over the last several decades, many challenges have not been actively addressed and thus remain unsolved. In this talk, I will present my Ph.D. research, which addresses three of these remaining challenges: background clutter, handling both textured and textureless objects, and object discontinuities during tracking. Given a priori 3D object models, my approaches exploit both photometric and geometric features available from the objects to increase robustness to background clutter, to recognize both textured and textureless objects, and to resolve the issue of object discontinuities in the midst of tracking. My approaches rely on two major visual sensory inputs: a 2D monocular camera and a 3D RGB-D camera. Various visual object pose estimation and tracking examples will be shown, along with some applications in robotics.
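For readers unfamiliar with the term, a 6-DOF pose combines a 3D rotation and a 3D translation, and is commonly represented as a 4×4 homogeneous transform (an element of SE(3)). The sketch below is a minimal illustration of this representation only, not the speaker's method; the model points, pose values, and helper names are hypothetical.

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def make_pose(R, t):
    """Assemble a 4x4 homogeneous transform (SE(3)) from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_points(T, points):
    """Apply pose T to an (N, 3) array of points in the object's model frame."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return (T @ homog.T).T[:, :3]

# A toy "model": three vertices of an object in its own coordinate frame.
model = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])

# A hypothetical pose estimate: 90-degree yaw plus a translation.
T = make_pose(rotation_z(np.pi / 2), np.array([0.5, 0.0, 2.0]))

# Model points expressed in the camera frame under this pose.
camera_frame = transform_points(T, model)
```

Pose estimation recovers `T` from image or depth data; pose tracking updates it frame by frame.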