Robotics@SNU

Research
- Design and Development of FLEA
- Omegabot: Inchworm-inspired robot
- Large deformable morphing structure: Flytrap-inspired robot
- Wearable robotic hand
- Hands-on surgical robot: Shared control system
- Situation Understanding for Smart Devices
- Wireless Camera Sensor Networks Technology
- Mobile Sensor Networks: Algorithms and Applications
- Whole-Body Control Framework for Humanoid Robot
- Walking Pattern Generation for Humanoid Robot
- Robot Hand Control
- Quadruped Robot Control with Whole-Body Control Framework
- Human Gait Analysis using 3D Motion Capture
- Coordination of multiple robots
- Flocking and consensus
- Vision-based guidance and navigation
- Online collision avoidance for mobile robots
- Wireless sensor network
- Aerial Manipulation
- Haptics/VR
- Autonomous Mobility
- Telerobotics
- Mechanics/Control
- Industrial Control
- Mobile Manipulation
- Simultaneous Visual and Inertial Calibration
- Mechanics of Closed Chains
- Motion Optimization via Nonlinear Dimension Reduction
- Probabilistic Optimal Planning Algorithm for Minimum Upstream Motions
- Automated Plug-In Recharging System for Hybrid Electric Vehicle
Mobile Manipulation

[Figure: CIROS ver3.0, indoor demonstration]
We are currently developing the motion planner for CIROS ver3.0, a mobile manipulation system (see Figure) under development at the Center for Intelligent Robotics, Korea Institute of Science and Technology (KIST). The current prototype consists of a seven-degree-of-freedom arm mounted on a holonomic wheeled platform, together with a three-finger hand. The robot is intended to operate in human environments, assisting the elderly with various tasks while avoiding collisions with static and moving objects in a dynamic and uncertain environment. Autonomous mobile manipulation systems like CIROS require the integration of planning and control at several levels, particularly if they are to work in real time while relying on noisy sensors in environments with inherent uncertainty.

We are currently generating many human-like tasks using RRT together with compliance control as the motion planner. The planner we developed includes not only a basic path generator to a desired target, but also specific tasks such as moving a tray, wiping tables, opening a refrigerator, opening a bottle, shaking hands with a human, and cutting an object with a knife. All of these tasks adapt to environmental changes using information from vision and force-torque sensors. In particular, the force-torque readings from the sensor mounted on each wrist can be used to compensate the motion or to enhance safety.
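The RRT-based path generation mentioned above can be sketched as follows. This is a minimal illustrative 2D version under stated assumptions, not the lab's actual planner: the function names, step size, goal bias, and circular-obstacle model are all assumptions for demonstration.

```python
import math
import random

def rrt(start, goal, is_free, step=0.5, goal_tol=0.5,
        max_iters=5000, bounds=(0.0, 10.0)):
    """Minimal 2D RRT: grow a tree from start toward random samples
    until a node lands within goal_tol of the goal."""
    random.seed(0)  # deterministic run for the example
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # Occasionally sample the goal itself to bias growth toward it.
        if random.random() < 0.1:
            sample = goal
        else:
            sample = (random.uniform(*bounds), random.uniform(*bounds))
        # Find the tree node nearest to the sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        near = nodes[i]
        d = math.dist(near, sample)
        if d == 0.0:
            continue
        # Steer a fixed step from the nearest node toward the sample.
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if not is_free(new):
            continue  # collision: discard this extension
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # Goal reached: backtrack through parents to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

# Hypothetical workspace: free everywhere except a disc obstacle at (5, 5).
free = lambda p: math.dist(p, (5.0, 5.0)) > 1.5
path = rrt((1.0, 1.0), (9.0, 9.0), free)
```

In the real system the collision check would query the perceived environment rather than a fixed obstacle, and the resulting joint-space trajectory would be tracked under compliance control.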

For more information, visit the lab webpage.

jhp9395@robotics.snu.ac.kr, 02-880-7149