- Design and Development of Omegabot: Inchworm-inspired Robot
- Large deformable morphing
- Wearable robotic hand
- Hands-on surgical robot: Shared control system
- Situation Understanding for Smart Devices
- Wireless Camera Sensor Networks Technology
- Mobile Sensor Networks: Algorithms and Applications
- Whole-Body Control Framework for Humanoid Robot
- Walking Pattern Generation for Humanoid Robot
- Robot Hand Control
- Quadruped Robot Control with Whole-Body Control Framework
- Human Gait Analysis using 3D Motion Capture
- Coordination of multiple robots
- Flocking and consensus
- Vision-based guidance and navigation
- Online collision avoidance for mobile robots
- Wireless sensor network
- Aerial Manipulation
- Haptics/VR
- Autonomous Mobility
- Telerobotics
- Mechanics/Control
- Industrial Control
- Mobile Manipulation
- Simultaneous Visual and Inertial Calibration

- Mechanics of Closed Chains
- Motion Optimization via Nonlinear Dimension Reduction
- Probabilistic Optimal Planning Algorithm for Minimum Upstream Motions
- Automated Plug-In Recharging System for Hybrid Electric Vehicles
Vision-based guidance and navigation
We perform research on vision-based guidance and navigation, assisting a conventional sensor suite with visual information. This is applied to autonomous landing, a challenging but important problem for unmanned aerial vehicles (UAVs). Visual servoing algorithms that track an object of interest are integrated into the control loop, so that the dynamic constraints of the UAV can be incorporated.
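As a rough illustration of how visual servoing feeds a control loop, the sketch below implements a classical image-based visual servoing (IBVS) law: the camera velocity command is computed from the error between tracked image features and their desired positions via the interaction matrix. This is a generic textbook formulation, not the lab's actual implementation; the landing-pad corner features, depths, and gain are hypothetical.

```python
# Minimal image-based visual servoing (IBVS) sketch for tracking a target
# during UAV landing. Illustrative only: the feature points, depths, and
# gain below are hypothetical, not taken from the lab's system.
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix for one normalized image point
    at depth Z, relating camera velocity to image-feature velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Camera velocity command v = -gain * pinv(L) @ (s - s*).

    features, desired: (N, 2) normalized image coordinates of the tracked
    object's points (current and desired); depths: (N,) point depths.
    Returns a 6-vector [vx, vy, vz, wx, wy, wz]; a downstream controller
    would saturate or filter it to respect the UAV's dynamic constraints.
    """
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(desired)).reshape(-1)
    return -gain * np.linalg.pinv(L) @ error

# Example: four corner features of a landing pad, slightly offset from
# the desired (centered) configuration, all at 2 m depth.
s = np.array([[0.12, 0.11], [-0.08, 0.11], [-0.08, -0.09], [0.12, -0.09]])
s_star = np.array([[0.1, 0.1], [-0.1, 0.1], [-0.1, -0.1], [0.1, -0.1]])
v = ibvs_velocity(s, s_star, depths=np.full(4, 2.0))
print(v)  # 6-DOF velocity command driving the features toward s_star
```

The pseudoinverse handles redundant features (here eight image constraints for six velocity degrees of freedom), which is why multiple tracked points on the target make the command better conditioned.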

For more information, visit the lab webpage or call 02-880-7149.