Robotics@SNU

Research
- Design and Development of FLEA
- Omegabot: Inchworm-inspired robot
- Large deformable morphing structure: Flytrap-inspired robot
- Wearable robotic hand
- Hands-on surgical robot: Shared control system
- Situation Understanding for Smart Devices
- Wireless Camera Sensor Networks Technology
- Mobile Sensor Networks: Algorithms and Applications
- Whole-Body Control Framework for Humanoid Robot
- Walking Pattern Generation for Humanoid Robot
- Robot Hand Control
- Quadruped Robot Control with Whole-Body Control Framework
- Human Gait Analysis using 3D Motion Capture
- Coordination of multiple robots
- Flocking and consensus
- Vision-based guidance and navigation
- Online collision avoidance for mobile robots
- Wireless sensor network
- Aerial Manipulation
- Haptics/VR
- Autonomous Mobility
- Telerobotics
- Mechanics/Control
- Industrial Control
- Mobile Manipulation
- Simultaneous Visual and Inertia Calibration
- Mechanics of Closed Chains
- Motion Optimization via Nonlinear Dimension Reduction
- Probabilistic Optimal Planning Algorithm for Minimum Upstream Motions
- Automated Plug-In Recharging System for Hybrid Electric Vehicle
Vision-based guidance and navigation
We perform research on vision-based guidance and navigation, assisting a conventional sensor suite with visual information. This is applied to autonomous landing, which is a challenging but important problem for unmanned aerial vehicles (UAVs). Visual servoing algorithms that track an object of interest are integrated into the control loop, so that the dynamic constraints of the UAV can be incorporated.
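The core idea can be illustrated with a minimal image-based visual-servoing sketch: the pixel error of the tracked object is converted to a metric error via the pinhole camera model and fed to a proportional law that commands a lateral velocity driving the target toward the image center. All function and parameter names below are illustrative assumptions, not the lab's actual implementation, and the real system would additionally account for the UAV's dynamic constraints.

```python
import numpy as np

def ibvs_velocity_command(target_px, image_center, depth, focal_px, gain=0.5):
    """Map the pixel error of a tracked target to a lateral velocity
    command (a minimal image-based visual-servoing sketch).

    target_px    : (u, v) pixel coordinates of the tracked object
    image_center : (cu, cv) principal point of the camera
    depth        : estimated distance to the target [m]
    focal_px     : focal length in pixels
    gain         : proportional gain on the image-plane error
    """
    # Image-plane error in pixels, converted to a metric error at the
    # target's depth via the pinhole model: x = (u - cu) * Z / f.
    err = np.asarray(target_px, dtype=float) - np.asarray(image_center, dtype=float)
    metric_err = err * depth / focal_px
    # Proportional law: command a velocity that reduces the error,
    # i.e. drives the target toward the image center.
    return -gain * metric_err

# Example: target 40 px right of center, 10 m away, f = 400 px
v = ibvs_velocity_command((360, 240), (320, 240), depth=10.0, focal_px=400.0)
print(v)
```

In a full system this velocity command would pass through the UAV's inner-loop controller, which enforces the vehicle's actuation and dynamic limits.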

For more information, visit the lab webpage.

jhp9395@robotics.snu.ac.kr, 02-880-7149