Situation Understanding for Smart Devices


We are witnessing the emergence of smart devices such as smartphones, smart pads, and smart TVs. Using a smartphone, for example, we can organize our schedules, browse the web, and check email. Beyond these simple tasks, we can determine our current location and get directions to a destination using the smartphone's built-in GPS unit. Just as GPS-enabled smartphones made location-based services possible, many new and previously unavailable services are envisioned for the next generation of smart devices.
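The "directions" step mentioned above reduces to simple spherical geometry. The following is a minimal sketch, independent of the project's own software, that computes the great-circle distance and initial compass bearing between two GPS fixes using the standard haversine formula; the coordinates in the usage example are approximate and for illustration only.

    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two GPS fixes, in kilometers.
        R = 6371.0  # mean Earth radius (km)
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * R * math.asin(math.sqrt(a))

    def initial_bearing_deg(lat1, lon1, lat2, lon2):
        # Initial compass bearing (degrees clockwise from north) from fix 1 to fix 2.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlam = math.radians(lon2 - lon1)
        y = math.sin(dlam) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))
        return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

    # Approximate coordinates: SNU Gwanak campus to Seoul Station.
    print(haversine_km(37.460, 126.952, 37.555, 126.971))        # ~10.7 km
    print(initial_bearing_deg(37.460, 126.952, 37.555, 126.971)) # roughly north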

We can provide better services and enable new applications if these devices are aware of the current situation (or context) of their users and surroundings. This project develops algorithms and new smart devices for understanding the situations of users and their surroundings.
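To make "situation understanding" concrete, here is a minimal rule-based sketch that infers a user's context from a single snapshot of smartphone sensor readings. Everything in it, including the SensorSnapshot fields, the thresholds, and the situation labels, is a hypothetical illustration rather than the project's actual algorithm, which would fuse sensor streams over time.

    from dataclasses import dataclass

    @dataclass
    class SensorSnapshot:
        accel_magnitude: float  # m/s^2, motion intensity after removing gravity
        speed: float            # m/s, from the GPS unit
        ambient_light: float    # lux, from the light sensor
        hour: int               # local hour of day (0-23)

    def infer_situation(s: SensorSnapshot) -> str:
        # Coarse, hand-written rules; a deployed system would instead fuse
        # sensor streams over time with a learned classifier or an HMM.
        if s.speed > 8.0:
            return "riding in a vehicle"
        if s.accel_magnitude > 2.0 and s.speed > 0.5:
            return "walking or running"
        if s.ambient_light < 10.0 and (s.hour >= 22 or s.hour < 6):
            return "likely asleep"
        return "stationary"

    # A snapshot taken on a morning commute is classified as vehicular travel.
    print(infer_situation(SensorSnapshot(accel_magnitude=0.3, speed=12.0,
                                         ambient_light=300.0, hour=9)))

Once a situation label is available, a device can adapt its behavior accordingly, for example silencing notifications when the user is likely asleep.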

This project is funded by the Korea Creative Content Agency (KOCCA).

For more information, visit the lab webpage.

jhp9395@robotics.snu.ac.kr, 02-880-7149