Natural Eye-based Gaze Interaction

Shared Physiological Cues

G-SIAR

KITE-Mobile AR

Gestures Library

User-defined Gestures for AR

Physically-based Interaction in AR

For the architecture of our gesture interface, we emphasized reusability and integration between components. The architecture is divided into five layers: (1) hardware interface, (2) segmentation/tracking, (3) classification, (4) modeling, and (5) gesture recognition. The hand region classification method comprises six steps: (1) synthetic hand pose creation, (2) decision tree training on the GPU, (3) hand segmentation, (4) decision forest classification on the GPU, (5) post-processing, and (6) joint position estimation.
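The five layers above can be sketched as a simple chain of components, where each layer consumes the previous layer's output. This is only an illustrative skeleton; all class names, method names, and the toy depth data are assumptions, not the actual implementation.

```python
# Hypothetical sketch of the five-layer gesture interface. Layer names follow
# the text; everything else (classes, methods, thresholds) is illustrative.

class HardwareInterface:          # layer 1: deliver raw sensor frames
    def read_frame(self):
        return {"depth": [[0.4, 0.5], [0.6, 0.7]]}  # toy depth image

class SegmentationTracking:       # layer 2: isolate the hand region
    def segment(self, frame):
        # keep only "near" pixels as the hand (placeholder threshold)
        return [v for row in frame["depth"] for v in row if v < 0.6]

class Classification:             # layer 3: label pixels as hand parts
    def classify(self, pixels):
        return ["palm" if p < 0.5 else "finger" for p in pixels]

class Modeling:                   # layer 4: fit a coarse hand model
    def fit(self, labels):
        return {"extended_fingers": labels.count("finger")}

class GestureRecognition:         # layer 5: map the model to a gesture
    def recognize(self, model):
        return "point" if model["extended_fingers"] > 0 else "fist"

def run_pipeline():
    frame = HardwareInterface().read_frame()
    pixels = SegmentationTracking().segment(frame)
    labels = Classification().classify(pixels)
    model = Modeling().fit(labels)
    return GestureRecognition().recognize(model)

print(run_pipeline())  # → point
```

Keeping each layer behind a small interface like this is what makes the components reusable: a different sensor only replaces layer 1, and a new gesture set only replaces layer 5.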

Steps 1 and 2 are performed offline to generate the decision trees; steps 3 to 6 are executed online to classify the input image in real time.
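The offline/online split of the six steps can be sketched as follows. This is a minimal stand-in, not the GPU implementation: the "forest" here is a set of depth-one trees on random features, the synthetic training data replaces rendered hand poses, and all names and part labels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Offline (steps 1-2) ---
# Step 1: synthetic training data -- per-pixel feature vectors with part
# labels (stand-in for features computed from rendered synthetic hand poses).
n_parts = 3                          # e.g. palm / fingers / wrist (placeholder)
X_train = rng.normal(size=(600, 4))
y_train = rng.integers(0, n_parts, size=600)

def train_stump(X, y, feat):
    """Step 2 (toy version): one depth-one 'tree' that splits a feature at
    its median and stores the majority part label on each side."""
    thr = np.median(X[:, feat])
    left = np.bincount(y[X[:, feat] <= thr], minlength=n_parts)
    right = np.bincount(y[X[:, feat] > thr], minlength=n_parts)
    return feat, thr, left.argmax(), right.argmax()

forest = [train_stump(X_train, y_train, f) for f in range(X_train.shape[1])]

# --- Online (steps 3-6) ---
def classify(forest, x):
    """Step 4: each tree votes a part label; the forest takes the majority."""
    votes = [l if x[f] <= thr else r for f, thr, l, r in forest]
    return int(np.bincount(votes, minlength=n_parts).argmax())

pixels = rng.normal(size=(50, 4))    # step 3: features of segmented hand pixels
labels = np.array([classify(forest, p) for p in pixels])

# Steps 5-6: post-process the label map and estimate one joint position per
# part -- here simply the centroid of that part's pixels in feature space.
joints = {part: pixels[labels == part].mean(axis=0)
          for part in range(n_parts) if (labels == part).any()}
print(sorted(joints))
```

The point of the split is that the expensive work (generating training data and growing trees) happens once offline, while the per-frame online cost is only tree traversal and centroid estimation, which is what makes real-time classification feasible.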

Hand Tracking and Gestures Library

 

© 2017 by Thammathip Piumsomboon
