
The 3rd Workshop on Observing and Understanding Hands in Action (HANDS 2017) aims to gather researchers working on hand detection and hand pose estimation and their applications, e.g. in AR/VR and assisted driving. The development of RGB-D sensors and the miniaturization of cameras (wearable cameras, smartphones, ubiquitous computing) have opened the door to a whole new range of technologies and applications that require detecting hands and recognizing hand poses in a variety of scenarios. Most hand tracking datasets and papers have focused on near-range, front-on scenarios, where many challenges remain, and the community needs to move beyond these settings. Our goal is to push the boundaries of 3D hand articulation estimation and tracking, and to evaluate a breadth of applications including sign language recognition, desktop interaction, egocentric views, object manipulation, far range, and over-the-shoulder driver footage.

The 2017 Hands in the Million Challenge on 3D Hand Pose Estimation is being organized in conjunction with the workshop! Check the challenge webpage here!

Previous workshop editions: HANDS 2015 and HANDS 2016.

Topics of interest

  • Hand detection
  • Hand pose and gesture recognition
  • 3D articulated hand tracking
  • Hand modelling and rendering
  • Grasping and object manipulation
  • Hand activity recognition
  • Gesture interfaces
  • Egocentric vision systems
  • Structured prediction
  • Applications of hand pose estimation in AR/VR
  • Applications of hand pose estimation in robotics and haptics

Call for papers and extended abstracts: