The 3rd Workshop on Observing and Understanding Hands in Action (HANDS 2017) aims to gather researchers working on hand detection and hand pose estimation and their applications, e.g. AR/VR and assistive car driving. The development of RGB-D sensors and camera miniaturization (wearable cameras, smartphones, ubiquitous computing) has opened the door to a whole new range of technologies and applications that require detecting hands and recognizing hand poses in a variety of scenarios. Most hand tracking datasets and papers have focused on near-range, front-on scenarios, yet many challenges remain. Our goal is to push past these limits: to advance the state of the art in 3D hand articulation estimation and tracking, and to evaluate a breadth of applications including sign language recognition, desktop interaction, egocentric views, object manipulation, far-range footage, and over-the-shoulder driver footage.
The 2017 Hands in the Million Challenge on 3D Hand Pose Estimation is being organized in conjunction with the workshop! Check the challenge webpage here!
NEW!: The paper “Real-Time Hand Tracking Under Occlusion From an Egocentric RGB-D Sensor” by Franziska Mueller, Dushyant Mehta, Oleksandr Sotnychenko, Srinath Sridhar, Dan Casas, and Christian Theobalt won the best poster award!
NEW!: The paper “YOLSE: Egocentric Fingertip Detection from Single RGB Images” by Wenbin Wu, Chenyang Li, Zhuo Cheng, Xin Zhang, and Lianwen Jin won the best paper award!
NEW!: Proceedings available here.
NEW!: Check the challenge leaderboard in real time here! More than 10 teams are working hard to win!
Topics of interest
- Hand detection
- Hand pose and gesture recognition
- 3D articulated hand tracking
- Hand modelling and rendering
- Grasping and object manipulation
- Hand activity recognition
- Gesture interfaces
- Egocentric vision systems
- Structured prediction
- Applications of hand pose estimation in AR/VR
- Applications of hand pose estimation in robotics and haptics