Gesture Detection and Recognition in TV News Videos

Communication through body language is an ancient art form that continues to evolve in fascinating ways. Automatic detection of human body language has become an active area of research because of its applications in vision-based articulated body pose estimation systems, such as markerless motion capture for human-computer interfaces, robot control, visual surveillance, and human image synthesis. A specific part of this field, gesture recognition, has gained considerable attention in recent years. Current focuses include emotion recognition from the face and hand gesture recognition. Users can control or interact with devices using simple gestures, without physically touching them. Gesture recognition has also found applications in defence, home automation, and automated sign language translation.

This is my Google Summer of Code 2019 Project with the Distributed Little Red Hen Lab.

The aim of this project is to develop an automated system for hand gesture detection and recognition in TV news videos. Given a news video of a certain duration, the system should not only detect the hand gestures occurring in the video but also assign each one a label from a set of hand-gesture classes. Finally, the system should be packaged into a Singularity container for deployment on high-performance computing (HPC) clusters.
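To make the intended pipeline concrete, the sketch below shows one way such a system could be organized: frames are sampled from the input video, hands are detected in each sampled frame, and each detected hand is assigned a gesture label. This is only an illustrative outline, not the project's actual implementation; detect_hands, classify_gesture, and the class list are hypothetical placeholders, and OpenCV is assumed for video decoding.

# Minimal sketch of a gesture detection/recognition pipeline.
# detect_hands() and classify_gesture() are hypothetical placeholders
# standing in for the real detection and recognition models.
import cv2

GESTURE_CLASSES = ["point", "wave", "open_palm", "other"]  # illustrative labels only

def detect_hands(frame):
    """Hypothetical detector: return a list of hand bounding boxes (x, y, w, h)."""
    return []  # a trained hand detector would run here

def classify_gesture(frame, box):
    """Hypothetical classifier: return one label from GESTURE_CLASSES for a hand crop."""
    x, y, w, h = box
    crop = frame[y:y + h, x:x + w]
    return GESTURE_CLASSES[-1]  # a trained classifier would score the crop here

def process_video(path, frame_step=5):
    """Detect and label hand gestures on every frame_step-th frame of a news video."""
    cap = cv2.VideoCapture(path)
    results = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % frame_step == 0:
            for box in detect_hands(frame):
                results.append((frame_idx, box, classify_gesture(frame, box)))
        frame_idx += 1
    cap.release()
    return results  # list of (frame index, bounding box, gesture label)

The output of such a pipeline, a list of time-stamped, labeled hand gestures, is what the containerized system would produce for each input news video.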

Tools and Libraries

Project Update

Progress on the project is documented in the following blog posts: