Demo link: https://youtu.be/vNRBuvph-RM
Touchless hand gesture recognition systems are becoming important in mobile user interfaces because they improve user experience and comfort. Various computer vision algorithms have employed color and depth cameras for hand gesture recognition, but robustly classifying gestures from different subjects under widely varying lighting conditions remains challenging. In this project, I created an Android piano application with an embedded deep learning model for hand gesture recognition. The convolutional neural network approach, combined with data augmentation, fuses information from multiple domains for the final prediction. It also performs well on a relatively small training set, making training more effective and reducing potential overfitting.
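As a rough illustration of the kind of model this describes, here is a minimal sketch in PyTorch of a small CNN with simple data augmentation. Everything here is an assumption for illustration: the project's actual framework, number of gesture classes (`NUM_GESTURES`), input resolution, and augmentation choices are not stated in the text.

```python
import torch
import torch.nn as nn

# Hypothetical settings -- the real app's class count and input size may differ.
NUM_GESTURES = 7        # e.g. one gesture per piano tone
IMG_SIZE = 64           # grayscale hand images, 64x64

class GestureCNN(nn.Module):
    """A small convolutional classifier, sized for a modest training set."""
    def __init__(self, num_classes=NUM_GESTURES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),  # dropout as one guard against overfitting
            nn.Linear(64 * (IMG_SIZE // 8) ** 2, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def augment(batch):
    """Toy augmentation: random horizontal flip plus brightness jitter."""
    if torch.rand(1).item() < 0.5:
        batch = torch.flip(batch, dims=[-1])          # mirror the hand
    scale = 0.8 + 0.4 * torch.rand(batch.size(0), 1, 1, 1)
    return (batch * scale).clamp(0.0, 1.0)            # vary lighting

model = GestureCNN()
dummy = torch.rand(4, 1, IMG_SIZE, IMG_SIZE)          # a fake mini-batch
logits = model(augment(dummy))
print(tuple(logits.shape))                            # (4, 7)
```

In practice the augmented batches would feed a standard cross-entropy training loop; augmentation like the flip and brightness jitter above is one common way to stretch a small dataset across varied lighting conditions.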
The main goal of this project is to develop an easy-to-use Android musical instrument. Instead of playing a physical piano, users can simply show various hand gestures to trigger a target tone in the application. This can serve both as entertainment and as a form of finger exercise: users can easily play their favorite songs in this novel way for family and friends.