This project was made for a 24-hour hackathon at Pravega 2020, the annual technical fest of the Indian Institute of Science, Bangalore, where it won the 2nd prize.
We built an Android app that captures images from the camera and uploads them to a Flask backend, where a CNN classifier based on VGG16 recognizes the hand gesture in the image. It currently supports two formats: American Sign Language and Indian Sign Language. The classified label is returned to the Android app, which then converts it to speech.