https://m.facebook.com/groups/17450909269?view=permalink&id=10155709404754270

After studying for 3 long days, I was finally able to understand and deploy a CNN that can recognize 10 signs or hand gestures. The model is very simple: it consists of 2 hidden layers and looks very much like the model used on the TensorFlow website for the MNIST dataset. I hope you will like my work. All suggestions and criticisms are very welcome. Here is the source code: https://github.com/EvilPort2/Sign-Language.
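For readers curious what such a model looks like, here is a minimal sketch of a comparable two-convolutional-layer CNN in tf.keras. This is not the exact model from the repo: the 50x50 grayscale input size, layer widths, and optimizer are assumptions for illustration; only the 10-class output and the MNIST-tutorial-style architecture come from the post.

```python
# Sketch of a small CNN in the style of the TensorFlow MNIST tutorial.
# Input shape and layer sizes are assumed, not taken from the linked repo.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10           # 10 signs / hand gestures
INPUT_SHAPE = (50, 50, 1)  # assumed grayscale input size

model = models.Sequential([
    layers.Input(shape=INPUT_SHAPE),
    # Two convolution + pooling stages, as in the classic MNIST tutorial model
    layers.Conv2D(32, (5, 5), activation='relu', padding='same'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (5, 5), activation='relu', padding='same'),
    layers.MaxPooling2D((2, 2)),
    # Fully connected layer with dropout, then a softmax over the 10 gestures
    layers.Flatten(),
    layers.Dense(1024, activation='relu'),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```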

I am facing a problem which I have mentioned in the "Recognizing gesture" section of the README. Please help me if you can; I really need it.
Posted by uniqueone