An Android app for visually impaired people. The app guides the user while walking, giving voice or vibration feedback.
According to the World Health Organization, an estimated 285 million people worldwide are visually impaired, of whom 39 million are blind; people aged 50 and older account for 82% of all blind people. This app aims to help blind people walk safely, both indoors and outdoors. It recognizes common household items and humans: the smartphone camera detects objects, and the app announces which part of the screen (a quadrant, a 3x3 grid, or a 2x4 grid) each object occupies, so the user can take appropriate action.
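The mapping from a detection to a screen region can be sketched as follows. This is a minimal illustration, not the app's actual code: it assumes a detector that returns a bounding box in normalized `[0, 1]` coordinates, and the function name is hypothetical.

```python
def grid_cell(box, rows=3, cols=3):
    """Map a bounding box (normalized xmin, ymin, xmax, ymax) to a
    (row, col) grid cell, chosen from the box center.

    rows/cols select the layout the app announces: 2x2 (quadrant),
    3x3, or 2x4. Illustrative sketch; names are hypothetical.
    """
    xmin, ymin, xmax, ymax = box
    cx = (xmin + xmax) / 2
    cy = (ymin + ymax) / 2
    # Clamp so a center exactly at 1.0 still lands in the last cell.
    col = min(int(cx * cols), cols - 1)
    row = min(int(cy * rows), rows - 1)
    return row, col

# A box near the top-left of the frame falls in cell (0, 0):
print(grid_cell((0.0, 0.0, 0.2, 0.2)))  # (0, 0)
```

The resulting (row, col) pair can then be turned into a spoken phrase such as "chair: top left" or a vibration pattern.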
The app uses TensorFlow Lite as its backend. The machine learning model is MobileNet SSD, trained on the well-known COCO dataset. You do not have to download the pretrained models yourself; the Gradle script handles it for you.
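For orientation, the relevant build.gradle pieces typically look like this. This is a hypothetical sketch, not the project's actual script: the artifact version and settings shown are illustrative.

```groovy
dependencies {
    // TensorFlow Lite runtime for Android (version is illustrative)
    implementation 'org.tensorflow:tensorflow-lite:2.9.0'
}

android {
    aaptOptions {
        // Keep .tflite model files uncompressed in the APK so the
        // interpreter can memory-map them at runtime.
        noCompress "tflite"
    }
}
```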
We will be adding a simple Android app with a TensorFlow Lite backend, which will serve as the backbone. Task list:
Ready-to-use APK: https://github.com/aamba/Aamba/tree/master/Data/ReadyToUse-APK
Model: MobileNet SSD
Coming soon. :) You can help with it.
All contributors are welcome. Create GitHub issues to suggest new features.