The Aamba Project

An Android App for visually challenged people. The app guides people to walk with voice feedback or vibration feedback.
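The voice and vibration feedback can be built on Android's stock TextToSpeech and Vibrator services. The snippet below is only a minimal sketch of that idea, not this repo's actual code; FeedbackHelper and its methods are illustrative names.

```java
import android.content.Context;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;

import java.util.Locale;

// Illustrative helper: voice and vibration feedback via standard Android APIs.
public class FeedbackHelper {
    private TextToSpeech tts;
    private Vibrator vibrator;

    public FeedbackHelper(Context context) {
        // Initialize speech output; set the language once the engine is ready.
        tts = new TextToSpeech(context, status -> {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);
            }
        });
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    /** Speaks a short message, e.g. "chair, bottom left". */
    public void speak(String message) {
        tts.speak(message, TextToSpeech.QUEUE_FLUSH, null, "aamba-msg");
    }

    /** Buzzes for the given duration as a non-verbal cue. */
    public void vibrate(long millis) {
        if (vibrator != null) {
            vibrator.vibrate(millis);
        }
    }
}
```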

Motivation

According to the World Health Organization, an estimated 285 million people of all ages are visually impaired worldwide, of whom 39 million are blind; people aged 50 and older account for 82% of all blind people. So I came up with an app idea to help blind people walk safely indoors as well as outdoors. The app can recognize household items and humans. The idea is to detect such objects with your smartphone camera and tell you in which part of the frame (quadrants, a 3×3 grid, or a 2×4 grid) each object is. The person can then take appropriate action.
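To make that last idea concrete, here is a minimal sketch (not this repo's actual code) of mapping a detected object's bounding box, given in normalized coordinates, to a grid cell and producing a phrase for voice feedback. GridMapper, cellFor, and describeCell are illustrative names.

```java
// Illustrative sketch: map a detection's bounding box, given in
// normalized [0,1] coordinates, to a cell of a rows x cols grid.
public final class GridMapper {

    /** Returns {row, col} of the grid cell containing the box center. */
    public static int[] cellFor(float left, float top, float right, float bottom,
                                int rows, int cols) {
        float cx = (left + right) / 2f;  // horizontal center of the box
        float cy = (top + bottom) / 2f;  // vertical center of the box
        int col = Math.min((int) (cx * cols), cols - 1);
        int row = Math.min((int) (cy * rows), rows - 1);
        return new int[] { row, col };
    }

    /** Turns a cell of a 3x3 grid into a phrase suitable for voice feedback. */
    public static String describeCell(int row, int col) {
        String[] vertical = { "top", "middle", "bottom" };
        String[] horizontal = { "left", "center", "right" };
        return vertical[row] + " " + horizontal[col];
    }
}
```

With a 3×3 grid, an object centered in the upper-left of the frame yields "top left", which can then be spoken or mapped to a vibration pattern; the 2×4 layout is just a different rows/cols choice passed to cellFor.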

Technology

The app uses TensorFlow Lite as the backend. The machine learning model is MobileNet SSD, trained on the well-known COCO dataset. You do not have to download these pretrained models yourself; the Gradle script handles it for you.
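As a rough sketch of what that backend looks like in code, assuming the model is bundled as detect.tflite in the app's assets (the file name and class names here are illustrative, not confirmed from this repo):

```java
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;

import org.tensorflow.lite.Interpreter;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Detector {
    private final Interpreter tflite;

    public Detector(AssetManager assets) throws IOException {
        // The Interpreter wraps the .tflite flatbuffer and runs inference on-device.
        tflite = new Interpreter(loadModelFile(assets, "detect.tflite"));
    }

    // Standard pattern for memory-mapping a model bundled in app assets.
    private static MappedByteBuffer loadModelFile(AssetManager assets, String name)
            throws IOException {
        AssetFileDescriptor fd = assets.openFd(name);
        try (FileInputStream in = new FileInputStream(fd.getFileDescriptor())) {
            FileChannel channel = in.getChannel();
            return channel.map(FileChannel.MapMode.READ_ONLY,
                    fd.getStartOffset(), fd.getDeclaredLength());
        }
    }
}
```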


Plan

We will be adding a simple Android app with a TensorFlow Lite backend; this will serve as the backbone. Task list:

Download the app for your Android device

https://github.com/aamba/Aamba/tree/master/Data/ReadyToUse-APK

How to build it yourself
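A standard Android Studio workflow should apply (not verified against this repo): clone the repository, open it in Android Studio, and let the Gradle sync finish; the sync step is also when the pretrained model is downloaded. Then build and install on a connected device with Run ▶ or `./gradlew installDebug`.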

Model used

MobileNet SSD

http://storage.googleapis.com/download.tensorflow.org/models/tflite/coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.zip
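This archive contains the quantized detect.tflite model along with a COCO label map. The model takes a 300×300 RGB image as uint8 input and emits four output tensors per frame: box coordinates, class indices, scores, and the number of detections. Below is a hedged sketch of reading those outputs with the TensorFlow Lite Interpreter; the class name SsdRunner and the 0.5 score threshold are arbitrary choices for illustration.

```java
import org.tensorflow.lite.Interpreter;

import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;

public class SsdRunner {
    private static final int NUM_DETECTIONS = 10; // this model emits at most 10 boxes

    /** Runs one frame through the model and prints detections above a threshold.
     *  imgData must hold 1*300*300*3 uint8 pixel values (RGB). */
    public static void detect(Interpreter tflite, ByteBuffer imgData) {
        float[][][] locations = new float[1][NUM_DETECTIONS][4]; // [ymin,xmin,ymax,xmax], normalized
        float[][] classes = new float[1][NUM_DETECTIONS];        // COCO label indices
        float[][] scores = new float[1][NUM_DETECTIONS];         // confidence per box
        float[] numDetections = new float[1];

        Map<Integer, Object> outputs = new HashMap<>();
        outputs.put(0, locations);
        outputs.put(1, classes);
        outputs.put(2, scores);
        outputs.put(3, numDetections);

        tflite.runForMultipleInputsOutputs(new Object[] { imgData }, outputs);

        for (int i = 0; i < NUM_DETECTIONS; i++) {
            if (scores[0][i] > 0.5f) {
                System.out.printf("class %d at [%f, %f, %f, %f]%n",
                        (int) classes[0][i],
                        locations[0][i][0], locations[0][i][1],
                        locations[0][i][2], locations[0][i][3]);
            }
        }
    }
}
```

The normalized box coordinates returned here are exactly what a grid mapper like the sketch in the Motivation section would consume.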

Documentation

Coming soon. :) You can help with it.

Join the Aamba Community

Click here to join Gitter

Contributors

All contributors are welcome. Create GitHub issues and suggest new features.