Short Project Description: An AI-based system to transform the lifestyle of the visually impaired by providing real-time analysis of their surroundings and helping them interact with the environment around them
Look around. What do you see?
Your computer flashing Google's I/O live.
Your favorite novel that you've been cherishing.
Oh! Look at that friend of yours who is heading right towards you to snatch that tasty burrito from your hand.
RUN!
No. Stop.
Now, close your eyes. What do you see?
DARKNESS.
No computer. No novel. Nothing but darkness.
Spare a moment from your busy life to realize that there are about 40 million people in the world who spend their entire lives battling this darkness because of visual impairment.
We propose an AI-based system to transform the lifestyle of the visually impaired by providing real-time analysis of their surroundings and helping them interact with the environment around them. The solution consists of a wearable with an attached camera connected to a Raspberry Pi. The user issues voice commands through Alexa; the system then fetches frames from the Pi camera, processes them on a cloud server using trained models, and responds with speech: describing the environment, helping the user commute by avoiding obstacles, and reading books, handwritten text, and web pages aloud when instructed to do so.
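As a rough sketch of how the voice-command flow could be wired, the routing layer below maps an Alexa intent to a handler that turns a camera frame into spoken text. All intent and function names here are hypothetical illustrations, not part of any existing codebase, and the handlers are stubs standing in for the cloud model calls:

```python
# Hypothetical command-routing layer for the wearable. Each handler is a
# stub; a real version would send the camera frame to the cloud models
# and return the model's text for Alexa to speak.

def describe_scene(frame):
    # Placeholder for a cloud object-detection / scene-captioning call.
    return "A person is sitting at a table with a laptop."

def read_text(frame):
    # Placeholder for a cloud OCR call (printed or handwritten text).
    return "Reading the page in front of you."

def warn_obstacles(frame):
    # Placeholder for an obstacle-detection model used while commuting.
    return "Obstacle ahead on your left."

# Map of (assumed) Alexa intent names to their handlers.
HANDLERS = {
    "DescribeIntent": describe_scene,
    "ReadIntent": read_text,
    "NavigateIntent": warn_obstacles,
}

def handle_command(intent_name, frame):
    """Route an Alexa intent to its handler and return the speech text."""
    handler = HANDLERS.get(intent_name)
    if handler is None:
        return "Sorry, I did not understand that request."
    return handler(frame)
```

This keeps the Alexa skill thin: the skill only forwards an intent name, and each capability (scene description, reading, obstacle warnings) can be swapped out independently on the server side.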
🔦 Any other specific thing you want to highlight?
(More to add during the hackathon)
Alexa for voice interface
Face Recognition of friends/family members
Identification of readable text in the environment
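Face recognition of friends and family could, for instance, work by comparing a face embedding computed from the camera frame against stored embeddings of known people. A minimal nearest-neighbour sketch, where the embedding vectors are invented purely for illustration (a real system would use a face-embedding model such as FaceNet):

```python
import math

# Stored embeddings for known faces. In practice these would come from a
# face-embedding model; the 3-dimensional vectors below are made up.
KNOWN_FACES = {
    "Alice": [0.9, 0.1, 0.0],
    "Bob":   [0.1, 0.8, 0.2],
}

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify_face(embedding, threshold=0.5):
    """Return the closest known name, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, known in KNOWN_FACES.items():
        dist = euclidean(embedding, known)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

The threshold rejects strangers instead of force-matching them to the nearest known face, so the system can say "someone I don't recognize" rather than guessing.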
✅ Checklist
Before you post the issue:
[x] You have followed the issue title format.
[x] You have made no other issue for this submission.
[x] You have provided all the information correctly.