emexlabs / WearableIntelligenceSystem

Wearable computing software framework for intelligence augmentation research and applications. Easily build smart glasses apps, relying on built-in voice command, speech recognition, computer vision, UI, sensors, smart phone connection, NLP, facial recognition, database, cloud connection, and more. This repo is in beta.
MIT License

Mapping - Map overlay, voice command directions #5

Open CaydenPierce opened 2 years ago

CaydenPierce commented 2 years ago

Pulling out one's phone all the time while navigating a new place is difficult, dangerous, and forces one to stop mid-journey. This is made worse if the individual is operating a vehicle (bicycle, car), where they must disengage altogether.

The current idea:

  1. User issues a voice command asking for directions to their desired location: "directions to x"
  2. The system pulls up a map centered on the user's current position, with the route highlighted. The system also outputs speech to give the user turn-by-turn directions.
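Step 1 above amounts to matching a transcript against a trigger phrase and pulling out the destination. A minimal sketch in Java, assuming a "directions to " trigger phrase; the class and method names here are illustrative, not the actual WearableIntelligenceSystem voice command API:

```java
// Sketch: extract the destination from a "directions to x" voice command.
// Trigger phrase and names are assumptions for illustration.
class DirectionsCommandParser {
    private static final String TRIGGER = "directions to ";

    // Returns the destination text, or null if the transcript doesn't match.
    static String extractDestination(String transcript) {
        String lower = transcript.toLowerCase().trim();
        int idx = lower.indexOf(TRIGGER);
        if (idx == -1) return null;
        String dest = transcript.trim().substring(idx + TRIGGER.length()).trim();
        return dest.isEmpty() ? null : dest;
    }
}
```

The extracted destination string would then be handed to geocoding/search (step 2) to resolve coordinates.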

Implementation

OsmAnd Mapping app for Android combined with OsmAnd-Api

OsmAnd~ app - https://f-droid.org/en/packages/net.osmand.plus/ (this is like the Google Play OsmAnd+ version, but ~ for devs)

Osm-and-api: https://github.com/osmandapp/osmand-api-demo/ ^this demo app runs OsmAnd IN ITSELF - it doesn't launch the OsmAnd activity with some arguments (like running a program that takes over); instead, the OsmAnd map shows up directly in the parent app/activity

Use OsmAndApi to call OsmAnd and display OsmAnd maps in the app. Use the "Navigation" -> "Navigate and Search" function.

See the folder osmand-api-demo in the osmand-api-demo repo and read the README.md in that folder for instructions on how to use the osmand-api.
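For the "navigate" call specifically, the demo's helper builds an `osmand.api://` intent URI and fires it at OsmAnd. A sketch of the URI construction, assuming the parameter names used in the osmand-api-demo's OsmAndHelper (verify them against that folder's README.md):

```java
import java.util.Locale;

// Sketch: build an OsmAnd API intent URI to start navigation.
// The "osmand.api://navigate" scheme and parameter names are assumed from
// the osmand-api-demo helper; check the demo README before relying on them.
class OsmAndNavigateUri {
    static String build(double destLat, double destLon, String destName, String profile) {
        return String.format(Locale.US,
                "osmand.api://navigate?dest_lat=%f&dest_lon=%f&dest_name=%s&profile=%s&force=true",
                destLat, destLon, destName.replace(" ", "%20"), profile);
    }
}
```

On Android this string would be wrapped as `new Intent(Intent.ACTION_VIEW, Uri.parse(uri))` and passed to `startActivity()`.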

CaydenPierce commented 2 years ago

Test 1:

CaydenPierce commented 2 years ago

A new piece of intel that makes this more important than before - the Vuzix Store has no map apps. Looking around, it seems the only actual consumer-facing implementation of maps on everyday smart glasses is Google Maps on Google Glass, which of course is closed source and not portable.

Let's try to get this going soon, especially considering OsmAnd will do most of the work.

Voice command and a tiny touchpad are probably bad interfaces for plotting a route. I have tried exploring a way for this to make sense, and it really doesn't for the immediate term (given the limitations of wearable output devices).

So a good approach here will be to build a mapping UI on the ASP with OsmAndApi; then, once the user has chosen their route, send it over WebSocket to the ASG, which can then use OsmAndApi to open and start that route locally.
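The phone-to-glasses hop then only needs to carry the chosen destination/route. A minimal sketch of such a message, assuming a hypothetical `start_navigation` message type and field names; the actual WIS WebSocket message schema may differ:

```java
import java.util.Locale;

// Sketch: a minimal ASP -> ASG WebSocket message telling the glasses to
// start a route locally via OsmAndApi. Message type and field names are
// assumptions for illustration, not the real WIS schema.
class RouteMessage {
    static String toJson(double destLat, double destLon, String destName) {
        return String.format(Locale.US,
                "{\"type\":\"start_navigation\",\"dest_lat\":%f,\"dest_lon\":%f,\"dest_name\":\"%s\"}",
                destLat, destLon, destName);
    }
}
```

On receipt, the ASG would parse this message and hand the coordinates to its local OsmAnd instance via the same API call used on the phone.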