equinor / eit-web-ar


Epic 5 - Can web AR help humans identify robots and their actions? #10

Closed jonaspetersorensen closed 4 years ago

jonaspetersorensen commented 4 years ago

A human walks into a room and spots a couple of robots doing something. There is no way to interact with the robots directly, and getting too close might be dangerous.

Can the human use their mobile phone to identify the robots and their actions via AR?
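To make the question concrete, here is a minimal sketch of what "identifying a robot" could look like in web AR. It assumes the A-Frame + AR.js marker-tracking stack (not confirmed in this issue, but in line with the web AR work from Epic-1); the marker preset and robot name are placeholders:

```ts
// Minimal sketch: overlay a status label on a tracked marker.
// Assumes A-Frame and AR.js are loaded via <script> tags and the page
// contains an <a-scene embedded arjs> element. The marker preset and
// robot name below are placeholders, not actual project values.
const scene = document.querySelector('a-scene');

function addRobotLabel(markerPreset: string, robotName: string): void {
  // One <a-marker> per physical marker attached to a robot.
  const marker = document.createElement('a-marker');
  marker.setAttribute('preset', markerPreset);

  // Text hovering over the marker, identifying the robot and its action.
  const label = document.createElement('a-text');
  label.setAttribute('value', `${robotName}: idle`);
  label.setAttribute('align', 'center');
  label.setAttribute('rotation', '-90 0 0'); // lie flat on the marker
  label.setAttribute('scale', '2 2 2');

  marker.appendChild(label);
  scene?.appendChild(marker);
}

addRobotLabel('hiro', 'Robot A');
```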

Prerequisites

Team "Case 9 AR" is ready

The team must have gained sufficient mastery of the technology by completing Epic-1 and (possibly) Epic-2, meaning we will build on the experience from the previous epics to tackle this one.
This level of readiness can always be discussed as we go, as it is a weird mix of technical experience, personal confidence and available time.

Team "Case 10 IoT" is ready

This epic is intended as a cooperation with Virtual summer internship 2020/Case 10 Ref platform IoT devices, where the Case 10 team will provide the robot API that our AR client consumes.

MVP

Stretch goals if MVP works

  1. Can we enhance the scene with points of interest?
    If the robot has a planned route then draw the route in AR, or if the robot has an objective then illustrate the objective in AR (see the first sketch after this list).

  2. Can we enhance the AR experience with audio? (Note: the HTML5 audio API can be a real ¤#"!¤"% to work with at times, tears of bravery are ok; see the second sketch after this list)

    • Let each point of interest have its own sound snippet
    • Provide a simple interaction for how and when the sound should play. Example: use gaze controls to "face-click" the POI.
      The audio snippet "KILLALLHUMANS!" should only be available as an easter egg.
       
  3. Can the user interact with the robots? Example: the user can stop and start a specific robot. AR should clearly show that the robot has been stopped (see the third sketch after this list).
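First sketch, for stretch goal 1: drawing a planned route as line segments anchored to the robot's marker. Same assumed A-Frame stack as above; the `Waypoint` type and coordinates are hypothetical, since the route data would have come from the Case 10 API.

```ts
// Sketch: draw a planned route as line segments under the robot's marker.
// Waypoint coordinates are hypothetical; real data would have come from
// the Case 10 robot API.
type Waypoint = { x: number; y: number; z: number };

function drawRoute(marker: Element, waypoints: Waypoint[]): void {
  for (let i = 0; i < waypoints.length - 1; i++) {
    const a = waypoints[i];
    const b = waypoints[i + 1];
    const segment = document.createElement('a-entity');
    // A-Frame's built-in line component draws one segment per waypoint pair.
    segment.setAttribute(
      'line',
      `start: ${a.x} ${a.y} ${a.z}; end: ${b.x} ${b.y} ${b.z}; color: #00ff00`
    );
    marker.appendChild(segment);
  }
}
```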
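Second sketch, for stretch goal 2: a gaze cursor that turns "face-clicking" into click events, which trigger a sound on the POI. The audio URL is a placeholder, and note that browsers may refuse to play audio before the first real user gesture, which is part of why the HTML5 audio API earns its reputation above.

```ts
// Sketch: gaze-triggered audio on a point of interest.
// A-Frame's fuse cursor turns "stare at it long enough" into a click
// event, and the sound component plays its clip on that event.
const camera = document.querySelector('a-camera');

const cursor = document.createElement('a-cursor');
cursor.setAttribute('fuse', 'true');          // dwell-to-click ("face-click")
cursor.setAttribute('fuse-timeout', '1500');  // ms of gaze before click fires
camera?.appendChild(cursor);

function addPoiSound(poi: Element, src: string): void {
  // src is a placeholder URL; 'on: click' plays the clip when the
  // fuse cursor clicks this POI.
  poi.setAttribute('sound', `src: url(${src}); on: click`);
}
```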
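Third sketch, for stretch goal 3: stopping or starting a robot and making the new state unmissable in AR. The REST endpoint is entirely hypothetical (Case 10 never delivered an API, see the comment below), as is the `#label-<id>` naming convention.

```ts
// Sketch: toggle a robot and mirror its state in the AR label.
// The endpoint shape and the label id convention are guesses, since
// no real robot API was ever available to this epic.
async function setRobotState(robotId: string, running: boolean): Promise<void> {
  await fetch(`/api/robots/${robotId}/state`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ running }),
  });

  // Make the stop clearly visible in AR, per the stretch goal.
  const label = document.querySelector(`#label-${robotId}`);
  label?.setAttribute('value', running ? 'running' : 'STOPPED');
  label?.setAttribute('color', running ? 'green' : 'red');
}
```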

jonaspetersorensen commented 4 years ago

Unfortunately, team "Case 10" will not be ready in time for us to use their API, so we will move on to a new epic where we look at "multiplayer" scenarios.