orcasound / orcamap-react

ReactJS version of Orcamap (current version of orcamap project)
MIT License

try to work out the lighting of hydrophones when orcas are heard #43

Open Devesh21700Kumar opened 3 years ago

Devesh21700Kumar commented 3 years ago

Problem description:

When orcas are heard, their coordinates are recorded in the Google Sheet and then rendered on the Google Sheets vector layer. The hydrophone near those coordinates should then light up.

@ivanoats I would like to know more about this problem statement and then work out a solution, which would include a lit-up hydrophone SVG/icon/photo whenever the coordinates of a detection are mapped.

DhananjayPurohit commented 3 years ago

Based on my understanding, I think the idea of lighting up hydrophones is more closely associated with orcas being detected by a hydrophone (either by a user listening to the hydrophone or by the ML tool). I think implementing a webhook to tell the map to light up a given hydrophone would work well in this case. A second thought: adding a verified entry to the spreadsheet could trigger the corresponding hydrophone on the map to light up. It still needs discussion @scottveirs @ivanoats.

Devesh21700Kumar commented 3 years ago

> Based on my understanding, I think the idea of lighting up hydrophones is more closely associated with orcas being detected by a hydrophone (either by a user listening to the hydrophone or by the ML tool). I think implementing a webhook to tell the map to light up a given hydrophone would work well in this case. A second thought: adding a verified entry to the spreadsheet could trigger the corresponding hydrophone on the map to light up.

Yes, a verified entry in the spreadsheet is the approach I was also thinking of.

ivanoats commented 3 years ago

Does a webhook depend on us moving orcamap to NextJS so that we can add /api routes to our orcamap app?

Devesh21700Kumar commented 3 years ago

> Does a webhook depend on us moving orcamap to NextJS so that we can add /api routes to our orcamap app?

I might be wrong here, but I think it would be compatible with both create-react-app and Next.js, and is independent of the migration.

DhananjayPurohit commented 3 years ago

> Does a webhook depend on us moving orcamap to NextJS so that we can add /api routes to our orcamap app?

I have never worked with NextJS before, but I think using webhooks after the migration to NextJS would be better, since we could easily configure an API route as a webhook that accepts HTTP requests on events such as changes in the spreadsheet, updates to SSEMMI validated/moderated data, or a button click from @scottveirs announcing an activated hydrophone (to support the lighting of hydrophones). This would resolve the problem of polling for new data every 10 seconds and could save many API calls. I would love to tag @aalaydrus (SSEMMI developer) in this discussion to learn more about how we can trigger an event from the SSEMMI server side to inform our NextJS app about updates to validated/moderated data.
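As a rough sketch of what such a webhook-style API route could look like — the file path, payload shape, and 10-minute "freshness" window below are all illustrative assumptions, not the actual orcamap design:

```typescript
// Hypothetical Next.js API route, e.g. pages/api/hydrophone-activity.ts.
// A webhook sender (spreadsheet change, SSEMMI update, moderator button)
// POSTs a detection event here; the map can then light the hydrophone.

type DetectionEvent = {
  nodeId: string;     // hydrophone identifier (illustrative name)
  detectedAt: string; // ISO timestamp of the detection
};

// Pure helper: is the event recent enough that the icon should stay lit?
export function isActive(
  event: DetectionEvent,
  now: Date,
  windowMs: number = 10 * 60 * 1000 // assumed 10-minute window
): boolean {
  return now.getTime() - new Date(event.detectedAt).getTime() <= windowMs;
}

// Minimal handler using Next.js's (req, res) API-route signature.
export default function handler(req: any, res: any) {
  if (req.method !== "POST") {
    res.status(405).end(); // webhook senders should POST
    return;
  }
  const event = req.body as DetectionEvent;
  // A real implementation would persist this (db / in-memory store /
  // pub-sub) so the map client can query which hydrophones to light up.
  res.status(200).json({ nodeId: event.nodeId, active: isActive(event, new Date()) });
}
```

Separating the pure `isActive` check from the handler keeps the "should this hydrophone be lit?" rule easy to test without spinning up a server.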

Devesh21700Kumar commented 3 years ago

Yes @DhananjayPurohit @ivanoats, that's also true. As far as the API features of Next.js go: any file inside the pages/api folder is mapped to /api/* and is treated as an API endpoint instead of a page. These routes are server-side-only bundles and won't increase our client-side bundle size.

With Next.js, we can also leverage its built-in data-fetching APIs to format our data and pre-render the site. So even though the webhook is independent of the framework used, it will definitely be smoother with Next.js.
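On the client side, once an /api route reports which hydrophones are currently active, the map just needs to flag the matching features. A tiny pure helper along these lines could do it (the `Hydrophone` shape and `lit` flag are illustrative, not the actual orcamap data model):

```typescript
// Hypothetical client-side helper: merge the set of active node ids
// (as reported by an /api route) into the hydrophone features so the
// map layer can style lit vs. unlit icons.

type Hydrophone = { id: string; name: string; lit?: boolean };

function markActive(hydrophones: Hydrophone[], activeIds: Set<string>): Hydrophone[] {
  return hydrophones.map((h) => ({ ...h, lit: activeIds.has(h.id) }));
}
```

Keeping this as a pure function means the same logic works whether the active set arrives via polling, a fetch on page load, or a push channel.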

scottveirs commented 3 years ago

At this stage, I can mainly offer the following current data sources:

Orcamap hydrophone locations could indicate "realtime activity" based on any or all of:

  1. When any human pushes the "I hear something" button within the Orcasound live-listening web app. Those go into a Postgres db on a Heroku instance. One could ask Skander and Paul on Slack about how to access them. I'm not sure if there is API access...

  2. When the real-time inference system built by Microsoft volunteer hackers generates any prediction above some threshold, and/or when an expert/moderator tags the prediction candidate as a true positive. Those candidates and their labels currently live in a Cosmos DB in an AI for Orcas Azure account. On Slack, Prakruti or Akash would be the best contacts for learning about API etc. access.

  3. When a sighting is reported near any of the hydrophones. The existing route for such visual location data is the Google sheet associated with Orcamap (via Google API) but the SSEMMI decentralized db API is also ready for testing, I think...
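For the sighting route (item 3 above), one simple rule would be to light any hydrophone within some radius of a reported sighting. A minimal sketch — the 10 km radius and data shapes are assumptions for illustration, not values from the project:

```typescript
// Find hydrophones near a reported sighting using great-circle distance.

type Point = { lat: number; lon: number };
type Hydrophone = { id: string } & Point;

const EARTH_RADIUS_KM = 6371;

// Haversine formula: distance in km between two lat/lon points.
function distanceKm(a: Point, b: Point): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(h));
}

// Ids of hydrophones within radiusKm of the sighting (assumed 10 km default).
function hydrophonesNear(sighting: Point, hydrophones: Hydrophone[], radiusKm = 10): string[] {
  return hydrophones.filter((h) => distanceKm(sighting, h) <= radiusKm).map((h) => h.id);
}
```

The same helper would work whether the sighting rows come from the Google Sheet via the Google API or from the SSEMMI db, since only parsed coordinates reach it.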

scottveirs commented 10 months ago

Assuming this feature is honed a bit to ONLY "light up" Orcasound hydrophone locations when killer whales are being heard (as opposed to other soniferous species, like a humpback whale), then an emerging, simpler method might be to leverage the Acartia API and only pull acoustic detections that have been associated with the SRKW ecotype (and maybe later also the Bigg's ecotype). It could be done with the existing Comment field, but should likely await a revision of the Acartia data schema.
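Until the schema revision lands, the interim Comment-field approach could be as simple as a text filter over pulled detections. A sketch, where the record shape and field names are assumptions and not Acartia's actual schema:

```typescript
// Keep only detections whose free-text comment mentions the SRKW ecotype.
// Record shape is hypothetical, for illustration only.

type Detection = { nodeId: string; comment: string };

function srkwDetections(detections: Detection[]): Detection[] {
  // \b word boundaries avoid matching SRKW inside longer tokens;
  // case-insensitive since comments are free text.
  return detections.filter((d) => /\bSRKW\b/i.test(d.comment));
}
```

A proper ecotype field in the Acartia schema would of course be far more reliable than matching free text, which is presumably why waiting for the revision is preferred.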