albullington opened this issue 4 years ago
Eventually implemented this as follows:
RN code:

```
<INatCamera
  filterByTaxonId="47126"
  negativeFilter={false}
/>
```
The property filterByTaxonId can be any taxon ID you want (string). If null, this means no filter.
If negativeFilter is false -> only predictions that have filterByTaxonId as an ancestor are returned.
If negativeFilter is true -> only predictions that do not have filterByTaxonId as an ancestor are returned.
This affects all methods/callbacks that return predictions.
Regarding the inner implementation (@alexshepard):
The easiest place to add this check-for-ancestors-and-reset-score logic was inside the aggregateScores method:
When reaching a leaf node (no children) and saving its score into the allScores hash map, we save it as zero if it doesn't pass the filtering; otherwise we save the score as normal (this means passing the filter params - filterByTaxonId and negativeFilter - to the Taxonomy class).
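For illustration only, here is roughly what that leaf-level check could look like on the Android side (Java). Node, passesFilter, and saveLeafScore are placeholder names for this sketch, not the actual Taxonomy API in this repo:

```java
import java.util.Map;

// Minimal sketch of the "zero out leaves that fail the filter" idea described above.
// Field and method names are illustrative; the real logic lives in the Taxonomy class.
class FilterSketch {
    static class Node {
        String key;   // taxon ID
        Node parent;  // null for the root
        Node(String key, Node parent) { this.key = key; this.parent = parent; }
    }

    String filterByTaxonId;  // e.g. "47126", or null for no filter
    boolean negativeFilter;  // false = keep descendants of the taxon, true = keep everything else

    boolean passesFilter(Node leaf) {
        if (filterByTaxonId == null) return true;  // no filter -> everything passes
        boolean hasAncestor = false;
        for (Node cur = leaf; cur != null; cur = cur.parent) {
            if (filterByTaxonId.equals(cur.key)) { hasAncestor = true; break; }
        }
        return negativeFilter ? !hasAncestor : hasAncestor;
    }

    // Where aggregateScores stores a leaf's score: save zero when the leaf fails the filter.
    void saveLeafScore(Map<String, Float> allScores, Node leaf, float score) {
        allScores.put(leaf.key, passesFilter(leaf) ? score : 0.0f);
    }
}
```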
Set everything up in React Native, and I realized that there might be limitations to using the props filterByTaxonId and negativeFilter to pass information to the native camera. The no-filter, plant-filter, and non-plant-filter options appear to work as expected on the initial camera render, but when I toggle those props in React Native, the new values of filterByTaxonId and negativeFilter don't seem to trickle down to native code. So a user would have to close and reopen the camera for those values to take effect, instead of being able to toggle and render the filter onscreen.

I took a look at the react-native-camera code to see if they have best practices for passing mutable values like zoom, flash, white balance, etc. from React Native to native code, and it looks like they're using constants that get passed in as nativeProps, such as flashMode={RNCamera.Constants.FlashMode.on}. Maybe that would do the trick @budowski?
Hi! I see that this feature to allow users to filter by plant and non-plant predictions has been fully implemented for Android. I was wondering if there was any reason that this wasn't also implemented for iOS (hardware limitations, etc.)?
cc: @ziyadedher
@afridiz1, when we built this our Android developer had time to work on the native implementation side of this and our iOS developer did not, and that's still true. We're currently working on pulling some of this logic out of the native code and into JavaScript, which will make it easier to implement features like this on both platforms.
We would like to add a feature to let Seek users filter prediction results by "plants" and "non-plants."

In Seek's React Native code, we will pass a parameter to react-native-inat-camera to specify which filter is selected by the user. There are three options available to users: no filter, a plant filter, and a non-plant filter.

When the plant or non-plant filter is selected, we should run the model on the full tree, trim the output to only items that have an ancestor of plants or non-plants, and set everything else to 0. This will rebalance the scores so the predictions will still pass the 0.7 threshold.

react-native-inat-camera should then report the best branch back to the React Native code. If a user has the plant or non-plant filter selected, they should receive filtered results in both the handleTaxaDetected callback and the takePictureAsync callback.

For reference, here is the full spec with meeting notes and design screens.