Inspired by TraceTogether and written in response to COVID-19, OpenTrace is an application that records user data and detects the presence of other BLE handsets and devices.
@pzmudzinski @gooooer I had a look at feature/sdk_thin and I think there are issues with that approach, both general and specific, and that there is a better way to get what we want. What I did was build a fat version of the AAR that moves all of the functionality we want to run in the background into the AAR itself. The advantage is that all of that code is now shared between the native and React Native apps. I've got code in here to fire the intent needed to launch a React Native Headless JS service, but I'll need support from @pzmudzinski to integrate it. I'm able to run your current code, but it isn't clear to me how it is currently integrated with the AAR it's using.
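For reference, the usual way the React Native side receives that intent is a `HeadlessJsTaskService` subclass along these lines. This is only a sketch of the standard Headless JS pattern, not code from this repo; the service class name, task key (`OpenTraceEventTask`), timeout, and payload are all placeholders:

```kotlin
import android.content.Intent
import android.os.Bundle
import com.facebook.react.HeadlessJsTaskService
import com.facebook.react.bridge.Arguments
import com.facebook.react.jstasks.HeadlessJsTaskConfig

// Hypothetical Headless JS service: the AAR fires an Intent at this class, and
// React Native runs the registered JS task while the app is in the background.
class OpenTraceEventService : HeadlessJsTaskService() {
    override fun getTaskConfig(intent: Intent): HeadlessJsTaskConfig? {
        val extras: Bundle = intent.extras ?: return null
        return HeadlessJsTaskConfig(
            "OpenTraceEventTask",         // must match AppRegistry.registerHeadlessTask in JS
            Arguments.fromBundle(extras), // pass the AAR's event data through to JS
            5000,                         // timeout in ms for the JS task
            true                          // also allow running while the app is foregrounded
        )
    }
}
```

On the JS side the task key would be registered with `AppRegistry.registerHeadlessTask`, which is the integration piece I'd need help wiring up.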
If we don't do this I worry that important features will be missed in the 'thin' library. For example, the 'thin' library doesn't have the changes I just implemented to tune the signal values across iOS and Android so that alerting happens at roughly the same distance on both platforms. Another example: the current 'thin' implementation will end up firing multiple system-wide intents per second for each nearby handset running the app. I think that could crash the entire handset, not just the RN app, in an environment where 100 or so handsets can be scanned. That could easily be the case in a work environment even with social distancing, because iOS devices have a considerable broadcast range which we cannot control.
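To make the intent-rate concern concrete, here is the kind of batching I have in mind: accumulate scan results and emit at most one broadcast per flush window instead of one per sighting. This is purely illustrative; the names (`DeviceSighting`, the action string, the interval) are not from either branch:

```kotlin
import android.content.Context
import android.content.Intent

// Illustrative only: collect BLE sightings and broadcast at most one intent
// per flush interval, rather than one system-wide intent per device per callback.
data class DeviceSighting(val id: String, val rssi: Int, val timestampMs: Long)

class SightingBatcher(private val context: Context, private val flushIntervalMs: Long = 1000L) {
    private val pending = mutableListOf<DeviceSighting>()
    private var lastFlushMs = 0L

    @Synchronized
    fun record(sighting: DeviceSighting) {
        pending.add(sighting)
        val now = System.currentTimeMillis()
        if (now - lastFlushMs >= flushIntervalMs) flush(now)
    }

    private fun flush(now: Long) {
        if (pending.isEmpty()) return
        val intent = Intent("org.opentrace.SIGHTINGS_BATCH") // placeholder action name
            .putExtra("count", pending.size)
        context.sendBroadcast(intent)
        pending.clear()
        lastFlushMs = now
    }
}
```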
Let me know what you think, and whether you'd like help debugging what I have in a version of the library that uses Headless JS. I think what's here should work if you call BLETrace.init(context, true) and the class has the name of the current class.
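For context, the host-app call I have in mind looks roughly like the following. The init signature comes from the comment above, but the meaning of the boolean flag and the service wiring are assumptions, not verified against the current AAR:

```kotlin
import android.app.Application

class MainApplication : Application() {
    override fun onCreate() {
        super.onCreate()
        // BLETrace comes from the fat AAR discussed above.
        // Sketch: the second argument is assumed here to mean "run in React Native /
        // Headless JS mode", so the AAR fires its intents at the Headless JS service.
        BLETrace.init(this, true)
        // BLETrace.serviceClass = OpenTraceEventService::class.java  // hypothetical wiring
    }
}
```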