Closed · HADO564 closed this 2 weeks ago
I don't see any native code in your repo. Is that the repo for your frame processor plugin? If it is, then you are doing it wrong, very wrong.
There's an issue on how to start creating your own frame processor plugin (opened by me a few days ago); take a look at that and fix your approach.
On a side note, the actual idea of a frame processor plugin is that you can pass a frame to the native side, process that frame with native (Java/Kotlin/Swift) code, and return results to the JS side in almost real time.
So if you know nothing about Java/Kotlin/Android, you could find yourself stranded in the middle of nowhere with no way out.
Yeah, the plugin code is mine. Every time I compile, I get the following error:
```
> Could not resolve all task dependencies for configuration ':app:debugCompileClasspath'.
Could not find com.mrousavy:camera-frameprocessor:. Required by: project :app
Could not find io.socket:socket.io-client:. Required by: project :app
```
I also viewed the issues you created; I'm sorry to say they created more confusion than they provided clarity. I tried messing with the build.gradle files etc., but no luck. I also realized I had only pushed the React Native code to the repo, not the Kotlin/Java code for the plugins. A recent push shows the implementation according to the docs provided.
Sorry, there's a typo in that name. It's called `frameprocessors`, not `frameprocessor`.
I updated the documentation.
Despite the changes, and following the docs exactly (both automatic and manual creation of plugins), the app crashes at compile time. I'm coding in VS Code and using the command-line Android tools. The app compiles flawlessly if I don't create my own plugin. In the same repo, under the "another one" commit, are all the files being used to create a plugin.
I think the issue is arising from incomplete dependencies, as the error states. Are you sure we don't need to mention them in the build.gradle files?
Hey - it sounds like you're not very familiar with native development. It might be good to build a solid foundation in how native Android and iOS work first, or at least in how the build toolchains work.
I'd recommend developing in Android Studio and Xcode, not VS Code.
Android Studio also would've told you that you were probably trying to use `frameprocessors`, not `frameprocessor`.
It'll give you hints about errors and tell you where you're going wrong.
Fair enough - is there a workaround that avoids native development? I would very much rather be able to do this in JavaScript only. I tried using worklets, but to no avail. Just a pointer in the right direction would help, so that I don't spend hours pulling my hair out over issues beyond my skills. Essentially, I just want to send frames from the frame processor to a backend for processing via Socket.IO.
Yeah, currently there is no alternative, because this really isn't straightforward - there would need to be a native plugin that can properly upload those Frames to a server. Also, you probably don't want to upload all Frames as-is to a server; you might want to compress them (or, even better, encode them to MP4/MOV) first, and then upload in batches.
A raw Frame in 4k can be 15 MB; at 60 FPS, that's 900 MB per second.
So a native plugin that compresses the Frames, combined with a lower FPS and resolution, and then uploads natively - that should work. Alternatively, if it doesn't need to be real-time and you want to save even more resources, record in batches and upload them one after the other.
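To sanity-check the order of magnitude of those numbers, here's a rough estimate assuming a YUV 4:2:0 buffer at 1.5 bytes per pixel (an assumption; the exact size depends on the camera's pixel format and row stride, and RGBA would be roughly twice as large):

```typescript
// Rough bandwidth estimate for streaming raw 4k frames.
// Assumes a YUV 4:2:0 layout (1.5 bytes/pixel); actual sizes vary
// with pixel format and stride.
const width = 3840;
const height = 2160;
const bytesPerPixel = 1.5; // YUV 4:2:0 assumption
const frameBytes = width * height * bytesPerPixel;
const fps = 60;
const perSecondBytes = frameBytes * fps;

console.log(`${(frameBytes / 1e6).toFixed(1)} MB per frame`);    // 12.4 MB
console.log(`${(perSecondBytes / 1e6).toFixed(0)} MB per second`); // 746 MB
```

That lands in the same ballpark as the figures above: far too much data to push over a socket uncompressed.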
A very trivial approach is to do `frame.toArrayBuffer()`, copy that into a `number[]`, and use `runOnJS` to call `fetch(...)` and upload that to your server. Maybe you can find a JS-based compression library for that as well; not sure.
But this is going to be REALLY slow.
That's what native plugins are for; they're faster, more efficient, can do stuff like good compression, etc.
Alright, thanks for your help, man. I guess I'll have to go with an entire video instead of frame-by-frame processing. It might add some burden to the backend, but it should work. Again, thanks, dude.
Question
As the title says, I am having issues managing and sending frames over a socket connection to a server that will process them in a Flask backend. I went through the old issues and found that one of the solutions is to make your own plugin, so that's just what I tried doing, following the available documentation. But now I'm getting Gradle errors, like `XyzFrameProcessorPlugin.kt:3:28 Unresolved reference: frameprocessor`. Apparently all of them are errors of this type, from MainApplication.kt to the new plugins. I used the automatic plugin builder CLI.
For now, I just want to make a plugin successfully. I don't really care what it does (even if it just returns "cat", as the one in the documentation does); once it starts functioning, I can adjust its behavior to my own needs.
I could share the repo if that helps. https://github.com/HADO564/LipReading_Frontend_ReactNative/
What I tried
I tried following the steps mentioned in the documentation. I don't know much about Java or Kotlin, but even using the example plugin from the docs that just returns "cat", it throws the error at compile time, when the Kotlin or Java (in other words, the plugin) is being compiled. Some urgent help would be great.
VisionCamera Version
4.01
Additional information