Open DeluxeOwl opened 11 months ago
I'd just like a modern, up-to-date, simple tutorial using expo that runs out of the box after being cloned and that can capture video. After two weeks, I still can't get anything to build.
Hey - Frame Processor Plugins have nothing to do with native modules or expo; VisionCamera has its own fully custom plugin API built entirely with JSI/C++.
This means it does not matter whether you use expo or not - you always write a Frame Processor Plugin the same way (create a class, extend FrameProcessorPlugin, implement your callback func, then register the plugin).
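To make the "create a class, extend FrameProcessorPlugin, implement your callback" steps concrete, here is a minimal Kotlin sketch for the Android side. The class name `ExamplePlugin` is hypothetical, and the exact import paths and constructor signature are assumptions that vary between VisionCamera versions - check the Frame Processor Plugin docs for the version you use.

```kotlin
// Minimal sketch of an Android Frame Processor Plugin (VisionCamera v3-style API).
// Import paths and the constructor signature differ between VisionCamera versions.
import com.mrousavy.camera.frameprocessor.Frame
import com.mrousavy.camera.frameprocessor.FrameProcessorPlugin
import com.mrousavy.camera.frameprocessor.VisionCameraProxy

class ExamplePlugin(proxy: VisionCameraProxy, options: Map<String, Any>?) : FrameProcessorPlugin() {
  // Called synchronously for every frame your frame processor runs on.
  override fun callback(frame: Frame, arguments: Map<String, Any>?): Any {
    // Inspect or process the frame here, then return a JS-serializable value.
    return hashMapOf("width" to frame.width, "height" to frame.height)
  }
}
```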
The "register your plugin" part is the one where you need to find some point in your app where you let VisionCamera know about the Frame Processor Plugin you just created - this can be in one of the following locations:

- AppDelegate/MainActivity: when starting your app, just call FrameProcessorPluginRegistry.addPlugin(...).
- The VISION_EXPORT_FRAME_PROCESSOR_PLUGIN macro on iOS, and the native module export on Android - the native module export is required because the react-native CLI only includes .java files if they declare a native module.
- A static { ... } block (which runs when the file gets initialized), where you just add the FrameProcessorPluginRegistry.addPlugin(..) code.

So this is not directly tied to expo.
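As a sketch of what the registration step could look like in Kotlin: the plugin class, the plugin name "example_plugin", and the function wrapper are all hypothetical, and the registry method name/signature varies between VisionCamera versions (recent v3 docs spell it addFrameProcessorPlugin), so treat this as an illustration rather than the exact API.

```kotlin
// Hypothetical plugin + registration sketch. All names are placeholders and the
// FrameProcessorPluginRegistry signature depends on your VisionCamera version.
import com.mrousavy.camera.frameprocessor.Frame
import com.mrousavy.camera.frameprocessor.FrameProcessorPlugin
import com.mrousavy.camera.frameprocessor.FrameProcessorPluginRegistry
import com.mrousavy.camera.frameprocessor.VisionCameraProxy

class ExamplePlugin(proxy: VisionCameraProxy, options: Map<String, Any>?) : FrameProcessorPlugin() {
  override fun callback(frame: Frame, arguments: Map<String, Any>?): Any? = null
}

// Call this once at startup, e.g. from MainApplication.onCreate() or a static { } block.
fun registerExamplePlugin() {
  FrameProcessorPluginRegistry.addFrameProcessorPlugin("example_plugin") { proxy, options ->
    ExamplePlugin(proxy, options)
  }
}
```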
I'd just like a modern, up-to-date, simple tutorial
This is what the docs are exactly. They are modern (use the latest APIs incl. Swift and Kotlin code), up-to-date (latest VisionCamera version), and are imo quite simple. Not a lot of native code at all.
After two weeks, I still can't get anything to build
I mean, if you are working on something for your business/company you could've also just contacted me through my agency to receive consulting. We could've built this in 1-5 hours, would've probably been cheaper than spending 2 weeks on this.
Can you tell me which parts in the docs are hard to understand or what isn't working for you? Maybe I can improve the docs in some areas.
Hey @mrousavy ,
Thanks for taking the time to answer, I really appreciate it
I'm thinking about creating a package and publishing it on npm (which is a frame processor plugin) and I don't know where to start exactly.
The docs for iOS and Android start with "Open your project with ...". But what if I want to create a git repo with the plugin and test it locally, then publish it?
I think this plugin is a good example? Especially this line https://github.com/ismaelsousa/vision-camera-ocr/blob/v2/android/build.gradle#L135
Does vision-camera-plugin-builder cover this use case? From what I can see, it only generates the swift and kotlin files
Does it make sense what I'm asking? 😅
I'm thinking about:
implementation project(':react-native-vision-camera')
(is this for Kotlin what I'm looking for? what about Swift?)

Good point, yea I think the docs can be a bit more precise about that.
Yes, all of your steps sound right to me. For iOS you'd say s.dependency "VisionCamera" in your podspec.
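For reference, the s.dependency "VisionCamera" line would sit in the plugin's podspec roughly like this - only that one line comes from the comment above; the file name and every other field here are placeholder values for illustration:

```ruby
# Hypothetical vision-camera-example-plugin.podspec (all values except the
# s.dependency line are placeholders).
Pod::Spec.new do |s|
  s.name         = "vision-camera-example-plugin"
  s.version      = "1.0.0"
  s.summary      = "Example Frame Processor Plugin for VisionCamera"
  s.homepage     = "https://example.com"
  s.license      = "MIT"
  s.author       = { "Example" => "dev@example.com" }
  s.platforms    = { :ios => "12.0" }
  s.source       = { :git => "https://example.com/repo.git", :tag => s.version.to_s }
  s.source_files = "ios/**/*.{h,m,mm,swift}"

  # The important part: depend on VisionCamera so the plugin can build
  # against its Frame Processor Plugin API.
  s.dependency "VisionCamera"
end
```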
You can also ask that stuff on our community Discord.
What feature or enhancement are you suggesting?
I think it would be great to have documentation on how to create a frame processor plugin using the Expo Modules API since a lot of users are expo users (including myself).
Right now, looking at the Expo Modules API and the Frame Processor plugin reference can be confusing, since I don't know how to use these two together.
I'm talking about these two:
It would be really useful to have an example of a plugin made using the Expo Modules API. Or maybe a guide on how to combine these two. (bonus - installing a swift/kotlin external library?)
What Platforms would this feature/enhancement affect?
iOS, Android
Alternatives/Workarounds
-
Additional information