mrousavy / react-native-vision-camera

📸 A powerful, high-performance React Native Camera library.
https://react-native-vision-camera.com
MIT License

✨ Migrate to new-arch (react-native 0.74, Fabric/TurboModules/CodeGen/bridgeless) #2614

Open mrousavy opened 8 months ago

mrousavy commented 8 months ago

What feature or enhancement are you suggesting?

VisionCamera needs to migrate to the new architecture, using Fabric for the View and TurboModules + CodeGen for the native methods. This also implies compatibility with react-native 0.74, which currently fails to build. (related to New Arch)

Implementation plan

Since VisionCamera is a bit more complicated, there are a few blockers I have encountered:

  1. ❌ I have both a View (CameraView) and two Modules (CameraModule + DevicesModule) in the codebase. Currently, the create-react-native-library template does not have a template for multiple CodeGen specs (views + modules).
  2. ❌ For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the <Camera> view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently: JS: Camera.tsx (uses findNodeHandle(..) + a global setFrameProcessor(..) func), iOS: VisionCameraProxy.mm (uses UIManager::viewForReactTag to find the view), Android: VisionCameraProxy.kt (uses UIManagerHelper.getUIManager to find the View).
  3. ❌ For the Frame Processor runtime I need access to the jsi::Runtime as early as possible (on demand). Currently I get the jsi::Runtime from the RCTCxxBridge/CatalystInstance, which is a kinda private and unsafe API.

And then a few things I wanted to wait for (which aren't blockers) before switching to the new arch are:

  1. I want to pass multiple callbacks to a native function. This was a limitation with the Bridge, but should now work with TurboModules I think? In my case startRecording() takes both onRecordingFinished and onRecordingError callbacks, and also returns a Promise which resolves once the recording has actually been started (see the spec sketch after this list).
  2. I couldn't find a clear script to trigger CodeGen (something like npx react-native codegen) to generate both iOS and Android specs.
  3. New arch doesn't have first class Swift support as far as I know?
  4. New arch doesn't support custom "hybrid" objects (so e.g. I could return an in-memory UIImage/Image instance in takePhoto(), instead of writing it to a file and returning a string as that's the only supported type)
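
To illustrate point 1, a hypothetical TurboModule spec with that shape could look like the following sketch (names and types are illustrative, not VisionCamera's actual spec, and it assumes codegen accepts multiple function arguments):

import type {TurboModule} from 'react-native';
import {TurboModuleRegistry} from 'react-native';

export interface Spec extends TurboModule {
  // Two separate callbacks plus a Promise that resolves once recording has actually started.
  startRecording(
    options: Object,
    onRecordingFinished: (video: Object) => void,
    onRecordingError: (error: Object) => void,
  ): Promise<void>;
}

export default TurboModuleRegistry.getEnforcing<Spec>('CameraModule');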

What Platforms would this feature/enhancement affect?

iOS, Android

Alternatives/Workarounds

Currently only the renderer interop layer can be used, but there are some issues: https://github.com/mrousavy/react-native-vision-camera/issues/2613

Additional information

Upvote & Fund

Fund with Polar

cipolleschi commented 8 months ago

Hi @mrousavy, thanks for this issue, it is super helpful!

  1. ❌ I have both a View (CameraView) and two Modules (CameraModule + DevicesModule) in the codebase. Currently, the create-react-native-library template does not have a template for multiple CodeGen specs (views + modules).

This should not be a problem. In the codegenConfig field of the vision-camera package, you can specify all as the type, and CodeGen should generate the native code for both modules and components. In this case, the jsSrcs field should point to a parent folder that contains all the specs. For example:

react-native-camera-module
'-> js
    '-> Module
    |   '-> NativeCameraModule.js
    '-> NativeComponent
        '-> CameraViewComponent.js

and you pass just js as the folder in jsSrcs.
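
As a rough sketch (the spec name and Android package below are placeholders, and the field for the specs folder is jsSrcsDir), the package.json entry could look like:

{
  "codegenConfig": {
    "name": "VisionCameraSpec",
    "type": "all",
    "jsSrcsDir": "./js",
    "android": {
      "javaPackageName": "com.mrousavy.camera"
    }
  }
}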

  2. ❌ For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently: JS: Camera.tsx (uses findNodeHandle(..) + a global setFrameProcessor(..) func) iOS: VisionCameraProxy.mm (uses UIManager::viewForReactTag to find the view) Android: VisionCameraProxy.kt (uses UIManagerHelper.getUIManager to find the View)
  3. ❌ For the Frame Processor runtime I need access to the jsi::Runtime as early as possible (on demand). Currently I get the jsi::Runtime from the RCTCxxBridge/CatalystInstance, which is a kinda private and unsafe API. iOS: VisionCameraInstaller::install Android: VisionCameraProxy.kt.

I don't have strong guidance here, but a colleague is working on 3., and perhaps with his work you'll be able to make 2. work as well? Not sure about this.

  1. I want to pass multiple callbacks to a native function. This was a limitation with the Bridge, but should now work with TurboModules I think? In my case startRecording() takes both onRecordingFinished and onRecordingError callbacks, and also returns a Promise which resolves once the recording has actually been started.

I read an internal proposal to make this happen, but I don't have an update on the work here.

  2. I couldn't find a clear script to trigger CodeGen (something like npx react-native codegen) to generate both iOS and Android specs.

In 0.74, we have npx react-native codegen, but you need to call it twice, once for iOS and once for Android. It should be easy to add a

"script" : {
  "run-codegen": "scripts/execute-codegen.sh",
  }

to react-native-vision-camera where you run:

npx react-native codegen <ios params>
npx react-native codegen <android params>

and then you can call

yarn run-codegen

  3. New arch doesn't have first class Swift support as far as I know?

No, and it will not have it in the short term. To make it work properly, we would need to radically change the ReactCommon file structure, and that is a long piece of work which we can't prioritize at the moment.

Anyway, it is possible to work around the issue: you should be able to create a thin Objective-C++ layer that:

  1. implements the module/component specs.
  2. holds a Swift object.
  3. forwards all the invocations to the held Swift object.

It would actually be very interesting to try this out in a library as complex as yours: we might be able to generalize the wrapper somehow... 🤔

  4. New arch doesn't support custom "hybrid" objects (so e.g. I could return an in-memory UIImage/Image instance in takePhoto(), instead of writing it to a file and returning a string as that's the only supported type)

I think Expo managed to achieve something like this for some of their packages. There are probably ways to do it, but I feel that they might depend heavily on your use case.

Currently only the renderer interop layer can be used, but there are some issues: https://github.com/mrousavy/react-native-vision-camera/issues/2613

We did a lot of work on interop layers for 0.74. The issue author does not mention which version of React Native they are using. It is likely that we have already fixed these in 0.74 (or they will be fixed soon).

mrousavy commented 8 months ago

Thanks so much for your detailed answer @cipolleschi! This is really helpful :)

rayronvictor commented 8 months ago

We did a lot of work on interop layers for 0.74. The issue author does not mention which version of React Native they are using. It is likely that we have already fixed these in 0.74 (or they will be fixed soon).

I'm using RN 0.73.2

fabOnReact commented 8 months ago

2) For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently: JS: Camera.tsx (uses findNodeHandle(..) + a global setFrameProcessor(..) func) iOS: VisionCameraProxy.mm (uses UIManager::viewForReactTag to find the view) Android: VisionCameraProxy.kt (uses UIManagerHelper.getUIManager to find the View)

I believe there is an example implementation in TextInput.js, which uses the codegen command setTextAndSelection instead of UIManager + findNodeHandle.

Codegen Commands replace setNativeProps, which was used for functionality like react-native-camera's zoom prop. Re-rendering the entire component every time the zoom changes would slow the react-native app down, so instead we would use setNativeProps({zoom: newZoomValue}).

It is sometimes necessary to make changes directly to a component without using state/props to trigger a re-render of the entire subtree. When using React in the browser for example, you sometimes need to directly modify a DOM node, and the same is true for views in mobile apps. setNativeProps is the React Native equivalent to setting properties directly on a DOM node.

More info here https://github.com/facebook/react-native/pull/42701#issuecomment-1950686053
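
As an illustration (hypothetical names, not VisionCamera's real spec), a codegen command for a camera view could be declared like this and then dispatched to the native view without re-rendering:

import type * as React from 'react';
import type {HostComponent, ViewProps} from 'react-native';
import type {Double, WithDefault} from 'react-native/Libraries/Types/CodegenTypes';
import codegenNativeComponent from 'react-native/Libraries/Utilities/codegenNativeComponent';
import codegenNativeCommands from 'react-native/Libraries/Utilities/codegenNativeCommands';

export interface NativeProps extends ViewProps {
  isActive?: WithDefault<boolean, false>;
}

export default codegenNativeComponent<NativeProps>('CameraView') as HostComponent<NativeProps>;

interface NativeCommands {
  // Analogous to TextInput's setTextAndSelection: runs natively, no re-render of the subtree.
  setZoom: (viewRef: React.ElementRef<HostComponent<NativeProps>>, zoom: Double) => void;
}

export const Commands: NativeCommands = codegenNativeCommands<NativeCommands>({
  supportedCommands: ['setZoom'],
});

Calling Commands.setZoom(ref.current, 2.0) then goes through the dispatchCommand path described below.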

The codegen command calls the FabricRenderer's dispatchCommand, which uses createFromHostFunction to register a function on the native Android/iOS side, so JavaScript can call into the native iOS/Android API.

1) Function::createFromHostFunction() registers a new function whose methodName is dispatchCommand

/// A function which has this type can be registered as a function
/// callable from JavaScript using Function::createFromHostFunction().
/// When the function is called, args will point to the arguments, and
/// count will indicate how many arguments are passed.  The function
/// can return a Value to the caller, or throw an exception.  If a C++
/// exception is thrown, a JS Error will be created and thrown into
/// JS; if the C++ exception extends std::exception, the Error's
/// message will be whatever what() returns. Note that it is undefined whether
/// HostFunctions may or may not be called in strict mode; that is `thisVal`
/// can be any value - it will not necessarily be coerced to an object or
/// set to the global object.

2) The HostFunction for the methodName dispatchCommand is passed as the last argument of createFromHostFunction

https://github.com/facebook/react-native/blob/8317325fb2bf6563a9314431c42c90ff68fb35fb/packages/react-native/ReactCommon/jsi/jsi/jsi.h#L100-L111

  /// Create a function which, when invoked, calls C++ code. If the
  /// function throws an exception, a JS Error will be created and
  /// thrown.
  /// \param name the name property for the function.
  /// \param paramCount the length property for the function, which
  /// may not be the number of arguments the function is passed.

https://github.com/facebook/react-native/blob/8ff05b5a18db85ab699323d1745a5f17cdc8bf6c/packages/react-native/ReactCommon/react/renderer/uimanager/UIManagerBinding.cpp#L603-L618

[uiManager, methodName, paramCount](
    jsi::Runtime& runtime,
    const jsi::Value& /*thisValue*/,
    const jsi::Value* arguments,
    size_t count) -> jsi::Value {
  validateArgumentCount(runtime, methodName, paramCount, count);

  auto shadowNode = shadowNodeFromValue(runtime, arguments[0]);
  if (shadowNode) {
    uiManager->dispatchCommand(
        shadowNode,
        stringFromValue(runtime, arguments[1]),
        commandArgsFromValue(runtime, arguments[2]));
  }
  return jsi::Value::undefined();
});

3) The dispatchCommand HostFunction calls FabricMountingManager::dispatchCommand:

fabOnReact commented 8 months ago

For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet.

https://react-native-vision-camera.com/docs/guides/frame-processors

Here is how I would try to solve the issue without using worklets (we can try to add them later):

function App() {
  const userDefinedFunction = (frame) => {
    const objects = detectObjects(frame)
    console.log(`Detected ${objects.length} objects.`)
  }
  const frameProcessor = useFrameProcessor(userDefinedFunction)

  return (
    <Camera
      {...cameraProps}
      frameProcessor={frameProcessor}
    />
  )
}

1) Register userDefinedFunction using createFromHostFunction (the codegen command already does this for you)
2) Java/iOS invokes the callback with the frame as a parameter: https://github.com/facebook/react-native/blob/57ed0fb30931979742634a1faa9a4d3b5261e50d/packages/react-native/ReactAndroid/src/main/java/com/facebook/react/fabric/FabricUIManager.java#L1048-L1064
3) To implement it as a worklet, I would look into other libs like reanimated

https://docs.swmansion.com/react-native-reanimated/docs/2.x/fundamentals/worklets/
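
For context, a minimal Reanimated-style worklet looks roughly like this (just a sketch assuming react-native-reanimated, not VisionCamera code):

import {runOnUI} from 'react-native-reanimated';

function logFromWorklet(message: string) {
  'worklet';
  // The 'worklet' directive lets the Babel plugin capture this function so it can be
  // executed on a separate runtime instead of the main JS thread.
  console.log(`[worklet] ${message}`);
}

// Schedules the worklet on the UI runtime.
runOnUI(logFromWorklet)('hello from another runtime');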

https://github.com/software-mansion/react-native-reanimated/blob/6806d3d89983ffacd3b1bebbf4fecf86ba56addb/Common/cpp/ReanimatedRuntime/WorkletRuntimeCollector.h#L12-L15

  // When worklet runtime is created, we inject an instance of this class as a
  // `jsi::HostObject` into the global object. When worklet runtime is
  // terminated, the object is garbage-collected, which runs the C++ destructor.
  // In the destructor, we unregister the worklet runtime from the registry.

Sorry, I'm still trying to learn more about this, so maybe my comment is not clear.

philIip commented 8 months ago

❌ For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently:

hiiiii thanks so much for this post; reading this and the code i'm not totally understanding what the core issue is - is it that the UIManager APIs themselves are not supported / behaving as expected in the new architecture?

another question, i'm assuming that the problem you're implying with the worklet is that it needs to be run on a specific thread, but the callback conversion forces it to always be run on JS thread? the piece that i'm missing here is where the native module block conversion is happening? it seems like everything in your code pointers is pure JSI at the moment and doesn't involve the native module infra.

❌ For the Frame Processor runtime I need access to the jsi::Runtime as early as possible (on demand). Currently I get the jsi::Runtime from the RCTCxxBridge/CatalystInstance, which is a kinda private and unsafe API.

right now we made this backward compatible, but i will be writing some docs on what the future-looking APIs are for this!

mrousavy commented 8 months ago

hiiiii thanks so much for this post; reading this and the code i'm not totally understanding what the core issue is - is it that the UIManager APIs themselves are not supported / behaving as expected in the new architecture?

Hey @philIip thanks for tuning in! 👋 so the idea is that VisionCamera's View uses the RN architecture for almost everything:

But there is one prop which the user passes to the Camera that I cannot use the RN architecture for, and that is the frameProcessor. The reason is that I do some special transformations to that function, and need to get it as the pure jsi::Function on the native side so I can convert it to a callable Worklet there. If I don't manually get it as a jsi::Function and instead use the bridge for this, it will be converted to an RCTCallback or something, which is not what I want.

This then runs on the Camera Thread and is synchronously invoked with a jsi::HostObject as a parameter.

But this is only one specific prop of the Camera (frameProcessor); the rest should use the RN architecture as normal.

The RN architecture simply doesn't have a mechanism for Worklets, or custom jsi::HostObjects currently, so I need to write that manually in JSI/C++.
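
Roughly, the JS side of that mechanism looks like the following sketch (simplified and with hypothetical names - not the actual Camera.tsx source):

import {findNodeHandle} from 'react-native';
import type {Component} from 'react';

type FrameProcessor = (frame: unknown) => void;

declare global {
  // Installed from C++ as a JSI host function (hypothetical name); it receives the raw
  // jsi::Function instead of a bridge-converted RCTCallback.
  var __setFrameProcessor: ((viewTag: number, frameProcessor: FrameProcessor) => void) | undefined;
}

export function attachFrameProcessor(cameraRef: Component, frameProcessor: FrameProcessor): void {
  const viewTag = findNodeHandle(cameraRef);
  if (viewTag != null && globalThis.__setFrameProcessor != null) {
    // Native code looks the view up by tag (UIManager::viewForReactTag on iOS,
    // UIManagerHelper.getUIManager on Android) and keeps the function so it can be
    // called as a Worklet on the Camera Thread.
    globalThis.__setFrameProcessor(viewTag, frameProcessor);
  }
}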

another question, i'm assuming that the problem you're implying with the worklet is that it needs to be run on a specific thread, but the callback conversion forces it to always be run on JS thread? the piece that i'm missing here is where the native module block conversion is happening? it seems like everything in your code pointers is pure JSI at the moment and doesn't involve the native module infra.

Exactly, that and that I cannot pass a custom jsi::HostObject ("Frame") to the callback.

This is where I eventually convert the jsi::Function to a normal Objective-C/Swift method:

philIip commented 8 months ago

ok i see, so bear with me just want to make sure i'm getting it right! this worklet scenario is not properly supported in the old or new architecture, but you've figured out a way to make it work doing all this custom JSI stuff.

is it a blocker because this is something you feel strongly should be baked into the framework, or is there actually a gap in the old / new architecture here? i think that's the last part i'm missing

mrousavy commented 8 months ago

No worries! Yes this is something that does not work in old or new arch, and I don't think it even should be part of core. It's very custom stuff.

The reason I thought it was a blocker is that the workaround in the old arch was using the RCTCxxBridge/CatalystInstance to get the Runtime, and this no longer works in the new arch - so I want to use a Cxx TurboModule to get access to the runtime directly, but I have my doubts that codegen will work in this "hybrid" mode (generate everything for C++, but also bridge 90% of the props over to Java/Swift, while the others remain in C++ JSI).

joacub commented 7 months ago

hi @mrousavy, thanks for this great project. I'm looking forward to this, as we are using the scanner in a very small application for Cuba and the phones there are very old, and I think this will be really good for the overall performance of the scanner. Do you have any timeline on when this will be migrated to the new arch? react-native-worklets-core is a blocker for us to use the new arch.

mrousavy commented 7 months ago

Not yet, I asked the Meta guys a question here: https://github.com/reactwg/react-native-new-architecture/discussions/167#discussioncomment-8938596 and I am waiting for a reply - that'll decide how easy it's gonna be to migrate over to new arch.

flexingCode commented 6 months ago

hi guys, which RN version should I use for this lib while the migration is in progress? :)

necmettindev commented 6 months ago

hi guys, which RN version should I use for this lib while the migration is in progress? :)

0.73.6

chawkip commented 6 months ago

hi guys, which RN version should I use for this lib while the migration is in progress? :)

0.73.7

404-html commented 6 months ago

Do you guys plan to migrate both V3 and V4 to the new arch?

mrousavy commented 6 months ago

It's only me doing the migration - and no, only V4 will be migrated. If you are on V3, I'd recommend switching over to V4 because there are tons of improvements.

coder-xiaomo commented 6 months ago

Thanks to the author for this open source project, which has helped me a lot. I would like to know: is migrating v4 to the new architecture a big workload, and does it take a lot of time and energy?

I am preparing to migrate to react native 0.74.1, but this project does not support the new architecture at present. I would like to know roughly how long the migration would take - weeks, months, or even longer?

Thanks again to the author for this open source project, which has helped me a lot!

MinskLeo commented 6 months ago

Thanks for the author's effort. Would be really good to have react-native-vision-camera on 0.74+.

angelo-hub commented 6 months ago

@mrousavy this sounds like kind of an exciting low-level problem to solve. I don't have full context on the C++ code, but I'd love to take a crack at it if you have any design docs or info you can share on how the global memory address is used and set while being a reference to a C++ variable.

mrousavy commented 6 months ago

Would be really good to have react-native-vision-camera on 0.74+

Yea, you can fund the development of this by sponsoring the Polar pool linked here in the issue. Then we all get it faster.

I would like to know: is migrating v4 to the new architecture a big workload, and does it take a lot of time and energy?

Yes, because the Frame Processor runtime works a lot with the jsi::Runtime, and all of that needs to be migrated to a CxxTurboModule, as the Bridge (which is currently used for a lot of APIs) no longer exists. react-native-worklets-core also needs to be migrated, as that's a dependency of VisionCamera.

So I'm guessing if I really dedicate a full week to this, I could hack this out. Once the sponsor pool (Polar, linked here) is full, I can take a stab at this.

mrousavy commented 6 months ago

if you have any design docs or info you can share on how the global memory address is used and set while being a reference to a C++ variable

Thanks - I'm not sure if I understand your question, but this is something I'd probably rather tackle alone, as it is quite a big refactor on the native side. Normally I'm happy about every contribution, but in this case I'd rather build this myself, as many things just require deep knowledge of this codebase.

Alper-Demir-Eng commented 6 months ago

Just a question, for apps that use VC with the Frame Processor disabled, is it possible to somehow compile successfully under RN 0.74?

mrousavy commented 6 months ago

I don't know. It's pretty loosely coupled, but the only part that cannot be loosely coupled is the install() function, which uses JSI code that doesn't work in bridgeless. So either way it's probably best to just migrate everything to the new arch.

joacub commented 6 months ago

vision camera can currently work with react native 0.74.1 - I'm using it with that version with no issues, and the camera is even working super fast. iOS builds with no issues, and on Android you just need to look at the error that Android Studio shows and fix it by including the symbols that Android Studio suggests.

Alper-Demir-Eng commented 6 months ago

@joacub Could you please create a tutorial? I've never built with Android Studio and I'm not sure how to go about it.

joacub commented 6 months ago

@joacub Could you please create a tutorial? I've never built with Android Studio and I'm not sure how to go about it.

Are you using bare react native or expo managed?

Alper-Demir-Eng commented 6 months ago

@joacub Bare RN.

joacub commented 6 months ago

I'm using Expo so I don't know about bare, but if you have an android folder, just open that in Android Studio and build - that's it. I guess in bare RN it will be the same.

idrisssakhi commented 6 months ago

A small patch for those having a build issue. I'm not sure if the frame processor will continue working after this patch - waiting for your input. But the camera will work, and the build issues will be resolved.

diff --git a/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt b/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt
index d697befe..8de418b0 100644
--- a/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt
+++ b/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt
@@ -7,12 +7,14 @@ import com.facebook.jni.HybridData
 import com.facebook.proguard.annotations.DoNotStrip
 import com.facebook.react.bridge.ReactApplicationContext
 import com.facebook.react.bridge.UiThreadUtil
+import com.facebook.react.common.annotations.FrameworkAPI
 import com.facebook.react.turbomodule.core.CallInvokerHolderImpl
 import com.facebook.react.uimanager.UIManagerHelper
 import com.mrousavy.camera.core.ViewNotFoundError
 import com.mrousavy.camera.react.CameraView
 import java.lang.ref.WeakReference

+@OptIn(FrameworkAPI::class)
 @Suppress("KotlinJniMissingFunction") // we use fbjni.
 class VisionCameraProxy(private val reactContext: ReactApplicationContext) {
   companion object {

tremblerz commented 6 months ago

Thanks @idrisssakhi, that worked for me! I also had to add implementation 'com.google.mlkit:barcode-scanning:17.2.0' in the build.gradle inside the src folder for my barcode use case.

nikitapilgrim commented 6 months ago

I'm using Expo so I don't know about bare, but if you have an android folder, just open that in Android Studio and build - that's it. I guess in bare RN it will be the same.

can you make a patch pls?

joacub commented 6 months ago

I'm using Expo so I don't know about bare, but if you have an android folder, just open that in Android Studio and build - that's it. I guess in bare RN it will be the same.

can you make a patch pls?

this is the patch:

https://github.com/mrousavy/react-native-vision-camera/issues/2614#issuecomment-2105995734

but frameProcessor and codeScanner together do not work

chawkip commented 6 months ago

A small patch for those having a build issue. I'm not sure if the frame processor will continue working after this patch - waiting for your input. But the camera will work, and the build issues will be resolved.

Works like a charm! Thanks 🙏🏻

cjadhav commented 6 months ago

Hi Guys,

What's the tentative plan to release this fix? 4.0.5 is still missing it.

Thanks

mrousavy commented 5 months ago

Hi Guys,

What's the tentative plan to release this fix? 4.0.5 is still missing it.

Thanks

This is not a fix, it's quite a big migration. As for status: I haven't worked on this yet.

Thanks for all sponsors so far!

mrousavy commented 5 months ago

JFYI; I just released react-native-vision-camera 4.3.2 which works on react-native 0.74 on both the old and the new architecture.

It is a temporary workaround, as it still uses the compatibility layer and is not a true TurboModule/Fabric view.

For now this works fine in my tests, but in the future VisionCamera will definitely need to be migrated to a TurboModule/Fabric view, which is what this issue here is about.

joacub commented 5 months ago

JFYI; I just released react-native-vision-camera 4.3.2 which works on react-native 0.74 on both the old and the new architecture.

It is a temporary workaround, as it still uses the compatibility layer and is not a true TurboModule/Fabric view.

For now this works fine in my tests, but in the future VisionCamera will definitely need to be migrated to a TurboModule/Fabric view, which is what this issue here is about.

the new arch fails to build on iOS:

The Swift pod `VisionCamera` depends upon `react-native-worklets-core`, which does not define modules. To opt into those targets generating module maps (which is necessary to import them from Swift when building as static libraries), you may set `use_modular_headers!` globally in your Podfile, or specify `:modular_headers => true` for particular dependencies.

balazsgerlei commented 4 months ago

JFYI; I just released react-native-vision-camera 4.3.2 which works on react-native 0.74 on both the old and the new architecture.

It is a temporary workaround, as it still uses the compatibility layer and is not a true TurboModule/Fabric view.

For now this works fine in my tests, but in the future VisionCamera will definitely need to be migrated to a TurboModule/Fabric view, which is what this issue here is about.

If I understand correctly, supporting both old and new architectures is only temporary, right? May I ask what prevents keeping support for both?

It's quite painful that no react native Camera library seems to support both architectures, and e.g. if one would like to have a sample app for a library that supports both architectures, one cannot have even quick QR code scanning in that sample without a camera library that supports both...

mrousavy commented 4 months ago

If I understand correctly, supporting both old and new architectures is only temporary, right?

Well essentially every library out there only supports both new and old arch temporarily - at some point in the not so distant future, the old bridge/old arch will be completely gone from react native. VisionCamera might just drop support for the old arch sooner than other libraries.

May I ask what prevents keeping support for both?

Because I am maintaining VisionCamera (and a bunch of other libraries) alone and it is already a huge effort. Maintaining multiple architectures makes things twice as complicated, and I choose to keep things simple. If the time comes when I drop old-arch support, you can always stick to an older version of VisionCamera, until you upgrade to new arch as well.

Also, unlike a few other simpler libraries, VisionCamera has a very complex JSI part (Frame Processors, with synchronous access to the Frame object and raw data), which I plan to simplify a lot by moving it to new arch. This makes the entire structure simpler and easier to maintain, but won't be backwards compatible.

And lastly, I am currently prototyping a different custom React Native Framework (basically an alternative to TurboModules), and I might migrate VisionCamera over to that.

The benefit is much better performance, simpler codebase (for the ObjC bridge and also especially for Frame Processors), and custom objects support in takePhoto() or prepareRecording(). More on that soon, maybe I'll post updates on Twitter. But this is really exciting :)

It's quite painful that no react native Camera library seems to support both architectures

What? VisionCamera supports both old and new architecture(*1) in the latest stable release(s).

if one would like to have a sample app for a library that supports both architectures, one cannot have even quick QR code scanning in that sample without a camera library that supports both...

I'm not sure if I understand what you're saying. VisionCamera supports both old and new architecture(*1), and the example app has a QR code scanner embedded in it. Did you try it?

You can also simply enable the new arch flag in the example to build it with the new arch.
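
For reference, flipping that flag is just the standard React Native toggles (assuming a bare RN project layout):

# Android: set newArchEnabled=true in android/gradle.properties, then rebuild the app
# iOS: reinstall pods with the new architecture enabled
cd ios && RCT_NEW_ARCH_ENABLED=1 pod install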

balazsgerlei commented 4 months ago

I'm not sure if I understand what you're saying. VisionCamera supports both old and new architecture(*1), and the example app has a QR code scanner embedded in it. Did you try it?

Sure, sorry for not being clear - I meant that, except for the current release of react-native-vision-camera, nothing seems to support both architectures (for QR code reading), and it seems like this will be temporary too.

I have not yet upgraded the library sample app that I based my use-case example on to 4.x, but I surely will; my comment was basically an inquiry into whether this state will remain or will no longer be the case (soon).

Thanks for shedding some light on your feature plans!

mrousavy commented 4 months ago

Well, yes, I will eventually drop support for the old arch - probably a bit sooner than other libraries, but that is due to the complexity that would be required to maintain both archs. My plan for the new arch (maybe VisionCamera 5.x.x) is to use the new foundation that I am writing, which is essentially just a bare C++ module that calls into Swift and Kotlin - so it will by design not support the old architecture unless I write a bridging layer for it, which I consider unnecessary if I release this in e.g. 4 months. V4 will support the old arch (and the new arch with the compat layer), V5 then only the new arch.

jeanraymonddaher commented 3 months ago

It will be awesome once we have this. Thank you!!

zzz08900 commented 3 months ago

Just noticed RN 0.75 came with an official way of accessing jsi::Runtime in TurboModules. Does that benefit RNVC?

Here's the link https://reactnative.dev/blog/2024/08/12/release-0.75

mrousavy commented 3 months ago

Hey - yep they introduced this specifically for modules like react-native-vision-camera, but I likely won't need it anymore because I'm building Nitro Modules now.

jeanraymonddaher commented 3 weeks ago

any timeline on this? thanks!

mrousavy commented 3 weeks ago

Update: I recently released Nitro Modules, and I'm currently thinking about my implementation strategy here.

I'll likely use Nitro because VisionCamera has a bunch of native objects it passes around (like frameProcessor={...}, device={...}, or format={...}) as well as instances it creates imperatively (prepareRecordingSession(): RecordingSession, takePhoto(): Image). This is not possible with Turbo/Fabric, but very well supported in Nitro - at least as of right now.

However, Nitro does not have first-class view support, and I'd have to somehow bridge that over to a Turbo/Fabric view (e.g. like this).