Check out these blog posts for examples of this package.
The default branch uses Tesseract on iOS and Firebase ML Kit on Android; besides that, we have two other branches.
To decide which one is better for your use case, check this blog post on Heartbeat by Fritz.ai.
$ npm install react-native-text-detector --save

or

$ yarn add react-native-text-detector
Import your tessdata folder into the root of your project AS A REFERENCED FOLDER (see below). You can download trained data for your language from Google's Repo or, if that gives an error, use THIS REPO, referenced on Stack Overflow as a solution. The tessdata folder contains the Tesseract trained data files; you can add your own trained data files here too.
NOTE: This library currently requires the tessdata folder to be linked as a referenced folder instead of a symbolic group. If Tesseract can't find a language file in your own project, it's probably because you created the tessdata folder as a symbolic group instead of a referenced folder. It should look like this if you did it correctly:
Note how the tessdata folder has a blue icon, indicating it was imported as a referenced folder instead of a symbolic group.
Add -lstdc++ to your target's Other Linker Flags (in Build Settings) if not already present.

If you are using CocoaPods, add the following to ios/Podfile:
pod 'RNTextDetector', path: '../node_modules/react-native-text-detector/ios'
Run cd ios && pod install, then open <your_project>.xcworkspace to run your app.

If you prefer to link manually instead of using CocoaPods:

1. In Xcode, in the project navigator, right click Libraries ➜ Add Files to [your project's name]
2. Go to node_modules ➜ react-native-text-detector and add RNTextDetector.xcodeproj
3. Add libRNTextDetector.a to your project's Build Phases ➜ Link Binary With Libraries
4. Run your project (Cmd+R)
This package uses Firebase ML Kit for text recognition on Android, so please make sure you have integrated Firebase into your app before starting the integration of this package. Here is the guide for Firebase integration.
Open android/app/src/main/java/[...]/MainApplication.java
- Add import com.fetchsky.RNTextDetector.RNTextDetectorPackage; to the imports at the top of the file
- Add new RNTextDetectorPackage() to the list returned by the getPackages() method

Append the following lines to android/settings.gradle:
include ':react-native-text-detector'
project(':react-native-text-detector').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-text-detector/android')
Insert the following lines inside the dependencies block in android/app/build.gradle:
...
dependencies {
implementation 'com.google.firebase:firebase-core:16.0.1'
implementation 'com.google.firebase:firebase-ml-vision:17.0.0'
implementation (project(':react-native-text-detector')) {
exclude group: 'com.google.firebase'
}
}
// Place this line at the end of the file
apply plugin: 'com.google.gms.google-services'
// Workaround for onesignal-gradle-plugin compatibility
com.google.gms.googleservices.GoogleServicesPlugin.config.disableVersionCheck = true
Insert the following lines inside the buildscript block in android/build.gradle:
buildscript {
repositories {
google()
...
}
dependencies {
classpath 'com.android.tools.build:gradle:3.0.1'
classpath 'com.google.gms:google-services:4.0.1' // google-services plugin
}
}
/**
 *
 * This example uses react-native-camera to capture the image
 *
 */
import React, { PureComponent } from "react";
import RNTextDetector from "react-native-text-detector";
export class TextDetectionComponent extends PureComponent {
...
detectText = async () => {
try {
const options = {
quality: 0.8,
base64: true,
skipProcessing: true,
};
const { uri } = await this.camera.takePictureAsync(options);
const visionResp = await RNTextDetector.detectFromUri(uri);
console.log('visionResp', visionResp);
} catch (e) {
console.warn(e);
}
};
...
}
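For reference, here is a minimal sketch of consuming the result of detectFromUri. It assumes the response is an array of detected blocks, each roughly of the shape { text, bounding } (field names and coordinate conventions can differ between the iOS and Android implementations, so verify them on your target platform); logDetectedBlocks is a hypothetical helper, not part of the library.

const logDetectedBlocks = visionResp => {
  // detectFromUri is expected to resolve with an array of detected text blocks;
  // guard against other shapes (e.g. nothing detected).
  if (!Array.isArray(visionResp)) {
    console.warn("Unexpected response from RNTextDetector:", visionResp);
    return;
  }
  visionResp.forEach(block => {
    // Assumed fields: `text` (the recognized string) and `bounding`
    // (the block's position/size in image coordinates).
    const { text, bounding } = block;
    console.log(`Detected "${text}" at`, bounding);
  });
};

A common next step is to scale the bounding values from image coordinates to your camera preview size and render overlays on top of the camera view.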