Snapchat-like filters, AR lenses, and real-time facial animations.
React-Native wrapper for DeepAR.
Preview |
---|
This GIF is taken from DeepAR's official site. |
DeepAR is an infrastructure that lets you build AR applications with ease. DeepAR is not free, but for testing purposes you can create applications that can be used by up to 10 people for free.
DeepAR Features:
You can visit DeepAR's official site to learn more.
DeepAR SDK | lib version (`react-native-deepar`) | Required React Native Version | Android SDK (Min) | iOS Version (Min) |
---|---|---|---|---|
3.4.2 | >= 0.1.0 && <= 0.10.2 | >= 0.64.2 | 21 | 11.0 |
3.4.4 | >= 0.10.3 && <= 0.10.5 | >= 0.64.2 | 23 | 11.0 |
5.2.0 | >= 0.11.0 | >= 0.64.2 | 23 | 11.0 |
:warning: This library is under development; if you find a bug, please open an issue from here.

:warning: It only works on physical devices; it will not work with a simulator.
```sh
yarn add react-native-deepar
```
Open your `AndroidManifest.xml` and add the following lines:

```xml
<uses-permission android:name="android.permission.CAMERA" />
<!-- optionally, if you want to record audio: -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```
Update `minSdkVersion` to a minimum of 23, and `compileSdkVersion` and `targetSdkVersion` to a minimum of 31, in your `android/build.gradle` file, like below:

```diff
buildscript {
  ext {
    buildToolsVersion = "29.0.3"
-   minSdkVersion = 20
-   compileSdkVersion = 30
-   targetSdkVersion = 30
+   minSdkVersion = 23
+   compileSdkVersion = 31
+   targetSdkVersion = 31
  }
}
```
If you're using ProGuard, make sure to add the rules below:

```
-keepclassmembers class ai.deepar.ar.DeepAR { *; }
-keepclassmembers class ai.deepar.ar.core.videotexture.VideoTextureAndroidJava { *; }
-keep class ai.deepar.ar.core.videotexture.VideoTextureAndroidJava
```
Open your `Info.plist` and add the following lines:

```xml
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) needs access to your Camera.</string>
<!-- optionally, if you want to record audio: -->
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) needs access to your Microphone.</string>
```
Open the `ios/YourProject.xcworkspace` file in Xcode and update your minimum iOS version to 11.0, like below:

Setting iOS Version from Xcode |
---|
Follow the steps in the picture. |
Add `DeepAR.xcframework` to Build Phases:

Add DeepAR to Build Phases (1) | Add DeepAR to Build Phases (2) |
---|---|
Follow the steps in the picture. | Follow the steps in the picture. |
Note: Don't forget to install Pods for iOS and rebuild your app.
You need to ask for the necessary permissions before rendering the `<DeepAR>` component. Simply use the get functions to find out whether a user has granted or denied permission before:
```tsx
import { Camera } from 'react-native-deepar';

// ..
const cameraPermission = await Camera.getCameraPermissionStatus();
const microphonePermission = await Camera.getMicrophonePermissionStatus();
```
A permission status can have the following values:

- `authorized`: Your app is authorized to use said permission. Continue with using the `<DeepAR>` view.
- `not-determined`: Your app has not yet requested permission from the user. Continue by calling the request functions.
- `denied`: Your app has already requested permissions from the user, but was explicitly denied. You cannot use the request functions again, but you can use the Linking API to redirect the user to the Settings App where they can manually grant the permission.
- `restricted`: (iOS only) Your app cannot use the Camera or Microphone because that functionality has been restricted, possibly due to active restrictions such as parental controls being in place.

Use the request functions to prompt the user to give your app permission to use the Camera or Microphone.
```tsx
import { Camera } from 'react-native-deepar';

// ..
const cameraPermission = await Camera.requestCameraPermission();
const microphonePermission = await Camera.requestMicrophonePermission();
```
The permission request result can have the following values:

- `authorized`: Your app is authorized to use said permission. Continue with using the `<DeepAR>` view.
- `denied`: The user explicitly denied the permission request alert. You cannot use the request functions again, but you can use the Linking API to redirect the user to the Settings App where they can manually grant the permission.

To use custom AR models in your app, update your `react-native.config.js`
like below:

```js
module.exports = {
  assets: ['./assets/effects'], // <-- example destination
};
```
Add an `asset` script into your `package.json`:

```diff
{
  "scripts": {
+   "asset": "./node_modules/.bin/react-native-copy-asset"
  }
}
```
Then you can link your assets by running:

```sh
npm run asset
```
Note: If you remove an AR model, you can run the same command to unlink the removed asset.
You don't have to install AR models in your app; you can also use AR models over the Internet:
```tsx
import RNFetchBlob from 'rn-fetch-blob';

RNFetchBlob.config({
  fileCache: true,
})
  .fetch('GET', 'http://betacoins.magix.net/public/deepar-filters/8bitHearts')
  .then((res) => {
    deepARRef?.current?.switchEffectWithPath({
      path: res.path(),
      slot: 'effect',
    });
  });
```
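Since effects can come either from linked assets or from the Internet, a small helper can decide which switch method to call. This is an illustrative convention, not part of the react-native-deepar API:

```typescript
// Hedged sketch: treat http(s) sources as remote effects that should be
// downloaded first and passed to switchEffectWithPath; anything else is
// assumed to be a bundled asset name for switchEffect.
function isRemoteEffect(source: string): boolean {
  return /^https?:\/\//i.test(source);
}
```

A remote source would then go through the RNFetchBlob + `switchEffectWithPath` flow shown above, while bundled names go to `switchEffect`.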
Register with DeepAR and get an API key from the Developer Panel.
```tsx
import React, { useRef } from 'react';
import DeepAR, { IDeepARHandle } from 'react-native-deepar';

const App = () => {
  const deepARRef = useRef<IDeepARHandle>(null);

  return (
    <DeepAR
      ref={deepARRef}
      apiKey="your-api-key"
      style={{ flex: 1 }}
      onInitialized={() => {
        // ..
      }}
    />
  );
};
```
The `<DeepAR>` component can take a number of inputs to customize it as needed. They are outlined below:
Prop | Type | Default | Required | Description |
---|---|---|---|---|
apiKey | string | undefined | true | Register with DeepAR and get an API key from the Developer Panel. |
position | CameraPositions | CameraPositions.FRONT | false | Camera position, back or front. You can change it in real-time. |
videoWarmup | boolean | false | false | If set to true, changes how the startRecording and resumeRecording methods work; startRecording will no longer start the video recording immediately, instead it triggers the onVideoRecordingPrepared event. You can start video recording with resumeRecording after the onVideoRecordingPrepared event is triggered. Note: Only available on iOS. |
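Since the position prop can change at runtime, a front/back toggle is a common pattern. A hedged sketch follows; the `CameraPositions` enum here is a local stand-in with assumed string values, so import the real one from `react-native-deepar` in your app:

```typescript
// Local stand-in for the library's CameraPositions enum; the 'front' and
// 'back' string values are assumptions for illustration only.
enum CameraPositions {
  FRONT = 'front',
  BACK = 'back',
}

// Flip between the two documented camera positions, e.g. from a button
// that updates the value passed to the <DeepAR> position prop.
function togglePosition(current: CameraPositions): CameraPositions {
  return current === CameraPositions.FRONT
    ? CameraPositions.BACK
    : CameraPositions.FRONT;
}
```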
These are various events that you can hook into and fire functions on in the component:
Callback | Callback Params | Description |
---|---|---|
onInitialized | - | Called when DeepAR is initialized. DeepAR methods should not be called before initialization is completed. |
onEffectSwitched | (slot: string) | Called when an effect has been switched. |
onScreenshotTaken | (path: string) | Called when the screen capture is finished. |
onVideoRecordingPrepared | - | Called when the video recording is prepared. Check the videoWarmup option to learn more. Note: Only available on iOS. |
onVideoRecordingStarted | - | The start of the video recording process is not synchronous, so this method is called when the video recording actually starts. |
onVideoRecordingFinished | (path: string) | Called when the video recording is finished. |
onCameraSwitched | (facing: CameraPositions) | Called when the camera is switched. |
onFaceVisibilityChanged | (visible: boolean) | Called when the user's face becomes visible or invisible. |
onImageVisibilityChanged | (visible: boolean, gameObject?: string) | Called when a natural image is being tracked and its visibility has changed. |
onError | (text: string, type: ErrorTypes) | Called when an error occurs, e.g. the model path is not found or the effect file failed to load. |
These are the various methods:
Method | Params | Description |
---|---|---|
switchEffect | (params: ISwitchEffect) | Switches any effect in the scene. Effects are placed in slots. Every slot is identified by its unique name and can hold one effect at any given moment. Every subsequent call to this method removes the effect that was previously displayed in this slot. |
switchEffectWithPath | (params: ISwitchEffectWithPath) | Same as switchEffect, but with a file path. |
fireTrigger | (trigger: string) | Fires a custom animation trigger for model animations from code. To fire a custom trigger, the trigger string must match the custom trigger set in the Studio when creating the effect. |
takeScreenshot | - | Captures a screenshot of the current screen. When the screenshot is done, onScreenshotTaken will be called with the resulting screenshot. |
setTouchMode | (enabled: boolean) | Enables or disables the detection of touches over the DeepAR view; necessary if your effect has the ability to detect touches. |
Method | Params | Description |
---|---|---|
setFlashOn | (enabled: boolean) | Toggles the flash. |
Method | Params | Description |
---|---|---|
startRecording | (params?: IStartRecording) | Starts video recording of the camera preview. |
pauseRecording | - | Pauses video recording. |
resumeRecording | - | Resumes video recording after it has been paused with pauseRecording. |
finishRecording | - | Stops video recording and starts the process of saving the recorded video to the file system. When the file is saved, onVideoRecordingFinished will be called. |
setAudioMute | (enabled: boolean) | Mutes/unmutes the audio while video recording. |
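The recording flow above, including the iOS-only videoWarmup behaviour described in the props section, can be sketched as a small state machine. This is a pure model for illustration, not part of the library:

```typescript
// Hedged sketch of the documented recording flow. With videoWarmup (iOS),
// startRecording only prepares the recorder and fires
// onVideoRecordingPrepared; resumeRecording then actually starts it.
type RecEvent = 'start' | 'pause' | 'resume' | 'finish';
type RecState = 'idle' | 'prepared' | 'recording' | 'paused' | 'finished';

function nextRecState(state: RecState, event: RecEvent, videoWarmup = false): RecState {
  switch (event) {
    case 'start':
      // Without warmup, recording begins immediately.
      return videoWarmup ? 'prepared' : 'recording';
    case 'pause':
      return state === 'recording' ? 'paused' : state;
    case 'resume':
      // resumeRecording works after pauseRecording, or after
      // onVideoRecordingPrepared when videoWarmup is enabled.
      return state === 'paused' || state === 'prepared' ? 'recording' : state;
    case 'finish':
      return 'finished';
  }
}
```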
For more details about the changeParameter API, read this article here.
Method | Params | Description |
---|---|---|
changeParameterFloat | (params: IChangeParamaterFloat) | Changes the value of blendshape parameters during runtime. |
changeParameterVec4 | (params: IChangeParamaterVec4) | Changes a certain color of a Game Object at runtime. |
changeParameterVec3 | (params: IChangeParamaterVec3) | Changes the transform of a Game Object at runtime; you can change the object's position, rotation, or scale. |
changeParameterBool | (params: IChangeParamaterBool) | Enables or disables a value at runtime. Say you want a button in your app that toggles a Game Object, e.g. your filter character putting their glasses on or taking them off; this method helps you do that. |
changeParameterString | (params: IChangeParamaterString) | Changes a string parameter on a Game Object. The most common use is to change the blend mode and culling mode properties of a Game Object. Note: Only available on iOS. |
changeParameterTexture | (params: IChangeParamaterTexture) | Loads an image and sets it as a texture during runtime. This can be useful if you want to leverage the background segmentation feature and change the background in your filter. |
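Since changeParameterVec4 is described as changing a color, a helper for building four-component color values can be handy. Shader color uniforms are commonly normalized RGBA in 0..1, but whether IChangeParamaterVec4 expects exactly this shape is an assumption; check the library's type definitions before use:

```typescript
// Hedged helper: convert '#rrggbb' to normalized RGBA components.
// The {x, y, z, w} field names mirror a typical vec4 convention and are
// an assumption, not a confirmed part of IChangeParamaterVec4.
function hexToVec4(hex: string, alpha = 1): { x: number; y: number; z: number; w: number } {
  const n = parseInt(hex.replace('#', ''), 16);
  return {
    x: ((n >> 16) & 0xff) / 255, // red
    y: ((n >> 8) & 0xff) / 255,  // green
    z: (n & 0xff) / 255,         // blue
    w: alpha,                    // alpha, defaults to fully opaque
  };
}
```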
Method | Params | Description |
---|---|---|
pause | - | Pauses the rendering. This method will not release any resources and should be used only for a temporary pause (e.g. the user goes to the next screen). |
resume | - | Resumes the rendering if it was previously paused; otherwise does nothing. |
setLiveMode | (enabled: boolean) | An optimization method that indicates to DeepAR which mode it should operate in. If called with true, DeepAR expects a continuous flow of new frames and optimizes its inner processes for such a workload; a typical use case is processing frames from the camera stream. If called with false, it optimizes for preserving resources and memory by pausing the rendering after each processed frame; a typical use case is when the user needs to process just one image. |
setFaceDetectionSensitivity | (sensitivity: number) | Changes the face detection sensitivity. The sensitivity parameter can range from 0 to 3, where 0 is the fastest but might not recognize smaller (further away) faces, and 3 is the slowest but will find smaller faces. By default, this parameter is set to 1. |
showStats | (enabled: boolean) | Displays debugging stats on screen. |
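Because setFaceDetectionSensitivity accepts values from 0 to 3, a tiny guard can keep out-of-range input (from a slider, for example) from reaching the native side. A hedged sketch:

```typescript
// Clamp and round a raw sensitivity value into the documented 0..3 range
// before passing it to setFaceDetectionSensitivity. Clamping client-side
// is a defensive convention, not a library requirement.
function clampSensitivity(value: number): number {
  return Math.min(3, Math.max(0, Math.round(value)));
}
```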
```tsx
import { Camera } from 'react-native-deepar';
```
Method | Params | Returns | Description |
---|---|---|---|
requestCameraPermission | - | `Promise<CameraPermissionRequestResult>` | Shows a "request permission" alert to the user, and resolves with the new camera permission status. If the user has previously blocked the app from using the camera, the alert will not be shown and "denied" will be returned. |
requestMicrophonePermission | - | `Promise<CameraPermissionRequestResult>` | Shows a "request permission" alert to the user, and resolves with the new microphone permission status. If the user has previously blocked the app from using the microphone, the alert will not be shown and "denied" will be returned. |
getCameraPermissionStatus | - | `Promise<CameraPermissionStatus>` | Gets the current camera permission status. Check this before mounting the camera to ensure the user has permitted the app to use the camera. |
getMicrophonePermissionStatus | - | `Promise<CameraPermissionStatus>` | Gets the current microphone permission status. Check this before mounting the camera to ensure the user has permitted the app to use the microphone. |
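The permission statuses map naturally onto a next step for your UI. A hedged, pure sketch follows; the status strings mirror the values documented earlier, and 'open-settings' stands in for redirecting via the Linking API:

```typescript
// Statuses as documented for getCameraPermissionStatus /
// getMicrophonePermissionStatus; the mapping itself is an illustrative
// convention, not part of react-native-deepar.
type CameraPermissionStatus = 'authorized' | 'not-determined' | 'denied' | 'restricted';

function nextPermissionStep(status: CameraPermissionStatus): 'render' | 'request' | 'open-settings' {
  switch (status) {
    case 'authorized':
      return 'render'; // safe to mount the <DeepAR> view
    case 'not-determined':
      return 'request'; // call Camera.requestCameraPermission()
    case 'denied':
    case 'restricted':
      return 'open-settings'; // redirect via the Linking API
  }
}
```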
DeepAR has a Background Segmentation feature; with it you can change your background in real-time.
Background Segmentation Preview |
---|
This image is taken from DeepAR's official site. |
There is a filter called `Background` in the Free Filter Pack that you can use.

How do you change the background image? Switch to the `Background` effect and apply a new background image like below:
```tsx
import { TextureSourceTypes } from 'react-native-deepar';
import RNFetchBlob from 'rn-fetch-blob';

RNFetchBlob.config({})
  .fetch('GET', 'https://random.imagecdn.app/450/800')
  .then((res) => {
    deepARRef?.current?.changeParameterTexture({
      gameObject: 'Background',
      component: 'MeshRenderer',
      parameter: 's_texColor',
      type: TextureSourceTypes.BASE64,
      value: res.base64(),
    });
  });
```
You can detect touches in your effects if the effect is customized to detect touches. For example, DeepAR has a filter called Face Painting; with this effect you can paint your face in real-time with touches.
For more details about the face painting effect, read this article here.
Face Painting Preview |
---|
This image is taken from DeepAR's official site. |
If you want to be able to detect touches on the screen, you need to use the following code:

```tsx
// If you switch to the face painting effect, use the code below to enable
// touch detection over the DeepAR view:
deepARRef?.current?.setTouchMode(true);

// If you are not using the face painting effect, use the code below to
// disable touch detection over the DeepAR view:
deepARRef?.current?.setTouchMode(false);
```
Would you like to add a watermark to your filter? Follow this tutorial: Placing an image as a part of your filter.
Ball Face | Background Segmentation |
---|---|
```sh
# clone the project
git clone https://github.com/ridvanaltun/react-native-deepar.git

# go into the project
cd react-native-deepar

# make project ready
npm run bootstrap

# go into the example
cd example

# copy environment file and set your api keys (bundle id is com.example.reactnativedeepar)
cp .env.example .env

# run for android
npm run android

# or
# run for ios
npm run ios
```
Learn more about limitations from the features by platform page.
See the contributing guide to learn how to contribute to the repository and the development workflow.
This project exists thanks to all the people who contribute.
This project is licensed under the MIT License - see the `LICENSE` file for details.