Closed · j-jonathan closed 10 months ago
Interesting observation with the codec! Maybe there is a limit in the Android recorder?
Have you tried using a really low bitrate?
Ohhhh, good catch! It actually doesn't depend on the codec. My phone doesn't support bitrates higher than 80 Mbps. Since I didn't set an explicit bitrate, it was taking the one computed by getDefaultBitRate (android/src/main/java/com/mrousavy/camera/core/RecordingSession.kt):
private fun getDefaultBitRate(): Double {
  var baseBitRate = when (size.width * size.height) {
    in 0..640 * 480 -> 2.0
    in 640 * 480..1280 * 720 -> 5.0
    in 1280 * 720..1920 * 1080 -> 10.0
    in 1920 * 1080..3840 * 2160 -> 30.0
    in 3840 * 2160..7680 * 4320 -> 100.0
    else -> 100.0
  }
  baseBitRate = baseBitRate / 30.0 * (fps ?: 30).toDouble()
  if (this.codec == VideoCodec.H265) baseBitRate *= 0.8
  return baseBitRate
}
The video resolution of the format I used is 3648 x 2736. So I had 100 Mbps in h.264 and 80 Mbps in h.265 - that's why it worked in h.265 ^^ I tested 80 Mbps in h.264 and it works, while 81 Mbps in h.265 doesn't.
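For reference, the arithmetic can be checked off-device. This is a stand-alone TypeScript sketch of the same thresholds (an illustration, not the library's actual Kotlin code); it shows why 3648 x 2736 lands on 100 Mbps for h.264 and 80 Mbps for h.265:

```typescript
// Stand-alone re-implementation of getDefaultBitRate()'s arithmetic, in Mbps.
// The 3840*2160..7680*4320 branch and the `else` branch both return 100.0,
// so they are collapsed into a single fallback here.
function defaultBitRate(width: number, height: number, fps: number, h265: boolean): number {
  const pixels = width * height;
  let bitRate: number;
  if (pixels <= 640 * 480) bitRate = 2.0;
  else if (pixels <= 1280 * 720) bitRate = 5.0;
  else if (pixels <= 1920 * 1080) bitRate = 10.0;
  else if (pixels <= 3840 * 2160) bitRate = 30.0;
  else bitRate = 100.0;
  bitRate = (bitRate / 30.0) * fps; // scale linearly with frame rate
  if (h265) bitRate *= 0.8; // H.265 compresses better, so target 80%
  return bitRate;
}

// 3648 * 2736 = 9,980,928 pixels, which is above 3840 * 2160 = 8,294,400:
console.log(defaultBitRate(3648, 2736, 30, false)); // 100 Mbps (rejected by this phone)
console.log(defaultBitRate(3648, 2736, 30, true)); // 80 Mbps (accepted)
```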
From what I've seen, it may be possible to retrieve it from CamcorderProfile. I need to check that.
Thank you very much for your help :)
Ohh, interesting! I'll try to fix the getDefaultBitRate() method by using CamcorderProfile instead of computing it on my own then!
Hey @j-jonathan! I spent some time investigating this and debugging the available CamcorderProfiles.
I created this PR: https://github.com/mrousavy/react-native-vision-camera/pull/2216 which does two things: it now derives the recommended bit-rate from the device's CamcorderProfile, and it multiplies that value by a factor when 'low' (x 0.8) or 'high' (x 1.2) are passed.
Can you please try this PR and see if that now fixes the problem for you? Would be amazing if this now records where it previously failed to record. Thanks for your help! ❤️
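For illustration only, the multiplier logic described above could look roughly like this (applyBitRateOverride is an assumed name, not the actual PR code; only the 0.8 / 1.2 factors come from the PR description):

```typescript
// Sketch of the named bit-rate overrides: scale the CamcorderProfile-recommended
// value down for 'low' and up for 'high'. Names and shape are assumptions.
type BitRateOverride = 'low' | 'high' | undefined;

function applyBitRateOverride(recommendedMbps: number, override: BitRateOverride): number {
  if (override === 'low') return recommendedMbps * 0.8;
  if (override === 'high') return recommendedMbps * 1.2;
  return recommendedMbps; // no override passed: use the recommended value as-is
}

console.log(applyBitRateOverride(100, 'low')); // 80
```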
And I also added some more options: https://github.com/mrousavy/react-native-vision-camera/pull/2225
Released both in VisionCamera 3.6.11! :)
If you appreciate my time and dedication in maintaining and improving this project, please consider 💖 sponsoring me on GitHub 💖 so I can keep improving VisionCamera!
Wow, you're so fast! 😊
I just tested. Unfortunately, I encountered another error.
{"cause": {"message": "Error retrieving camcorder profile params", "stacktrace": "java.lang.RuntimeException: Error retrieving camcorder profile params
at android.media.CamcorderProfile.native_get_camcorder_profile(Native Method)
at android.media.CamcorderProfile.get(CamcorderProfile.java:467)
at com.mrousavy.camera.extensions.RecordingSession_getRecommendedBitRateKt.getRecommendedBitRate(RecordingSession+getRecommendedBitRate.kt:48)
at com.mrousavy.camera.core.RecordingSession.getBitRate(RecordingSession.kt:132)
at com.mrousavy.camera.core.RecordingSession.<init>(RecordingSession.kt:37)
at com.mrousavy.camera.core.CameraSession.startRecording(CameraSession.kt:545)
at com.mrousavy.camera.CameraView_RecordVideoKt.startRecording(CameraView+RecordVideo.kt:44)
at com.mrousavy.camera.CameraViewModule$startRecording$1.invokeSuspend(CameraViewModule.kt:91)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:570)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:677)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:664)
This is due to the 'quality' used (returned by findClosestCamcorderProfileQuality). In my case, it returns CamcorderProfile.QUALITY_4KDCI, and my phone only supports up to CamcorderProfile.QUALITY_2160P.
If I set it to CamcorderProfile.QUALITY_2160P, it works :) and the bitrate is correct! ("Created 3648 x 2736 @ 30 FPS H264 MOV LANDSCAPE_RIGHT 40.0 Mbps RecordingSession (with audio)!")
> In my case, it returns CamcorderProfile.QUALITY_4KDCI, and my phone only supports up to CamcorderProfile.QUALITY_2160P

That is very weird - your Format is 4k ({ ...other, "videoHeight": 2056, "videoWidth": 3648 }), then I guess the format is wrong? It shouldn't even be 4k...
It works with this format { ...other, "videoHeight": 2056, "videoWidth": 3648 } but not with { ...other, "videoHeight": 2736, "videoWidth": 3648 }.
Yup okay thanks! I guess then I should edit the getAvailableCameraDevices() function to not return sizes that don't have a CamcorderProfile.
So if your device supports a maximum CamcorderProfile of QUALITY_2160P, it does not support QUALITY_4KDCI or higher. Meaning:

QUALITY_2160P = 3840 * 2160 = 8,294,400 pixels
QUALITY_4KDCI = 4096 * 2160 = 8,847,360 pixels

Whereas your camera formats are:

3648 * 2056 = 7,500,288 pixels (below QUALITY_2160P, so it's fine)
3648 * 2736 = 9,980,928 pixels (above QUALITY_2160P, so we can't record videos with that)

So I shouldn't even return the 3648 * 2736 format.
Actually, the format with { "videoHeight": 2736, "videoWidth": 3648 } works fine. I just checked the resolution of the recorded video, and it's correct (2736x3648), so we can keep this format.
However, for some reason, the call to CamcorderProfile.get(cameraIdInt, quality) with quality=QUALITY_4KDCI is not supported by my phone.
Maybe it's possible to call CamcorderProfile.hasProfile in findClosestCamcorderProfileQuality to get the closest profile supported by the device 🤔
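One possible shape for that suggestion, sketched off-device (the numeric ladder mirrors the android.media.CamcorderProfile QUALITY_* constants, and the Android hasProfile call is abstracted behind a plain predicate; the names are assumptions, not the library's code):

```typescript
// Qualities ordered from highest to lowest resolution. The numeric values
// mirror CamcorderProfile's QUALITY_* constants, listed here for illustration.
const QUALITY_LADDER = [
  13, // QUALITY_8KUHD
  10, // QUALITY_4KDCI
  8,  // QUALITY_2160P
  6,  // QUALITY_1080P
  5,  // QUALITY_720P
  4,  // QUALITY_480P
];

// Walk down from the closest-matching quality until the device reports
// support; fall back to the lowest entry if nothing matches.
function firstSupportedQuality(closest: number, hasProfile: (q: number) => boolean): number {
  const start = Math.max(QUALITY_LADDER.indexOf(closest), 0);
  for (let i = start; i < QUALITY_LADDER.length; i++) {
    if (hasProfile(QUALITY_LADDER[i])) return QUALITY_LADDER[i];
  }
  return QUALITY_LADDER[QUALITY_LADDER.length - 1];
}

// A device like the one in this thread: QUALITY_4KDCI throws, QUALITY_2160P works.
const supported = new Set([4, 5, 6, 8]);
console.log(firstSupportedQuality(10, (q) => supported.has(q))); // 8 (QUALITY_2160P)
```

On-device, the predicate would wrap CamcorderProfile.hasProfile(cameraId, quality), so an unsupported QUALITY_4KDCI request would degrade to QUALITY_2160P instead of throwing.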
wtf, that's weird. So it says it's not supported, but actually is kinda supported? Maybe it's not supported in full bit-rate?
Just to make sure we understand each other well, I say it's not supported because calling CamcorderProfile.get(cameraIdInt, quality) with quality=QUALITY_4KDCI throws an exception (Error retrieving camcorder profile params ...).
But otherwise, the format ({ "videoHeight": 2736, "videoWidth": 3648 }) is supported up to a bitrate of 80 Mbit/s, like the other formats.
So, I just think it considers this resolution as QUALITY_2160P even if it's not actually the case... For example, the phone doesn't support 4096 * 2160 (but this resolution is not returned in the vision-camera formats, so no issue on this side :))
I am still getting crashes on a Redmi note 112 device (Android 13), with "react-native-vision-camera": "^3.6.12".
const startRecording = useCallback(() => {
  if (cameraRef.current) {
    try {
      cameraRef.current.stopRecording();
      cameraRef.current.startRecording({
        flash: useFlash ? 'on' : 'off',
        videoCodec: 'h264',
        videoBitRate: 'extra-low',
        onRecordingError: error => {
          console.error('Recording failed!', error);
          // onStoppedRecording();
        },
        onRecordingFinished: video => {
          navigation.navigate('photo-stories', {
            pathSource: video.path,
            typeSource: 'video',
          });
        },
      });
    } catch (e) {
      console.error('failed to start recording!', e, 'camera');
    } finally {
      console.log('endStartRecording');
    }
  }
}, [cameraRef, navigation, useFlash]);
Recording failed! {"cause": {"message": "Error retrieving camcorder profile params", "stacktrace": "java.lang.RuntimeException: Error retrieving camcorder profile params
    at android.media.CamcorderProfile.native_get_camcorder_profile(Native Method)
    at android.media.CamcorderProfile.get(CamcorderProfile.java:528)
System:
  OS: macOS 13.3
  CPU: (8) arm64 Apple M1
  Memory: 81.53 MB / 8.00 GB
  Shell: 5.9 - /bin/zsh
Binaries:
  Node: 20.9.0 - /usr/local/bin/node
  Yarn: 1.22.17 - /opt/homebrew/bin/yarn
  npm: 10.1.0 - /opt/homebrew/bin/npm
  Watchman: 2023.11.06.00 - /opt/homebrew/bin/watchman
Managers:
  CocoaPods: 1.12.1 - /opt/homebrew/bin/pod
SDKs:
  iOS SDK:
    Platforms: DriverKit 22.4, iOS 16.4, macOS 13.3, tvOS 16.4, watchOS 9.4
  Android SDK: Not Found
IDEs:
  Android Studio: 2021.1 AI-211.7628.21.2111.8193401
  Xcode: 14.3/14E222b - /usr/bin/xcodebuild
Languages:
  Java: 11.0.16.1 - /usr/bin/javac
npmPackages:
  @react-native-community/cli: Not Found
  react: Not Found
  react-native: Not Found
  react-native-macos: Not Found
npmGlobalPackages:
  react-native: Not Found
@mrousavy I think this error might be related to this issue. I tested on more than one device and am getting the same error. This is what I am getting:
2023-12-03 10:31:11.542 12347-12680 GME ven...amera.provider@2.7-service_64 W [WARN ] camxchinodegme.cpp:4185 ValidateZoomWindow() RealTimeFeatureZSLPreviewRaw_com.qti.node.gme0_cam0 adjCrop (0 0 3072 2880) AdjImageSize (3072 2304)
2023-12-03 10:31:11.542 12347-12680 GME ven...amera.provider@2.7-service_64 W [WARN ] camxchinodegme.cpp:4195 ValidateZoomWindow() RealTimeFeatureZSLPreviewRaw_com.qti.node.gme0_cam0 adjCrop (0 0 3072 2304) AdjImageSize (3072 2304)
2023-12-03 10:31:11.542 12347-12680 GME ven...amera.provider@2.7-service_64 E [ERROR ] camxchinodegme.cpp:6893 GetSensorAppliedCrop() RealTimeFeatureZSLPreviewRaw_com.qti.node.gme0_cam0 Sensor Crop 8192x6144 and sensor full active array is same 8192x6144 (Internal sensor mode is used)
2023-12-03 10:31:11.560 12347-12678 CamX ven...amera.provider@2.7-service_64 W [ WARN][STATS_AF] pdlib.cpp:375 PDLibGetLcrNativePattern() Invalid sensor pattern dimensions width=0, height=0
2023-12-03 10:31:11.561 12347-12678 CamX ven...amera.provider@2.7-service_64 W [ WARN][STATS_AF ] af_core.cpp:579: af_generate_mv_level CID:[0] null pointer
2023-12-03 10:31:11.561 12347-12678 CamX ven...amera.provider@2.7-service_64 W [ WARN][STATS_AF ] af_core_input.cpp:295: af_map_input_params CID:[0] Invalid BG Stats
2023-12-03 10:31:11.564 12347-12675 CamX ven...amera.provider@2.7-service_64 W [ WARN][STATS_AF] pdlib.cpp:375 PDLibGetLcrNativePattern() Invalid sensor pattern dimensions width=0, height=0
2023-12-03 10:31:11.565 12347-12679 CamX ven...amera.provider@2.7-service_64 W [ WARN][STATS_AEC] caecxctrlcorecommon.h:847: MapMultiCamAuxInfo CID:[0] CID:0, Aux Info is not found
2023-12-03 10:31:11.571 12347-12676 GME ven...amera.provider@2.7-service_64 W [WARN ] camxchinodegme.cpp:4185 ValidateZoomWindow() RealTimeFeatureZSLPreviewRaw_com.qti.node.gme0_cam0 adjCrop (0 0 3072 2880) AdjImageSize (3072 2304)
2023-12-03 10:31:11.571 12347-12676 GME ven...amera.provider@2.7-service_64 W [WARN ] camxchinodegme.cpp:4195 ValidateZoomWindow() RealTimeFeatureZSLPreviewRaw_com.qti.node.gme0_cam0 adjCrop (0 0 3072 2304) AdjImageSize (3072 2304)
2023-12-03 10:31:11.571 12347-12676 GME ven...amera.provider@2.7-service_64 E [ERROR ] camxchinodegme.cpp:6893 GetSensorAppliedCrop() RealTimeFeatureZSLPreviewRaw_com.qti.node.gme0_cam0 Sensor Crop 8192x6144 and sensor full active array is same 8192x6144 (Internal sensor mode is used)
2023-12-03 10:31:11.580 7971-7971 CameraView com.app.debug D Finding view 6095...
2023-12-03 10:31:11.580 7971-7971 CameraView com.app.debug D Found view 6095!
2023-12-03 10:31:11.585 7971-8284 libc com.app.debug W Access denied finding property "vendor.camera.aux.logicalCamPackagelist"
2023-12-03 10:31:11.586 7971-8284 CamcorderProfile com.app.debug I Closest matching CamcorderProfile: 8
2023-12-03 10:31:11.589 7971-8612 ReactNativeJS com.app.debug I { userInfo: null,
message: 'An unknown error occurred while trying to start a video recording! Attempt to invoke virtual method \'int android.media.EncoderProfiles$VideoProfile.getBitrate()\' on a null object reference',
cause:
{ stacktrace: 'java.lang.NullPointerException: Attempt to invoke virtual method \'int android.media.EncoderProfiles$VideoProfile.getBitrate()\' on a null object reference\n\tat com.mrousavy.camera.extensions.RecordingSession_getRecommendedBitRateKt.getRecommendedBitRate(RecordingSession+getRecommendedBitRate.kt:37)\n\tat com.mrousavy.camera.core.RecordingSession.getBitRate(RecordingSession.kt:132)\n\tat com.mrousavy.camera.core.RecordingSession.<init>(RecordingSession.kt:37)\n\tat com.mrousavy.camera.core.CameraSession.startRecording(CameraSession.kt:532)\n\tat com.mrousavy.camera.CameraView_RecordVideoKt.startRecording(CameraView+RecordVideo.kt:44)\n\tat com.mrousavy.camera.CameraViewModule$startRecording$1.invokeSuspend(CameraViewModule.kt:91)\n\tat kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)\n\tat kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)\n\tat kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)\n',
message: 'Attempt to invoke virtual method \'int android.media.EncoderProfiles$VideoProfile.getBitrate()\' on a null object reference' },
code: 'capture/unknown' }
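The NPE above suggests getVideoProfiles() contained a null entry on that device. A defensive lookup could be sketched like this (the types are hypothetical stand-ins for the android.media classes, not the real API):

```typescript
// Hypothetical stand-ins for android.media.EncoderProfiles / VideoProfile,
// only to illustrate the null-safe pattern.
interface VideoProfile { bitrate: number }
interface EncoderProfiles { videoProfiles: (VideoProfile | null)[] }

// Instead of profiles.videoProfiles[0].bitrate (the access that throws when
// the entry is null), skip null entries and fall back to a default bit-rate.
function recommendedBitRate(profiles: EncoderProfiles | null, fallbackBps: number): number {
  const profile = profiles?.videoProfiles.find((p) => p != null);
  return profile?.bitrate ?? fallbackBps;
}

console.log(recommendedBitRate({ videoProfiles: [null] }, 10_000_000)); // 10000000
```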
Code to reproduce, using a new screen in the Example app:
import { View, Text, Button, StyleSheet } from 'react-native'
import React, { useRef, useState } from 'react'
import { useCameraDevice } from '../../src/hooks/useCameraDevice'
import { Camera } from '../../src/Camera'
const TestScreen = () => {
const camera = useRef<Camera>(null)
const device = useCameraDevice('back')
const [isRecording, setIsRecording] = useState(false)
const [startRecordingTime, setStartRecordingTime] = useState(0)
const [duration, setDuration] = useState(0)
const toggleRecording = async () => {
if (camera.current == null) return
if (isRecording) {
setIsRecording(false)
await camera.current.stopRecording().catch(console.log)
return
}
setIsRecording(true)
setStartRecordingTime(new Date().getTime())
camera.current.startRecording({
onRecordingFinished({ path, duration: recordingDuration }) {
if (recordingDuration > 60) return
const video = {
thumbnail: '000000000_0000000000000.jpg',
localPath: path,
mime: 'mp4',
isUploading: true,
}
setIsRecording(false)
},
onRecordingError(e) {
console.log(e)
},
})
}
return (
<View
style={{
flex: 1,
backgroundColor: 'black',
}}>
{device !== undefined ? (
<Camera
onInitialized={() => console.log('inti8')}
ref={camera}
style={{ flex: 1 }}
device={device}
isActive={true}
video={true}
audio={true}
/>
) : (
<Text>no device</Text>
)}
<Button title="Toggle recording" onPress={toggleRecording} />
</View>
)
}
export default TestScreen
@mrousavy @j-jonathan Is this issue resolved? I am still getting it on my Realme 11 Pro. On iOS it works fine. Can you please help? It is really urgent.
import {
  View,
  Text,
  Image,
  Platform,
  KeyboardAvoidingView,
  Animated,
  NativeModules,
  NativeTouchEvent,
  Dimensions,
} from 'react-native'
import React, { FC, useCallback, useEffect, useRef, useState } from 'react'
import { Modal } from 'react-native'
import {
  CameraRuntimeError,
  CameraDevice,
  PhotoFile,
  useCameraDevice,
  useCameraFormat,
  useFrameProcessor,
  VideoFile,
  Camera,
  TakePhotoOptions,
  Point,
} from 'react-native-vision-camera'
import { StyleSheet } from 'react-native'
import Reanimated, { Extrapolate, interpolate, useAnimatedGestureHandler, useAnimatedProps, useSharedValue } from 'react-native-reanimated'
import { TouchableOpacity } from 'react-native'
import Colors from '../constant/colors'
import CustomInput from './customInput'
import PhotoEditor from 'react-native-photo-editor'
import CustomText from './customText'
import { commonStyle } from '../constant/commonStyle'
import RNFS from 'react-native-fs'
import Entypo from 'react-native-vector-icons/Entypo'
import Ionicons from 'react-native-vector-icons/Ionicons'
import uuid from 'react-native-uuid'
import Feather from 'react-native-vector-icons/Feather'
import SimpleBtn from './simpleBtn'
import GradientBtn from './gradientBtn'
import { uploadStory } from '../services/api'
import { usePreferredCameraDevice } from '../assets/hooks/usePreferredCameraDevice'
import { CaptureButton } from './CaptureButton'
import Video from 'react-native-video'
import { showEditor } from 'react-native-video-trim'
import { NativeEventEmitter } from 'react-native'
// import VideoTrimmer from './videoTrimer'
// import Animated from 'react-native-reanimated'
// const VideoTrim = NativeModules.VideoTrim

const { RNPhotoEditor } = NativeModules

interface CameraComponentProps {
  navigation?: any
  setShowCamera: (param: boolean) => void
  showCamera: boolean
}

const { width, height, scale } = Dimensions.get('window')
// original had `height-150 / width`, which divides 150 by width first
const screenAspectRatio = (height - 150) / width
const generateRandomImagePath = (type?: 'photo' | 'video' | '') => {
  const randomString = uuid.v4().toString()
  if (type == 'video') {
    return `${RNFS.DocumentDirectoryPath}/${randomString}.mov`
  } else {
    return `${RNFS.DocumentDirectoryPath}/${randomString}.jpg`
  }
}
const ReanimatedCamera = Reanimated.createAnimatedComponent(Camera)
Reanimated.addWhitelistedNativeProps({
zoom: true,
})
const CameraComponent: FC<CameraComponentProps> = ({ navigation, setShowCamera, showCamera }) => {
const format = useCameraFormat(device, [
// { fps: targetFps },
{ videoAspectRatio: screenAspectRatio },
{ videoResolution: {width, height:height-150} },
{ photoAspectRatio: screenAspectRatio },
// { photoResolution: 'max' },
])
useEffect(() => {
const imageTargetValue = mode === 'picture' ? 0 : 1;
const videoTargetValue = mode === 'video' ? 0 : 1;
// Animate the text position when mode changes
Animated.parallel([
Animated.timing(imageSlideAnimation, {
toValue: imageTargetValue,
duration: 300,
useNativeDriver: false,
}),
Animated.timing(videoSlideAnimation, {
toValue: videoTargetValue,
duration: 300,
useNativeDriver: false,
}),
]).start();
console.log(mode)
}, [mode]);
const imageStyle = {
margin: 10,
color: 'blue',
left: imageSlideAnimation.interpolate({
inputRange: [0, 1],
outputRange: ['50%', '0%'],
}),
};
const videoStyle = {
margin: 10,
color: 'blue',
left: videoSlideAnimation.interpolate({
inputRange: [0, 1],
outputRange: ['0%', '50%'],
}),
};
useEffect(() => {
const eventEmitter = new NativeEventEmitter(NativeModules.VideoTrim);
const subscription = eventEmitter.addListener('VideoTrim', (event) => {
switch (event.name) {
case 'onShow': {
console.log('onShowListener', event);
SetVideoEditing(true)
break;
}
case 'onHide': {
console.log('onHide', event);
SetVideoEditing(false)
break;
}
case 'onStartTrimming': {
console.log('onStartTrimming', event);
// setEditedImagePath(event.outputPath)
break;
}
case 'onFinishTrimming': {
console.log('onFinishTrimming', event);
const newPath =generateRandomImagePath('video')
setEditedImagePath(newPath);
RNFS.copyFile( event.outputPath, newPath)
setCapturedImage({ ...capturedImage,uri:`file://${newPath}` , path: `file://${newPath}` });
// setCapturedImage((state:any)=>({...state,uri: event.outputPath}))
break;
}
case 'onCancelTrimming': {
console.log('onCancelTrimming', event);
break;
}
case 'onError': {
console.log('onError', event);
break;
}
}
});
return () => {
subscription.remove();
};
}, []);
const EditVedio = async(videoPath:string)=>{
console.log('EditVedio')
const path = await showEditor(videoPath,{
maxDuration:30,
saveToPhoto:false
})
}
const touchII = async (event: NativeTouchEvent) => {
let point: Point = {
x: Math.round(event.pageX - camLocation.x),
y: Math.round(event.pageY - camLocation.y),
};
await camera?.current?.focus(point)
};
const onPress = ()=>{
// const editedImagePath = Math?.random();
// const editedImagePath1 =
// 'file:///storage/emulated/0/Android/data/com.hotspotmeet/files/Pictures/' +
// editedImagePath +
// '.jpg';
// PhotoEditor.Edit({
// path:capturedImage.path,
// // hiddenControls:[ 'save','share'],
// onDone:(path)=>{
// setCapturedImage({...capturedImage,uri:editedImagePath1})
// // setCapturedImage
// console.log(`patthhhh`,path,capturedImage)
// RNFS.copyFile(path, editedImagePath1);
// },
// onCancel:(path)=>{
// console.log('path',path)
// },
// });
// RNPhotoEditor.Edit(
// {path:'file://' +capturedImage},
// (path)=>{
// setCapturedImage(path)
// console.log('path',path)
// },
// (path)=>{
// console.log('path',path)
// },
// )
PhotoEditor.Edit({
hiddenControls:[ 'save','share'],
path: capturedImage.path,
onDone: (path) => {
const newPath =generateRandomImagePath()
setEditedImagePath(newPath);
RNFS.copyFile(path, newPath)
.then(() => {
// const newPath = generateRandomImagePath()
// setEditedImagePath(newPath);
setCapturedImage({ ...capturedImage, uri: `file://${newPath}` });
})
.catch((error) => {
console.error('Error copying file:', error);
});
},
onCancel: (path) => {
console.log('Photo edit cancelled. Path:', path);
},
});
}
const onMediaCaptured = useCallback(
(media: PhotoFile | VideoFile, type: 'photo' | 'video',) => {
// console.log(`Media captured! ${JSON.stringify(media) , type}`)
console.log('media.path',media.path)
const newPath = generateRandomImagePath(type)
setEditedImagePath(newPath);
setMediaType(type)
RNFS.copyFile(media.path, newPath)
setCapturedImage({...media,path: `file://${newPath}`,uri: `file://${newPath}`})
if(type == 'video'){
EditVedio(`file://${newPath}`)
}
},
[navigation],
)
// const capturePic =async ()=>{
// setEditedImagePath(generateRandomImagePath());
// RNFS.copyFile(photo.path, editedImagePath)
// setCapturedImage({...photo,uri: `file://${editedImagePath}`})
// }
const uploadFile = () =>{
let formData = new FormData();
formData.append('stories', {
uri: capturedImage.uri,
name:`storyUpload.jpeg`,
type: "image/jpeg"
});
console.log(formData)
uploadStory(formData)
.then(res=>{
console.log(res)
})
.catch(err=>{
console.log(err)
})
}
const onInitialized = useCallback(() => {
console.log('Camera initialized!')
setIsCameraInitialized(true)
}, [])
const cameraAnimatedProps = useAnimatedProps(() => {
const z = Math.max(Math.min(zoom.value, maxZoom), minZoom)
return {
zoom: z,
}
}, [maxZoom, minZoom, zoom])
const setIsPressingButton = useCallback(
(_isPressingButton: boolean) => {
isPressingButton.value = _isPressingButton
},
[isPressingButton],
)
return (
<Modal
// transparent
animationType="fade"
onRequestClose={() => {
setShowCamera(false)
setCapturedImage(null)
}}
visible={showCamera}
>
<KeyboardAvoidingView behavior={Platform.OS =='ios'? 'padding':'' } className='flex-1'>
{capturedImage ?
<View
style={{
flex:1,
backgroundColor:Colors.white,
// justifyContent:'center',
// borderWidth:1
}}
>
{/* {console.log(capturedImage.uri)} */}
<View style={{
// borderWidth:2,
backgroundColor:Colors.grey,
flex:1
}}>
{console.log('capturedImage',capturedImage)}
{mediaType === 'video' && !VedioEditing ?
<>
<Video
source={ capturedImage}
// currentTime={}
// controls={true}
style={{
height:height-150,
// borderWidth:2,
// aspectRatio:screenAspectRatio,
width:width
}}
// paused={isVideoPaused}
resizeMode="cover"
posterResizeMode="cover"
allowsExternalPlayback={false}
automaticallyWaitsToMinimizeStalling={false}
// disableFocus={true}
repeat={true}
useTextureView={false}
controls={true}
playWhenInactive={true}
ignoreSilentSwitch="ignore"
// onReadyForDisplay={onMediaLoadEnd}
// onLoad={onMediaLoad}
// onError={onMediaLoadError}
/>
{/* <VideoTrimmer
videoUri={capturedImage.uri}
/> */}
</>
:
<Image
style={{
height:height-150,
// aspectRatio:screenAspectRatio,
width:width
}}
// resizeMode='contain'
source={{uri: capturedImage.uri}}/>
}
</View>
<View
style={{
height:150,
// borderWidth:1
padding:10,
justifyContent:'space-between',
paddingBottom:Platform.OS == 'ios' ? 20:10
}}
>
<View >
<CustomInput style={[commonStyle.inputStyle, { borderColor: Colors.borderColor }]} placeholder='Add caption'/>
</View>
<View className='flex-row gap-[15]' >
<View className='flex-1 '>
<SimpleBtn
// onPress={() => { props?.navigation.navigate('newSubscrition')}}
title={"Cancel"}
/>
</View>
<View className='flex-1'>
<GradientBtn
onPress={ () => uploadFile() }
title={ "Post Story" }
/>
</View>
</View>
</View>
<View
style={[
// commonStyle.shadow,
{
position:'absolute',
top:Platform.OS == 'ios' ? 60 : 20,
right:25,
left:25,
paddingHorizontal:10,
borderRadius:99,
justifyContent:'space-between',
flexDirection:'row',
}]}
>
<TouchableOpacity
style={[
commonStyle.backshadowButton,
{
// borderWidth:1,
// backgroundColor:Colors.white
}
]}
onPress={()=>setCapturedImage(null)}
>
<Entypo name='cross' color={Colors.black} size={30}/>
</TouchableOpacity>
{mediaType == 'photo' &&
<TouchableOpacity
onPress={onPress}
style={[
commonStyle.backshadowButton,
// commonStyle.shadow,
// {
// backgroundColor:Colors.white,
// paddingHorizontal:20,
// paddingVertical:5,
// borderRadius:5}
]}
>
<Feather name='edit-2' size={20} />
</TouchableOpacity>}
</View>
</View>
:
<>
<View
style={{
flex:1
}}>
{device &&
<View
style={{
flex:1,
backgroundColor:Colors.white,
justifyContent:'center',
}}
>
<View
style={{
// justifyContent:'center',
// maxHeight:528,
flex:1
}}
>
<ReanimatedCamera
style={[{flex:1,width:'100%'}]}
orientation="portrait"
device={device}
format={format}
isActive={true}
photo={true}
resizeMode={'cover'}
video={true}
ref={camera}
// focusable={true}
onInitialized={onInitialized}
onLayout={event => {
if(isCameraInitialized){
const layout = event.nativeEvent.layout;
setCamLocation({x: layout.x, y: layout.y});
}
}}
onTouchEnd={x => device.supportsFocus && touchII(x.nativeEvent)}
/>
</View>
<View style={{
// position:'',
justifyContent:'center',
// alignItems:'center',
// bottom:100,
width:"100%",
height:150
}}>
{/* <View style={{alignItems:'center',}}>
<TouchableOpacity
style={[commonStyle.shadowButton,{
width:50,
height:50,
backgroundColor:Colors.white,
borderRadius:100,
borderWidth:1
}]}
onPress={capturePic}
>
</TouchableOpacity>
</View> */}
<CaptureButton
style={{
// position: 'absolute',
alignSelf: 'center',
// borderWidth:1
// bottom: SAFE_AREA_PADDING.paddingBottom,
}}
camera={camera}
onMediaCaptured={onMediaCaptured}
cameraZoom={zoom}
minZoom={minZoom}
maxZoom={maxZoom}
flash={flashOn ? 'on':'off'}
enabled={isCameraInitialized}
setIsPressingButton={setIsPressingButton}
/>
</View>
</View>
}
<View
style={[
{
position:'absolute',
top:Platform.OS == 'ios' ? 60 : 20,
right:25,
left:25,
paddingHorizontal:10,
borderRadius:99,
justifyContent:'space-between',
flexDirection:'row',
// borderWidth:2,
// borderColor:'white'
}]}
>
<TouchableOpacity onPress={()=>setShowCamera(false)} style={ commonStyle.backshadowButton}>
<Entypo name='cross' color={Colors.black} size={30}/>
</TouchableOpacity>
<View>
<TouchableOpacity onPress={()=>setFlashon(state=>!state)} style={[
commonStyle.backshadowButton
]}>
{flashOn ?
<Ionicons name='flash' color={Colors.black} size={25}/>
:
<Ionicons name='flash-off' color={Colors.black} size={25}/>
}
</TouchableOpacity>
<TouchableOpacity onPress={onFlipCameraPressed} style={[
commonStyle.backshadowButton,
{marginTop:20}
]}>
<Ionicons name="camera-reverse" color={Colors.black} size={24} />
</TouchableOpacity>
</View>
</View>
</View>
</>
}
</KeyboardAvoidingView>
</Modal>
)
}
export default CameraComponent
CaptureButton component:

import React, { useCallback, useMemo, useRef } from 'react'
import { Dimensions, StyleSheet, TouchableOpacity, View, ViewProps } from 'react-native'
import {
  PanGestureHandler,
  PanGestureHandlerGestureEvent,
  State,
  TapGestureHandler,
  TapGestureHandlerStateChangeEvent,
} from 'react-native-gesture-handler'
import Reanimated, {
  cancelAnimation,
  Easing,
  Extrapolate,
  interpolate,
  useAnimatedStyle,
  withSpring,
  withTiming,
  useAnimatedGestureHandler,
  useSharedValue,
  withRepeat,
} from 'react-native-reanimated'
import type { Camera, PhotoFile, TakePhotoOptions, VideoFile } from 'react-native-vision-camera'
// import { CAPTURE_BUTTON_SIZE, SCREEN_HEIGHT, SCREEN_WIDTH } from './../Constants'

const { width: SCREEN_WIDTH, height: SCREEN_HEIGHT, scale } = Dimensions.get('window')
const PAN_GESTURE_HANDLER_FAIL_X = [SCREEN_WIDTH, SCREEN_HEIGHT]
const PAN_GESTURE_HANDLER_ACTIVE_Y = [-2, 2]
const CAPTURE_BUTTON_SIZE = 10
const START_RECORDING_DELAY = 200
const BORDER_WIDTH = 10 * 0.1
interface Props extends ViewProps {
camera: React.RefObject<Camera>
onMediaCaptured: (media: PhotoFile | VideoFile, type: 'photo' | 'video') => void
minZoom: number
maxZoom: number
cameraZoom: Reanimated.SharedValue<number>
flash: 'off' | 'on'
enabled: boolean
setIsPressingButton: (isPressingButton: boolean) => void
}
const _CaptureButton: React.FC<Props> = ({ camera, onMediaCaptured, minZoom, maxZoom, cameraZoom, flash, enabled, setIsPressingButton, style, ...props }) => {
//#region Camera Capture
const takePhoto = useCallback(async () => {
try {
if (camera.current == null) throw new Error('Camera ref is null!')
console.log('Taking photo...')
const photo = await camera.current.takePhoto(takePhotoOptions)
onMediaCaptured(photo, 'photo')
} catch (e) {
console.error('Failed to take photo!', e)
}
}, [camera, onMediaCaptured, takePhotoOptions])
const onStoppedRecording = useCallback(() => {
isRecording.current = false
cancelAnimation(recordingProgress)
console.log('stopped recording video!')
}, [recordingProgress])
const stopRecording = useCallback(async (diff : number) => {
try {
if (camera.current == null) throw new Error('Camera ref is null!')
console.log('calling stopRecording()...')
await camera.current.stopRecording()
console.log('called stopRecording()!')
} catch (e) {
console.error('failed to stop recording!', e)
}
}, [camera])
const startRecording = useCallback(() => {
try {
if (camera.current == null) throw new Error('Camera ref is null!')
console.log('calling startRecording()...')
camera.current.startRecording({
flash: flash,
onRecordingError: (error) => {
console.error('Recording failed!', error)
onStoppedRecording()
},
onRecordingFinished: (video) => {
console.log(`Recording successfully finished! `,video)
onMediaCaptured(video, 'video')
onStoppedRecording()
},
})
// TODO: wait until startRecording returns to actually find out if the recording has successfully started
console.log('called startRecording()!')
isRecording.current = true
} catch (e) {
console.error('failed to start recording!', e, 'camera')
}
}, [camera, flash, onMediaCaptured, onStoppedRecording])
//#endregion
//#region Tap handler
const tapHandler = useRef<TapGestureHandler>()
const onHandlerStateChanged = useCallback(
async ({ nativeEvent: event }: TapGestureHandlerStateChangeEvent) => {
console.debug(`state: ${Object.keys(State)[event.state]}`)
switch (event.state) {
case State.BEGAN: {
recordingProgress.value = 0
isPressingButton.value = true
const now = new Date()
pressDownDate.current = now
setTimeout(() => {
if (pressDownDate.current === now) {
startRecording()
}
}, START_RECORDING_DELAY)
setIsPressingButton(true)
return
}
case State.END:
case State.FAILED:
case State.CANCELLED: {
try {
if (pressDownDate.current == null) throw new Error('PressDownDate ref .current was null!')
const now = new Date()
const diff = now.getTime() - pressDownDate.current.getTime()
pressDownDate.current = undefined
if (diff < START_RECORDING_DELAY) {
await takePhoto()
} else {
await stopRecording(diff)
}
} finally {
setTimeout(() => {
isPressingButton.value = false
setIsPressingButton(false)
}, 500)
}
return
}
default:
break
}
},
[isPressingButton, recordingProgress, setIsPressingButton, startRecording, stopRecording, takePhoto],
)
//#endregion
//#region Pan handler
const panHandler = useRef<PanGestureHandler>()
const onPanGestureEvent = useAnimatedGestureHandler<PanGestureHandlerGestureEvent, { offsetY?: number; startY?: number }>({
onStart: (event, context) => {
context.startY = event.absoluteY
const yForFullZoom = context.startY * 0.7
const offsetYForFullZoom = context.startY - yForFullZoom
context.offsetY = interpolate(cameraZoom.value, [minZoom, maxZoom], [0, offsetYForFullZoom], Extrapolate.CLAMP)
},
onActive: (event, context) => {
const offset = context.offsetY ?? 0
const startY = context.startY ?? SCREEN_HEIGHT
const yForFullZoom = startY * 0.7
cameraZoom.value = interpolate(event.absoluteY - offset, [yForFullZoom, startY], [maxZoom, minZoom], Extrapolate.CLAMP)
},
})
//#endregion
const shadowStyle = useAnimatedStyle(
() => ({
transform: [
{
scale: withSpring(isPressingButton.value ? 1 : 0, {
mass: 1,
damping: 35,
stiffness: 300,
}),
},
],
}),
[isPressingButton],
)
const buttonStyle = useAnimatedStyle(() => {
let scale: number
if (enabled) {
if (isPressingButton.value) {
scale = withRepeat(
withSpring(1, {
stiffness: 100,
damping: 1000,
}),
-1,
true,
)
} else {
scale = withSpring(0.9, {
stiffness: 500,
damping: 300,
})
}
} else {
scale = withSpring(0.6, {
stiffness: 500,
damping: 300,
})
}
return {
opacity: withTiming(enabled ? 1 : 0.3, {
duration: 100,
easing: Easing.linear,
}),
transform: [
{
scale: scale,
},
],
}
}, [enabled, isPressingButton])
const onPressOut = useCallback(async () => {
try {
if (pressDownDate.current == null) throw new Error('PressDownDate ref .current was null!');
const now = new Date();
const diff = now.getTime() - pressDownDate.current.getTime();
pressDownDate.current = undefined;
if (diff < START_RECORDING_DELAY) {
await takePhoto();
} else {
await stopRecording(diff);
}
} finally {
setTimeout(() => {
isPressingButton.value = false;
setIsPressingButton(false);
}, 500);
}
}, [isPressingButton, pressDownDate, setIsPressingButton, stopRecording, takePhoto]);
const onPressIn = useCallback(() => {
recordingProgress.value = 0;
isPressingButton.value = true;
const now = new Date();
pressDownDate.current = now;
setTimeout(() => {
console.log(pressDownDate.current, now);
if (pressDownDate.current === now) {
startRecording();
}
}, START_RECORDING_DELAY);
setIsPressingButton(true);
}, [isPressingButton, recordingProgress, setIsPressingButton, startRecording]);
return (
<>
{/* <TapGestureHandler
enabled={enabled}
ref={tapHandler}
onHandlerStateChange={onHandlerStateChanged}
shouldCancelWhenOutside={false}
maxDurationMs={120000}
simultaneousHandlers={panHandler}>
<Reanimated.View {...props} style={[buttonStyle, style]}>
<PanGestureHandler
enabled={enabled}
ref={panHandler}
// failOffsetX={PAN_GESTURE_HANDLER_FAIL_X}
// activeOffsetY={PAN_GESTURE_HANDLER_ACTIVE_Y}
onGestureEvent={onPanGestureEvent}
simultaneousHandlers={tapHandler}>
<Reanimated.View style={styles.flex}>
<Reanimated.View style={[styles.shadow, shadowStyle]} />
<View style={styles.button} />
</Reanimated.View>
</PanGestureHandler>
<View style={styles.button} />
</Reanimated.View>
</TapGestureHandler> */}
<TouchableOpacity
activeOpacity={1}
onPressIn={onPressIn}
onPressOut={onPressOut}
// style={[buttonStyle, style]}
>
{/* <Reanimated.View style={[styles.shadow, shadowStyle]} /> */}
<View style={styles.button} />
</TouchableOpacity>
</>
)
}
export const CaptureButton = React.memo(_CaptureButton)
const styles = StyleSheet.create({
flex: {
flex: 1,
},
shadow: {
position: 'absolute',
width: CAPTURE_BUTTON_SIZE,
height: CAPTURE_BUTTON_SIZE,
borderRadius: CAPTURE_BUTTON_SIZE / 2,
backgroundColor: '#e34077',
},
button: {
width: 70,
height: 70,
borderRadius: 999 / 2,
borderWidth: 1,
borderColor: 'black',
},
})
`
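The tap-vs-hold decision in `onPressIn`/`onPressOut` above comes down to comparing the press duration against `START_RECORDING_DELAY`. A minimal standalone sketch of just that decision (the constant's value here is a placeholder assumption, not taken from the component):

```typescript
// Sketch of the tap-vs-hold decision used by the capture button above.
// START_RECORDING_DELAY is assumed from the component; 200 ms is a placeholder.
const START_RECORDING_DELAY = 200;

type CaptureAction = 'take-photo' | 'stop-recording';

// A press shorter than the delay is treated as a tap (take a photo);
// anything longer means a recording was started and must be stopped.
function resolvePressAction(pressDownAt: Date, pressUpAt: Date): CaptureAction {
  const diff = pressUpAt.getTime() - pressDownAt.getTime();
  return diff < START_RECORDING_DELAY ? 'take-photo' : 'stop-recording';
}
```

This mirrors the `diff < START_RECORDING_DELAY` branch that appears in both `onHandlerStateChanged` and `onPressOut` in the component above.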
package.json
{ "name": "radarapp03215", "version": "0.0.1", "private": true, "scripts": { "android": "react-native run-android", "ios": "react-native run-ios", "lint": "eslint .", "start": "react-native start", "test": "jest" }, "dependencies": { "@gorhom/bottom-sheet": "^4.5.1", "@invertase/react-native-apple-authentication": "^2.2.2", "@ptomasroos/react-native-multi-slider": "^2.2.2", "@react-native-async-storage/async-storage": "^1.19.3", "@react-native-clipboard/clipboard": "^1.12.1", "@react-native-community/datetimepicker": "^7.6.1", "@react-native-community/geolocation": "^3.1.0", "@react-native-community/push-notification-ios": "^1.11.0", "@react-native-firebase/app": "^18.5.0", "@react-native-firebase/messaging": "^18.6.1", "@react-native-google-signin/google-signin": "^10.0.1", "@react-native-masked-view/masked-view": "^0.3.0", "@react-navigation/bottom-tabs": "^6.5.10", "@react-navigation/native": "^6.1.8", "@react-navigation/stack": "^6.3.18", "@reduxjs/toolkit": "^1.9.6", "@rneui/base": "^0.0.0-edge.2", "@rneui/themed": "^0.0.0-edge.2", "@rnmapbox/maps": "^10.0.15", "axios": "^1.5.1", "formik": "^2.4.5", "i": "^0.3.7", "nativewind": "^2.0.11", "npm": "^10.2.4", "react": "18.2.0", "react-input-emoji": "^5.6.5", "react-native": "^0.72.5", "react-native-country-codes-picker": "^2.3.4", "react-native-draggable": "^3.3.0", "react-native-drop-shadow": "^0.0.9", "react-native-elements": "^3.4.3", "react-native-fbsdk-next": "^12.1.0", "react-native-fs": "^2.20.0", "react-native-geolocation-service": "^5.3.1", "react-native-gesture-handler": "^2.13.1", "react-native-image-crop-picker": "^0.40.1", "react-native-linear-gradient": "^2.8.3", "react-native-mmkv": "^2.11.0", "react-native-modal": "^13.0.1", "react-native-otp-inputs": "^7.4.0", "react-native-otp-textinput": "^1.1.3", "react-native-paper": "^5.11.1", "react-native-permissions": "^3.10.1", "react-native-photo-editor": "^1.0.13", "react-native-progress": "^5.0.1", "react-native-push-notification": "^8.1.1", 
"react-native-raw-bottom-sheet": "^2.2.0", "react-native-reanimated": "^3.2.0", "react-native-safe-area-context": "^4.7.2", "react-native-screens": "^3.25.0", "react-native-sectioned-multi-select": "^0.10.0", "react-native-select-dropdown": "^3.4.0", "react-native-splash-screen": "^3.3.0", "react-native-story-view": "^0.0.3", "react-native-svg": "^13.14.0", "react-native-svg-transformer": "^1.1.0", "react-native-swiper": "^1.6.0", "react-native-swiper-flatlist": "^3.2.3", "react-native-uuid": "^2.0.1", "react-native-vector-icons": "^10.0.0", "react-native-video": "^5.2.1", "react-native-video-cache-control": "^1.2.2", "react-native-video-trim": "^1.0.8", "react-native-vision-camera": "^3.6.13", "react-redux": "^8.1.2", "redux": "^4.2.1", "rn-fetch-blob": "^0.12.0", "socket.io-client": "^4.7.2", "tailwindcss": "3.3.2", "yup": "^1.3.2" }, "devDependencies": { "@babel/core": "^7.20.0", "@babel/preset-env": "^7.20.0", "@babel/runtime": "^7.20.0", "@react-native/eslint-config": "^0.72.2", "@react-native/metro-config": "^0.72.11", "@tsconfig/react-native": "^3.0.0", "@types/react": "^18.0.24", "@types/react-native-vector-icons": "^6.4.15", "@types/react-native-video": "^5.0.16", "@types/react-test-renderer": "^18.0.0", "babel-jest": "^29.2.1", "eslint": "^8.19.0", "jest": "^29.2.1", "metro-react-native-babel-preset": "0.76.8", "prettier": "^2.4.1", "react-test-renderer": "18.2.0", "typescript": "4.8.4" }, "engines": { "node": ">=16" } }
@mrousavy I think this is because some devices just don't use Camera2 yet; they use V3. I think we need to add a check for hardware support.
We are stuck due to this issue. Would a temporary downgrade to a previous version be a viable solution to ensure functionality? I upgraded from v2 to v3 due to an iPhone 15 crash, but am facing this camcorder-profile issue in v3.
@ZakirBangash
Use settings like these, with a low resolution and frame rate, and it will work:
const device = useCameraDevice('back');
const format = useCameraFormat(device, [ { videoResolution: { width: 1280, height: 720 } }, { fps: 30 } ])
@ZakirBangash
Use settings like these, with a low resolution and frame rate, and it will work:
const device = useCameraDevice('back');
const format = useCameraFormat(device, [ { videoResolution: { width: 1280, height: 720 } }, { fps: 30 } ])
@ZakirBangash Also set pixelFormat: 'native' and you are good to go.
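The workaround above boils down to passing `useCameraFormat` a filter that caps resolution and frame rate, so the default bitrate computed from them stays within what the device's encoder supports. A small sketch of building that filter (the shape matches the `useCameraFormat` calls quoted above; the helper name is mine, not part of the library):

```typescript
// Builds the format-filter array passed to VisionCamera's useCameraFormat,
// as suggested above: cap the video resolution and frame rate so the
// encoder's default bitrate stays within what the device supports.
interface VideoResolutionFilter { videoResolution: { width: number; height: number } }
interface FpsFilter { fps: number }
type FormatFilter = VideoResolutionFilter | FpsFilter;

function buildLowBitrateFilter(width = 1280, height = 720, fps = 30): FormatFilter[] {
  return [{ videoResolution: { width, height } }, { fps }];
}

// Usage inside a component, per the comments above:
// const format = useCameraFormat(device, buildLowBitrateFilter());
```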
Thank you so much guys 💖 The suggested format works, but only on Android; it didn't work on iOS, so I only enable the format for Android.
@ZakirBangash
Use settings like these, with a low resolution and frame rate, and it will work:
const device = useCameraDevice('back');
const format = useCameraFormat(device, [ { videoResolution: { width: 1280, height: 720 } }, { fps: 30 } ])
Any news? This solution did not work for me.
@rodrigodmpa I installed the latest version of the library and it started working.
@rodrigodmpa Has this issue been resolved on your end? I'm also encountering the same problem here, and my version is already the latest, 3.6.17.
"react-native-vision-camera": "^3.6.17" "react": "18.2.0", "react-native": "0.72.7",
Galaxy S21 SM-G9910, Android 13
^ Same here. We're at the latest version and still getting this error
Also seeing this bug on VisionCamera 3.7.0 released yesterday
Same problem here. I patched my local copy to only recommend the closest profile supported by my phone and that seemed to resolve my issue. Here is the version I'm using locally:
private fun findClosestCamcorderProfileQuality(resolution: Size): Int {
// First, make sure that any profiles we use are supported by this camera:
val allowedProfiles = (CamcorderProfile.QUALITY_QCIF..CamcorderProfile.QUALITY_8KUHD)
.filter { profile -> CamcorderProfile.hasProfile(profile) }
// Iterate through all supported CamcorderProfiles and find the one that matches the closest
val targetResolution = resolution.width * resolution.height
val closestProfile = allowedProfiles.minBy { profile ->
val currentResolution = getResolutionForCamcorderProfileQuality(profile)
return@minBy abs(currentResolution - targetResolution)
}
return closestProfile
}
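The patch above is a nearest-neighbor search over pixel counts. Stripped of the Android APIs, the same selection logic can be sketched and sanity-checked like this (the profile list is an illustrative stand-in for `getResolutionForCamcorderProfileQuality`, and the helper name is mine):

```typescript
// Mirrors the closest-profile selection from the Kotlin patch above:
// among the supported profiles, pick the one whose pixel count is
// nearest to the requested resolution. Assumes at least one supported
// profile, just as the Kotlin minBy does.
interface Profile { quality: string; pixels: number }

function findClosestProfile(supported: Profile[], width: number, height: number): Profile {
  const target = width * height;
  return supported.reduce((best, p) =>
    Math.abs(p.pixels - target) < Math.abs(best.pixels - target) ? p : best,
  );
}
```

For the 3648x2736 format from this issue, this would pick a 2160p-class profile rather than failing on an unsupported one.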
I get this bug on 3.7.1 on a OnePlus.
Would've been great if you could've created a PR for this.
Anyways, I created a PR to fix this issue: #2389
Would've been great if you could've created a PR for this.
Anyways, I created a PR to fix this issue: #2389
I have other priorities that are higher than working on contributing back to this project. I did plan on coming back to do a PR once my app issues were resolved. As a common courtesy I added a comment because I found something that seemed to work for me that wasn't thoroughly tested (and I needed to get my production app fixed, which had issues with this and other libraries after iOS/Android updates). I'm also not familiar with the API this library was using; I just know how to run a debugger, set a breakpoint on the error, and try some real-time changes to see if I can resolve the issue.
If you want to flame me for not creating a PR, feel free if that makes you feel better. Just know that responding to people in that way will make them less inclined to suggest a solution going forward.
I have other priorities that are higher than working on contributing back to this project. I did plan on coming back to do a PR once my app issues were resolved. As a common courtesy I added a comment because I found something that seemed to work for me that wasn't thoroughly tested (and I needed to get my production app fixed, which had issues with this and other libraries after iOS/Android updates). I'm also not familiar with the API this library was using; I just know how to run a debugger, set a breakpoint on the error, and try some real-time changes to see if I can resolve the issue.
If you want to flame me for not creating a PR, feel free if that makes you feel better. Just know that responding to people in that way will make them less inclined to suggest a solution going forward.
wtf? lol
It's always the same - someone creates an issue, wants it fixed "asap" because it hurts their business, then someone posts a local patch in the comments, and the issue goes stale. If you use open-source, you are using something someone in the community has put hundreds or thousands of hours of blood and sweat into, and then offered to the public/you for free. To keep that project alive and working in the future, you can either sponsor/support the devs/maintainers, or contribute back in some way, shape, or form - this would've been an ideal example.
I have other priorities that are higher than working on contributing back to this project.
I have other priorities too than to help other people fix issues for their business.
If you want to flame me for not creating a PR, feel free if that makes you feel better. Just know that responding to people in that way will make them less inclined to suggest a solution going forward.
lmao, I simply said "Would've been great ....". If you get hurt over such a suggestion, that's not my fault.
Doesn't matter anyways, but I think the PR would've taken 2 minutes longer than writing that comment. I understand that time is valuable, and I understand that you are not required to create a PR, but still - it would've been great.
Anyways, issue is now resolved since that PR is merged, that's all that matters.
What's happening?
When the video resolution is too high, recording does not work unless I use the H.265 codec.
Is it possible to know when to use the H.265 codec?
For example:
Reproduceable Code
Relevant log output
Camera Device
Device
HUAWEI P30 (ELE-L29) with Android 10
VisionCamera Version
3.6.10
Can you reproduce this issue in the VisionCamera Example app?
I didn't try (⚠️ your issue might get ignored & closed if you don't try this)
Additional information