Closed Sunfocus closed 7 years ago
Don't record the output of the motion detection filter. Record the output coming directly from the camera. The motion detection filter does internal pixel labeling for frame comparisons, and its output isn't intended to be used outside of the internal calculations it performs.
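To make that concrete, here is a minimal sketch of what "record directly from the camera" looks like in GPUImage. It assumes a `videoCamera` (`GPUImageVideoCamera`) and a `filter` (`GPUImageMotionDetector`) like the ones in the code below; the `motion.m4v` filename is just illustrative. The key point is that the `GPUImageMovieWriter` is added as a target of the camera itself, while the motion detector hangs off the camera as a separate, analysis-only target:

```objectivec
// Sketch: raw camera frames go to the movie writer; the motion detector
// is a sibling target used only for detection, never for recording.
NSURL *movieURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"motion.m4v"]];
[[NSFileManager defaultManager] removeItemAtURL:movieURL error:nil];

GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                             size:CGSizeMake(1280.0, 720.0)];

[videoCamera addTarget:movieWriter];      // camera -> file (clean frames)
[videoCamera addTarget:filter];           // camera -> motion detector (analysis only)
videoCamera.audioEncodingTarget = movieWriter;

[movieWriter startRecording];
```

Because the writer never sits downstream of the motion detector, the recorded file contains normal camera frames instead of the detector's labeled output.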
Thanks for your update. Record the output coming directly from the camera? How can I do that?
I need to capture 2-3 seconds of video after motion is detected.
```objectivec
// Output file
movieFileOutput = [AVCaptureMovieFileOutput new];

Float64 totalSeconds = 3;           // Total seconds
int32_t preferredTimeScale = 30;    // Frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(totalSeconds, preferredTimeScale); // Set max duration
movieFileOutput.maxRecordedDuration = maxDuration;
movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;

// Add the output to the capture session
if ([videoCamera.captureSession canAddOutput:movieFileOutput]) {
    [videoCamera.captureSession addOutput:movieFileOutput];
}
```
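Note that `GPUImageMovieWriter` has no equivalent of `maxRecordedDuration`, so if you stay inside GPUImage instead of adding an `AVCaptureMovieFileOutput` to its session, a fixed-length clip has to be cut by scheduling the stop yourself. A sketch, assuming a hypothetical `movieWriter` (`GPUImageMovieWriter`) that has already been added as a target of `videoCamera`:

```objectivec
// Sketch: start recording, then stop after roughly 3 seconds using GCD.
[movieWriter startRecording];

dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(3.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    videoCamera.audioEncodingTarget = nil;   // detach audio before finishing
    [videoCamera removeTarget:movieWriter];  // stop feeding frames
    [movieWriter finishRecording];           // finalize the file
});
```

The 3-second delay is wall-clock time, not an exact frame count, which is usually close enough for a "clip after motion" use case.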
When I try to add my output source, the camera screen goes white.
I am recording video after detecting motion in front of the camera, but when I play the recording back it looks like a histogram video in high-contrast black and blue.
```objectivec
- (void)setupFilter {
    AVCaptureSession *avSession = [[AVCaptureSession alloc] init];
    if ([avSession canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        NSLog(@"FULLHD");
        // Note: the original code passed AVCaptureSessionPreset1280x720 here,
        // which looked like a copy-paste slip; use the 1080p preset it tested for.
        videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1920x1080
                                                          cameraPosition:AVCaptureDevicePositionFront];
    } else if ([avSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        NSLog(@"HDREADY");
        videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720
                                                          cameraPosition:AVCaptureDevicePositionFront];
    } else if ([avSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        NSLog(@"Normal");
        videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                          cameraPosition:AVCaptureDevicePositionFront];
    }

    videoCamera.outputImageOrientation = UIDeviceOrientationLandscapeLeft;
    filter = [[GPUImageMotionDetector alloc] init];
    [videoCamera addTarget:filter];
    videoCamera.runBenchmark = YES;

    GPUImageView *filterView = (GPUImageView *)self.view;
    if (filterType == GPUIMAGE_MOTIONDETECTOR) {
        // (empty in the original code)
    }

    [videoCamera startCameraCapture];
}
```
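One piece that seems to be missing from `setupFilter` is the trigger itself: `GPUImageMotionDetector` exposes a `motionDetectionBlock` callback that reports a motion centroid and intensity per frame, and that is the natural place to kick off a recording. A sketch, where the `0.01` threshold and the `isRecording` flag are illustrative choices, not GPUImage API:

```objectivec
// Sketch: react to detected motion via the detector's callback.
__block BOOL isRecording = NO;

[(GPUImageMotionDetector *)filter setMotionDetectionBlock:
    ^(CGPoint motionCentroid, CGFloat motionIntensity, CMTime frameTime) {
        // motionIntensity rises with the amount of inter-frame change;
        // 0.01 is an arbitrary starting threshold to tune per scene.
        if (motionIntensity > 0.01 && !isRecording) {
            isRecording = YES;
            dispatch_async(dispatch_get_main_queue(), ^{
                // Start a GPUImageMovieWriter attached to videoCamera here,
                // and schedule its finishRecording a few seconds later.
            });
        }
    }];
```

The block is invoked on GPUImage's processing queue, hence the hop to the main queue before touching the capture pipeline.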
```objectivec
- (void)stopRecoding:(GPUImageMovieWriter *)moveWriter {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        NSLog(@"Recording Stopped");
        [videoCamera setAudioEncodingTarget:nil];
        [filter removeTarget:moveWriter];
        [moveWriter finishRecording];
    });  // the original was missing this closing of the dispatch block
}
```
Can you help me figure out what is wrong with this code?