ISBX / apprtc-ios

A native iOS video chat app based on WebRTC
BSD 3-Clause "New" or "Revised" License

How to get audio & video frames to save a video locally #44

Open theiosdevguy opened 8 years ago

theiosdevguy commented 8 years ago

Is there a way to get the audio and video frames? I also need to save the video being streamed locally.

coolwr commented 8 years ago

In RTCAVFoundationVideoSource.h you'll find a reference to the AVCaptureSession. Using the captureSession property you can call addOutput: with an AVCaptureVideoDataOutput object, which lets you write the video to a file for recording. You can do the same with audio.

There are a number of tutorials online about camera video recording that you should be able to integrate with this WebRTC implementation via the AVFoundation references above. I hope that helps.
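A rough sketch of that approach, assuming you keep a reference to the RTCAVFoundationVideoSource created in ARDAppClient (the method name, queue label, and delegate wiring here are placeholders, not code from this repo):

```objc
#import <AVFoundation/AVFoundation.h>
#import "RTCAVFoundationVideoSource.h"

// Hypothetical helper: attach a recording output to the capture session that
// WebRTC already runs for the local camera.
- (void)attachRecordingOutputToSource:(RTCAVFoundationVideoSource *)videoSource {
  AVCaptureSession *session = videoSource.captureSession;

  AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
  videoOutput.videoSettings = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
  };
  // Deliver frames on a dedicated serial queue so the capture thread isn't blocked.
  dispatch_queue_t queue = dispatch_queue_create("local.recording.queue", DISPATCH_QUEUE_SERIAL);
  [videoOutput setSampleBufferDelegate:self queue:queue];

  [session beginConfiguration];
  if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
  } else {
    NSLog(@"Capture session refused the extra video data output");
  }
  [session commitConfiguration];
}

// AVCaptureVideoDataOutputSampleBufferDelegate: called once per captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
  // Hand the sample buffer to an AVAssetWriterInput (or similar) to write it to disk.
}
```

Note that canAddOutput: may return NO if the session won't accept another output, so it's worth checking before adding (see the follow-up comments below).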

theiosdevguy commented 8 years ago

Thanks @coolwr. I tried fetching the captureSession from RTCAVFoundationVideoSource. But strangely, in the code below from createLocalVideoTrack in ARDAppClient, the track ends up with a different videoSource object.

```objc
RTCAVFoundationVideoSource *videoSource =
    [[RTCAVFoundationVideoSource alloc] initWithFactory:_factory constraints:mediaConstraints];

localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
```

In the code above, you will notice that localVideoTrack.source is a different object than videoSource. This ideally should not be the case; or please let me know if I am missing something here.

theiosdevguy commented 8 years ago

I somehow worked around the problem above using KVC. But now the main issue is that I am unable to add an output to the AVCaptureSession, because [_session canAddOutput:self.videoDataOutput] always returns false. And even if I did replace the video output, how would the video still stream?

wumbo commented 8 years ago

Have you had any luck with this?

saifdj commented 5 years ago

Any updates? Did anyone find a way to store the session locally? @wumbo @theiosdevguy @coolwr

wumbo commented 5 years ago

Yes, have a look at my fork here. In ARTCVideoChatViewController.m you can see that I call [self.localVideoTrack addRenderer:self.videoProcessor];

VideoProcessor is a custom class that implements the RTCVideoRenderer protocol. Its -(void) renderFrame:(RTCI420Frame *)frame method will get called every time there's a new frame.

Here you'll get the frame in RTCI420Frame format which uses the YUV color space. I used OpenCV to convert the frame to a cv::Mat in RGB color space, because I was using it to do some image processing. I also used OpenCV to convert it to a UIImage afterwards.
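For reference, a bare-bones renderer along those lines might look something like this (a sketch only; the exact RTCVideoRenderer methods and RTCI420Frame accessors vary between WebRTC revisions):

```objc
#import <Foundation/Foundation.h>
#import "RTCVideoRenderer.h"
#import "RTCI420Frame.h"

// Custom renderer that receives every local frame once it has been registered
// with [self.localVideoTrack addRenderer:videoProcessor];
@interface VideoProcessor : NSObject <RTCVideoRenderer>
@end

@implementation VideoProcessor

- (void)setSize:(CGSize)size {
  // Called when the frame size changes; nothing to do in this sketch.
}

- (void)renderFrame:(RTCI420Frame *)frame {
  // The frame arrives as planar YUV (I420). Convert or copy the planes here,
  // e.g. hand them to OpenCV, vImage, or an AVAssetWriter pixel buffer.
  const uint8_t *yPlane = frame.yPlane;  // luma plane
  const uint8_t *uPlane = frame.uPlane;  // chroma U plane
  const uint8_t *vPlane = frame.vPlane;  // chroma V plane
  NSUInteger width = frame.width;
  NSUInteger height = frame.height;
  // ... YUV -> RGB conversion / image processing / recording goes here ...
  (void)yPlane; (void)uPlane; (void)vPlane; (void)width; (void)height;
}

@end
```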

Obviously this just gives you all the frames as images, not as a video, but I don't imagine it would be too difficult to convert them to a video.
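If you do want a movie file rather than individual images, one option (not something in this repo, just a sketch) is to convert each frame to a CVPixelBufferRef and append it with an AVAssetWriter:

```objc
#import <AVFoundation/AVFoundation.h>

// Hypothetical globals for brevity; in real code these would live in the recorder class.
static AVAssetWriter *writer;
static AVAssetWriterInput *writerInput;
static AVAssetWriterInputPixelBufferAdaptor *adaptor;

// Configure the writer once, before the first frame arrives.
static void SetUpWriter(NSURL *outputURL, int width, int height) {
  writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                     fileType:AVFileTypeMPEG4
                                        error:nil];
  NSDictionary *settings = @{
    AVVideoCodecKey : AVVideoCodecH264,
    AVVideoWidthKey : @(width),
    AVVideoHeightKey : @(height)
  };
  writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                    outputSettings:settings];
  writerInput.expectsMediaDataInRealTime = YES;
  adaptor = [AVAssetWriterInputPixelBufferAdaptor
      assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                 sourcePixelBufferAttributes:nil];
  [writer addInput:writerInput];
  [writer startWriting];
  [writer startSessionAtSourceTime:kCMTimeZero];
}

// Append one converted frame; presentation times must be monotonically increasing.
static void AppendFrame(CVPixelBufferRef pixelBuffer, CMTime presentationTime) {
  if (writerInput.readyForMoreMediaData) {
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
  }
}
```

When the call ends, finish with [writerInput markAsFinished] and [writer finishWritingWithCompletionHandler:...] to get a playable file.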

saifdj commented 5 years ago

Thanks for your quick response @wumbo.

As you said, we can get frames from VideoProcessor, which are a series of images I guess.

But I need to save only the audio of the conversation (not the video). Please let me know if you have done that before.