ghost opened this issue 9 years ago
If you are using the MediaCodec method (buffer-to-buffer), then it should be possible to capture an image (at the same resolution as the video).
In encodeWithMediaCodecMethod1():
Camera.PreviewCallback callback = new Camera.PreviewCallback() {
    ...
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        ...
        try {
            ...
        } finally {
            // Invoke the Android JPEG encoder on the YUV frame in data[]
            // before handing the buffer back for reuse.
            mCamera.addCallbackBuffer(data);
        }
    }
};
For sample code covering the actual JPEG encoding itself, see this thread: http://stackoverflow.com/questions/11039783/my-jni-jpeg-encoder-for-android-is-really-slow
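On Android itself the usual shortcut for this step is `android.graphics.YuvImage#compressToJpeg`, which encodes an NV21 preview frame in one call. As a rough illustration of what that conversion involves, here is a sketch in plain desktop Java: it decodes an NV21 buffer (the default `onPreviewFrame` format) to RGB and encodes it with `javax.imageio.ImageIO`. The class name `Nv21ToJpeg` and the use of the desktop `ImageIO` encoder are my own choices for the example, not part of libstreaming.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class Nv21ToJpeg {

    // Decode an NV21 (YVU420 semi-planar) frame into an RGB image.
    static BufferedImage nv21ToImage(byte[] nv21, int width, int height) {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        int frameSize = width * height;
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int y = nv21[row * width + col] & 0xFF;
                // Chroma is subsampled 2x2; NV21 interleaves V then U.
                int uvIndex = frameSize + (row / 2) * width + (col & ~1);
                int v = (nv21[uvIndex] & 0xFF) - 128;
                int u = (nv21[uvIndex + 1] & 0xFF) - 128;
                int r = clamp((int) (y + 1.402 * v));
                int g = clamp((int) (y - 0.344 * u - 0.714 * v));
                int b = clamp((int) (y + 1.772 * u));
                img.setRGB(col, row, (r << 16) | (g << 8) | b);
            }
        }
        return img;
    }

    static int clamp(int c) { return Math.max(0, Math.min(255, c)); }

    // Encode the frame as a JPEG byte stream.
    static byte[] toJpeg(byte[] nv21, int width, int height) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(nv21ToImage(nv21, width, height), "jpg", out);
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Fake 4x4 mid-gray frame: Y = 128 everywhere, neutral chroma (128).
        int w = 4, h = 4;
        byte[] frame = new byte[w * h * 3 / 2];
        java.util.Arrays.fill(frame, (byte) 128);
        byte[] jpeg = toJpeg(frame, w, h);
        // Every JPEG stream begins with the SOI marker 0xFFD8.
        System.out.println((jpeg[0] & 0xFF) == 0xFF && (jpeg[1] & 0xFF) == 0xD8);
    }
}
```

On a device you would call the equivalent of `toJpeg` from the `finally` block shown above, before `addCallbackBuffer(data)` returns the buffer; `YuvImage.compressToJpeg` avoids the per-pixel Java loop, which is too slow for full-resolution frames.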
I want to capture an image from the camera while streaming is running. How can I do that with libstreaming? Please share sample code for solving this problem.