pedroSG94 / RootEncoder

RootEncoder for Android (rtmp-rtsp-stream-client-java) is a stream encoder to push video/audio to media servers using the RTMP, RTSP, SRT and UDP protocols, with all code written in Java/Kotlin
Apache License 2.0

Streaming and recording with different quality Camera2 API #284

Closed HraD closed 3 years ago

HraD commented 5 years ago

Hi! I need to stream and record with different video quality using the Camera2 API. I read this issue (https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/issues/228), but it is for Camera1. As I understand it, I need to duplicate the VideoEncoder in Camera2Base.java, add this new VideoEncoder to listSurfaces in Camera2ApiManager.java's startPreview, and set useOpengl to false in the RtmpCamera2 init: rtmpCamera2 = new RtmpCamera2(App.getContext(), false, connectCheckerRtmp); Am I right? But after setting useOpengl to false (without any other modifications) I get these errors:

2019-01-14 11:47:53.612 13103-13493/ru.altatec.server I/Camera2ApiManager: Camera opened
2019-01-14 11:47:53.613 13103-13493/ru.altatec.server E/Camera2ApiManager: Configuration failed

The device is a Sony Xperia XZ1 Compact on Android 9, and the camera is working in the background.
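One thing worth ruling out when onConfigureFailed fires is the Camera2 hardware level: LEGACY devices only guarantee a small set of concurrent stream combinations, so an extra encoder surface can fail to configure. A minimal check, as an illustrative helper that is not part of the library:

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

public class CameraLevelCheck {
  // Returns true if the camera is above LEGACY and so guarantees more
  // concurrent output combinations (preview + one or two encoder surfaces).
  public static boolean isAtLeastLimited(Context context, String cameraId)
      throws CameraAccessException {
    CameraManager manager =
        (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
    Integer level = manager.getCameraCharacteristics(cameraId)
        .get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    return level != null
        && level != CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY;
  }
}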

pedroSG94 commented 5 years ago

It is working for me with 2 VideoEncoders (Samsung S7). This is unstable; I only added the start implementation to test, so it may crash on stop. Source code:

package com.pedro.rtplibrary.base;

import android.content.Context;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;
import android.support.annotation.RequiresApi;
import android.util.Log;
import android.util.Size;
import android.view.MotionEvent;
import android.view.Surface;
import android.view.SurfaceView;
import android.view.TextureView;
import com.pedro.encoder.audio.AudioEncoder;
import com.pedro.encoder.audio.GetAacData;
import com.pedro.encoder.input.audio.GetMicrophoneData;
import com.pedro.encoder.input.audio.MicrophoneManager;
import com.pedro.encoder.input.video.Camera2ApiManager;
import com.pedro.encoder.input.video.CameraHelper;
import com.pedro.encoder.input.video.CameraOpenException;
import com.pedro.encoder.utils.CodecUtil;
import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetVideoData;
import com.pedro.encoder.video.VideoEncoder;
import com.pedro.rtplibrary.view.GlInterface;
import com.pedro.rtplibrary.view.LightOpenGlView;
import com.pedro.rtplibrary.view.OffScreenGlThread;
import com.pedro.rtplibrary.view.OpenGlView;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;
import java.util.List;

/**
 * Wrapper to stream with the Camera2 API and microphone. Supports streaming with SurfaceView,
 * TextureView, OpenGlView (custom SurfaceView that uses OpenGL) and Context (background mode).
 * All views use Surface-to-buffer encoding mode for H264.
 *
 * API requirements:
 * API 21+.
 *
 * Created by pedro on 7/07/17.
 */
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public abstract class Camera2Base implements GetAacData, GetVideoData, GetMicrophoneData {

  protected Context context;
  private Camera2ApiManager cameraManager;
  protected VideoEncoder videoEncoder;
  protected VideoEncoder videoEncoder2;
  private MicrophoneManager microphoneManager;
  private AudioEncoder audioEncoder;
  private boolean streaming = false;
  private SurfaceView surfaceView;
  private TextureView textureView;
  private GlInterface glInterface;
  private boolean videoEnabled = false;
  //record
  private MediaMuxer mediaMuxer;
  private int videoTrack = -1;
  private int audioTrack = -1;
  private boolean recording = false;
  private boolean canRecord = false;
  private boolean onPreview = false;
  private MediaFormat videoFormat;
  private MediaFormat audioFormat;
  private boolean isBackground = false;

  public Camera2Base(SurfaceView surfaceView) {
    this.surfaceView = surfaceView;
    this.context = surfaceView.getContext();
    init(context);
  }

  public Camera2Base(TextureView textureView) {
    this.textureView = textureView;
    this.context = textureView.getContext();
    init(context);
  }

  public Camera2Base(OpenGlView openGlView) {
    context = openGlView.getContext();
    glInterface = openGlView;
    glInterface.init();
    init(context);
  }

  public Camera2Base(LightOpenGlView lightOpenGlView) {
    this.context = lightOpenGlView.getContext();
    glInterface = lightOpenGlView;
    glInterface.init();
    init(context);
  }

  public Camera2Base(Context context, boolean useOpengl) {
    this.context = context;
    if (useOpengl) {
      glInterface = new OffScreenGlThread(context);
      glInterface.init();
    }
    isBackground = true;
    init(context);
  }

  private void init(Context context) {
    cameraManager = new Camera2ApiManager(context);
    videoEncoder = new VideoEncoder(this);
    videoEncoder2 = new VideoEncoder(new GetVideoData() {
      @Override
      public void onSpsPps(ByteBuffer sps, ByteBuffer pps) {

      }

      @Override
      public void onSpsPpsVps(ByteBuffer sps, ByteBuffer pps, ByteBuffer vps) {

      }

      @Override
      public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
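        // debug output: confirms the second encoder is producing frames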
        Log.e("Pedro", "asdasd");
      }

      @Override
      public void onVideoFormat(MediaFormat mediaFormat) {

      }
    });
    microphoneManager = new MicrophoneManager(this);
    audioEncoder = new AudioEncoder(this);
  }

  /**
   * Experimental
   */
  public void enableFaceDetection(Camera2ApiManager.FaceDetectorCallback faceDetectorCallback) {
    cameraManager.enableFaceDetection(faceDetectorCallback);
  }

  /**
   * Experimental
   */
  public void disableFaceDetection() {
    cameraManager.disableFaceDetection();
  }

  /**
   * Experimental
   */
  public boolean isFaceDetectionEnabled() {
    return cameraManager.isFaceDetectionEnabled();
  }

  public boolean isFrontCamera() {
    return cameraManager.isFrontCamera();
  }

  /**
   * Basic auth developed to work with Wowza. Not tested with other servers.
   *
   * @param user auth.
   * @param password auth.
   */
  public abstract void setAuthorization(String user, String password);

  /**
   * Call this method before @startStream. Otherwise the stream will have no video.
   *
   * @param width resolution in px.
   * @param height resolution in px.
   * @param fps frames per second of the stream.
   * @param bitrate H264 in kb.
   * @param hardwareRotation true to rotate using the encoder, false to rotate with OpenGL if you
   * are using OpenGlView.
   * @param rotation could be 90, 180, 270 or 0 (normally 0 if you are streaming in landscape or 90
   * if you are streaming in portrait). This only affects the stream result. NOTE: rotation with
   * the encoder is silently ignored on some devices.
   * @return true if success, false if you get an error (normally because the selected encoder
   * doesn't support a configuration that was set or your device doesn't have an H264 encoder).
   */
  public boolean prepareVideo(int width, int height, int fps, int bitrate, boolean hardwareRotation,
      int iFrameInterval, int rotation) {
    if (onPreview) {
      stopPreview();
      onPreview = true;
    }
    boolean result =
        videoEncoder.prepareVideoEncoder(width, height, fps, bitrate, rotation, hardwareRotation,
            iFrameInterval, FormatVideoEncoder.SURFACE);
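    // NOTE: the second encoder is hardcoded to 1280x720 for this test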
    boolean result2 = videoEncoder2.prepareVideoEncoder(1280, 720, fps, bitrate, rotation, hardwareRotation,
        iFrameInterval, FormatVideoEncoder.SURFACE);
    prepareCameraManager();
    return result && result2;
  }

  /**
   * Kept for backward compatibility.
   */
  public boolean prepareVideo(int width, int height, int fps, int bitrate, boolean hardwareRotation,
      int rotation) {
    return prepareVideo(width, height, fps, bitrate, hardwareRotation, 2, rotation);
  }

  protected abstract void prepareAudioRtp(boolean isStereo, int sampleRate);

  /**
   * Call this method before @startStream. Otherwise the stream will have no audio.
   *
   * @param bitrate AAC in kb.
   * @param sampleRate of audio in Hz. Can be 8000, 16000, 22050, 32000, 44100.
   * @param isStereo true if you want stereo audio (2 audio channels), false if you want mono audio
   * (1 audio channel).
   * @param echoCanceler true to enable echo canceler, false to disable.
   * @param noiseSuppressor true to enable noise suppressor, false to disable.
   * @return true if success, false if you get an error (normally because the selected encoder
   * doesn't support a configuration that was set or your device doesn't have an AAC encoder).
   */
  public boolean prepareAudio(int bitrate, int sampleRate, boolean isStereo, boolean echoCanceler,
      boolean noiseSuppressor) {
    microphoneManager.createMicrophone(sampleRate, isStereo, echoCanceler, noiseSuppressor);
    prepareAudioRtp(isStereo, sampleRate);
    return audioEncoder.prepareAudioEncoder(bitrate, sampleRate, isStereo);
  }

  /**
   * Same as calling: isHardwareRotation = true; if (openGlView) isHardwareRotation = false;
   * prepareVideo(640, 480, 30, 1200 * 1024, isHardwareRotation, 90);
   *
   * @return true if success, false if you get an error (normally because the selected encoder
   * doesn't support a configuration that was set or your device doesn't have an H264 encoder).
   */
  public boolean prepareVideo() {
    boolean isHardwareRotation = glInterface == null;
    int rotation = CameraHelper.getCameraOrientation(context);
    return prepareVideo(640, 480, 30, 1200 * 1024, isHardwareRotation, rotation);
  }

  /**
   * Same as calling: prepareAudio(64 * 1024, 32000, true, false, false);
   *
   * @return true if success, false if you get an error (normally because the selected encoder
   * doesn't support a configuration that was set or your device doesn't have an AAC encoder).
   */
  public boolean prepareAudio() {
    return prepareAudio(64 * 1024, 32000, true, false, false);
  }

  /**
   * @param forceVideo force type codec used. FIRST_COMPATIBLE_FOUND, SOFTWARE, HARDWARE
   * @param forceAudio force type codec used. FIRST_COMPATIBLE_FOUND, SOFTWARE, HARDWARE
   */
  public void setForce(CodecUtil.Force forceVideo, CodecUtil.Force forceAudio) {
    videoEncoder.setForce(forceVideo);
    audioEncoder.setForce(forceAudio);
  }

  /**
   * Start recording an MP4 video. Needs to be called while streaming.
   *
   * @param path where the file will be saved.
   * @throws IOException if you init it before starting the stream.
   */
  public void startRecord(String path) throws IOException {
    mediaMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    recording = true;
    if (!streaming) {
      startEncoders();
    } else if (videoEncoder.isRunning()) {
      resetVideoEncoder();
    }
  }

  /**
   * Stop recording the MP4 video started with @startRecord. If you don't call it, the file will
   * be unreadable.
   */
  public void stopRecord() {
    recording = false;
    if (mediaMuxer != null) {
      if (canRecord) {
        mediaMuxer.stop();
        mediaMuxer.release();
        canRecord = false;
      }
      mediaMuxer = null;
    }
    videoTrack = -1;
    audioTrack = -1;
    if (!streaming) stopStream();
  }

  /**
   * Start camera preview. Ignored if stream or preview is already started.
   *
   * @param cameraFacing front or back camera. Like: {@link com.pedro.encoder.input.video.CameraHelper.Facing#BACK}
   * {@link com.pedro.encoder.input.video.CameraHelper.Facing#FRONT}
   * @param rotation camera rotation (0, 90, 180, 270). Recommended: {@link
   * com.pedro.encoder.input.video.CameraHelper#getCameraOrientation(Context)}
   */
  public void startPreview(CameraHelper.Facing cameraFacing, int rotation) {
    if (!isStreaming() && !onPreview && !isBackground) {
      if (surfaceView != null) {
        cameraManager.prepareCamera(surfaceView.getHolder().getSurface());
      } else if (textureView != null) {
        cameraManager.prepareCamera(new Surface(textureView.getSurfaceTexture()));
      } else if (glInterface != null) {
        boolean isCamera2Landscape = context.getResources().getConfiguration().orientation != 1;
        if (isCamera2Landscape) {
          glInterface.setEncoderSize(videoEncoder.getWidth(), videoEncoder.getHeight());
        } else {
          glInterface.setEncoderSize(videoEncoder.getHeight(), videoEncoder.getWidth());
        }
        glInterface.setRotation(rotation == 0 ? 270 : rotation - 90);
        glInterface.start();
        cameraManager.prepareCamera(glInterface.getSurfaceTexture(), videoEncoder.getWidth(),
            videoEncoder.getHeight());
      }
      cameraManager.openCameraFacing(cameraFacing);
      onPreview = true;
    }
  }

  public void startPreview(CameraHelper.Facing cameraFacing) {
    startPreview(cameraFacing, CameraHelper.getCameraOrientation(context));
  }

  public void startPreview() {
    startPreview(CameraHelper.Facing.BACK);
  }

  /**
   * Stop camera preview. Ignored if streaming or already stopped. You need to call it after
   * @stopStream to release the camera properly if you are going to close the activity.
   */
  public void stopPreview() {
    if (!isStreaming() && onPreview && !isBackground) {
      if (glInterface != null) {
        glInterface.stop();
      }
      cameraManager.closeCamera(false);
      onPreview = false;
    }
  }

  protected abstract void startStreamRtp(String url);

  /**
   * Needs to be called after @prepareVideo and/or @prepareAudio. This method overrides the
   * resolution of @startPreview to the resolution set in @prepareVideo. If you never called
   * startPreview, this method starts the preview for you at the resolution set in @prepareVideo.
   *
   * @param url of the stream like: protocol://ip:port/application/streamName
   *
   * RTSP: rtsp://192.168.1.1:1935/live/pedroSG94
   * RTSPS: rtsps://192.168.1.1:1935/live/pedroSG94
   * RTMP: rtmp://192.168.1.1:1935/live/pedroSG94
   * RTMPS: rtmps://192.168.1.1:1935/live/pedroSG94
   */
  public void startStream(String url) {
    if (!streaming) {
      startEncoders();
    } else if (videoEncoder.isRunning()) {
      resetVideoEncoder();
    }
    streaming = true;
    //if (!recording) {
    //  startEncoders();
    //} else {
    //  resetVideoEncoder();
    //}
    startStreamRtp(url);
    onPreview = true;
  }

  private void startEncoders() {
    videoEncoder.start();
    videoEncoder2.start();
    audioEncoder.start();
    prepareGlView();
    microphoneManager.start();
    if (onPreview) {
      cameraManager.openLastCamera();
    } else {
      cameraManager.openCameraBack();
    }
    onPreview = true;
  }

  private void resetVideoEncoder() {
    if (glInterface != null) {
      glInterface.removeMediaCodecSurface();
    }
    videoEncoder.reset();
    videoEncoder2.reset();
    if (glInterface != null) {
      glInterface.addMediaCodecSurface(videoEncoder.getInputSurface());
    } else {
      cameraManager.closeCamera(false);
      cameraManager.prepareCamera(videoEncoder.getInputSurface(), videoEncoder2.getInputSurface());
      cameraManager.openLastCamera();
    }
  }

  private void prepareGlView() {
    if (glInterface != null && videoEnabled) {
      if (glInterface instanceof OffScreenGlThread) {
        glInterface = new OffScreenGlThread(context);
        ((OffScreenGlThread) glInterface).setFps(videoEncoder.getFps());
      }
      glInterface.init();
      if (videoEncoder.getRotation() == 90 || videoEncoder.getRotation() == 270) {
        glInterface.setEncoderSize(videoEncoder.getHeight(), videoEncoder.getWidth());
      } else {
        glInterface.setEncoderSize(videoEncoder.getWidth(), videoEncoder.getHeight());
      }
      int rotation = videoEncoder.getRotation();
      glInterface.setRotation(rotation == 0 ? 270 : rotation - 90);
      glInterface.start();
      if (videoEncoder.getInputSurface() != null) {
        glInterface.addMediaCodecSurface(videoEncoder.getInputSurface());
      }
      cameraManager.prepareCamera(glInterface.getSurfaceTexture(), videoEncoder.getWidth(),
          videoEncoder.getHeight());
    }
  }

  protected abstract void stopStreamRtp();

  /**
   * Stop stream started with @startStream.
   */
  public void stopStream() {
    if (streaming) {
      streaming = false;
      stopStreamRtp();
    }
    if (!recording) {
      cameraManager.closeCamera(!isBackground);
      onPreview = !isBackground;
      microphoneManager.stop();
      if (glInterface != null) {
        glInterface.removeMediaCodecSurface();
        if (glInterface instanceof OffScreenGlThread) {
          glInterface.removeMediaCodecSurface();
          glInterface.stop();
        }
      }
      videoEncoder.stop();
      audioEncoder.stop();
      videoFormat = null;
      audioFormat = null;
    }
  }

  /**
   * Get supported preview resolutions of back camera in px.
   *
   * @return list of preview resolutions supported by back camera
   */
  public List<Size> getResolutionsBack() {
    return Arrays.asList(cameraManager.getCameraResolutionsBack());
  }

  /**
   * Get supported preview resolutions of front camera in px.
   *
   * @return list of preview resolutions supported by front camera
   */
  public List<Size> getResolutionsFront() {
    return Arrays.asList(cameraManager.getCameraResolutionsFront());
  }

  /**
   * Mute microphone. Can be called before, during and after streaming.
   */
  public void disableAudio() {
    microphoneManager.mute();
  }

  /**
   * Enable a muted microphone. Can be called before, during and after streaming.
   */
  public void enableAudio() {
    microphoneManager.unMute();
  }

  /**
   * Get mute state of microphone.
   *
   * @return true if muted, false if enabled
   */
  public boolean isAudioMuted() {
    return microphoneManager.isMuted();
  }

  /**
   * Get video camera state.
   *
   * @return true if enabled, false if disabled.
   */
  public boolean isVideoEnabled() {
    return videoEnabled;
  }

  /**
   * Disable sending camera frames and send a black image with low bitrate (to reduce bandwidth
   * used) instead.
   */
  public void disableVideo() {
    videoEncoder.startSendBlackImage();
    videoEnabled = false;
  }

  /**
   * Enable sending camera frames.
   */
  public void enableVideo() {
    videoEncoder.stopSendBlackImage();
    videoEnabled = true;
  }

  /**
   * Set zoom in or zoom out on the camera.
   *
   * @param event motion event. Expected to have event.getPointerCount() > 1.
   */
  public void setZoom(MotionEvent event) {
    cameraManager.setZoom(event);
  }

  public int getBitrate() {
    return videoEncoder.getBitRate();
  }

  public int getResolutionValue() {
    return videoEncoder.getWidth() * videoEncoder.getHeight();
  }

  public int getStreamWidth() {
    return videoEncoder.getWidth();
  }

  public int getStreamHeight() {
    return videoEncoder.getHeight();
  }

  /**
   * Switch the camera used. Can be called on preview or while streaming; ignored with preview off.
   *
   * @throws CameraOpenException if the other camera doesn't support the same resolution.
   */
  public void switchCamera() throws CameraOpenException {
    if (isStreaming() || onPreview) {
      cameraManager.switchCamera();
    }
  }

  public GlInterface getGlInterface() {
    if (glInterface != null) {
      return glInterface;
    } else {
      throw new RuntimeException("You can't do it. You are not using Opengl");
    }
  }

  private void prepareCameraManager() {
    if (textureView != null) {
      cameraManager.prepareCamera(textureView, videoEncoder.getInputSurface());
    } else if (surfaceView != null) {
      cameraManager.prepareCamera(surfaceView, videoEncoder.getInputSurface());
    } else if (glInterface != null) {
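      // OpenGL path: the camera surface is prepared later in prepareGlView()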
    } else {
      cameraManager.prepareCamera(videoEncoder.getInputSurface(), videoEncoder2.getInputSurface());
    }
    videoEnabled = true;
  }

  /**
   * Set the H264 video bitrate in kb while streaming.
   *
   * @param bitrate H264 in kb.
   */
  public void setVideoBitrateOnFly(int bitrate) {
    videoEncoder.setVideoBitrateOnFly(bitrate);
  }

  /**
   * Set an FPS limit while streaming. This will be overridden when you call the prepareVideo
   * method. This could produce a change in iFrameInterval.
   *
   * @param fps frames per second
   */
  public void setLimitFPSOnFly(int fps) {
    videoEncoder.setFps(fps);
  }

  /**
   * Get stream state.
   *
   * @return true if streaming, false if not streaming.
   */
  public boolean isStreaming() {
    return streaming;
  }

  /**
   * Get record state.
   *
   * @return true if recording, false if not recording.
   */
  public boolean isRecording() {
    return recording;
  }

  /**
   * Get preview state.
   *
   * @return true if enabled, false if disabled.
   */
  public boolean isOnPreview() {
    return onPreview;
  }

  protected abstract void getAacDataRtp(ByteBuffer aacBuffer, MediaCodec.BufferInfo info);

  @Override
  public void getAacData(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
    if (canRecord && recording) mediaMuxer.writeSampleData(audioTrack, aacBuffer, info);
    if (streaming) getAacDataRtp(aacBuffer, info);
  }

  protected abstract void onSpsPpsVpsRtp(ByteBuffer sps, ByteBuffer pps, ByteBuffer vps);

  @Override
  public void onSpsPps(ByteBuffer sps, ByteBuffer pps) {
    if (streaming) onSpsPpsVpsRtp(sps, pps, null);
  }

  @Override
  public void onSpsPpsVps(ByteBuffer sps, ByteBuffer pps, ByteBuffer vps) {
    if (streaming) onSpsPpsVpsRtp(sps, pps, vps);
  }

  protected abstract void getH264DataRtp(ByteBuffer h264Buffer, MediaCodec.BufferInfo info);

  @Override
  public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
    if (recording) {
      if (info.flags == MediaCodec.BUFFER_FLAG_KEY_FRAME
          && !canRecord
          && videoFormat != null
          && audioFormat != null) {
        videoTrack = mediaMuxer.addTrack(videoFormat);
        audioTrack = mediaMuxer.addTrack(audioFormat);
        mediaMuxer.start();
        canRecord = true;
      }
      if (canRecord) mediaMuxer.writeSampleData(videoTrack, h264Buffer, info);
    }
    if (streaming) getH264DataRtp(h264Buffer, info);
  }

  @Override
  public void inputPCMData(byte[] buffer, int size) {
    audioEncoder.inputPCMData(buffer, size);
  }

  @Override
  public void onVideoFormat(MediaFormat mediaFormat) {
    videoFormat = mediaFormat;
  }

  @Override
  public void onAudioFormat(MediaFormat mediaFormat) {
    audioFormat = mediaFormat;
  }
}
package com.pedro.encoder.input.video;

import android.annotation.SuppressLint;
import android.content.Context;
import android.graphics.Rect;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.Face;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.os.Build;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.support.annotation.RequiresApi;
import android.util.Log;
import android.util.Size;
import android.view.MotionEvent;
import android.view.Surface;
import android.view.SurfaceView;
import android.view.TextureView;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

import static android.hardware.camera2.CameraMetadata.LENS_FACING_FRONT;

/**
 * Created by pedro on 4/03/17.
 *
 * <p>
 * Class for surface-to-buffer encoding.
 * Advantage: you can use all resolutions.
 * Disadvantage: you can't control the fps of the stream, because you can't know when the
 * inputSurface was rendered.
 * <p>
 * Note: you can use OpenGL for surface-to-buffer encoding on devices with 16 <= API < 21:
 * https://github.com/google/grafika
 */

@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public class Camera2ApiManager extends CameraDevice.StateCallback {

  private final String TAG = "Camera2ApiManager";

  private CameraDevice cameraDevice;
  private SurfaceView surfaceView;
  private TextureView textureView;
  private Surface surfaceEncoder; //input surfaceEncoder from videoEncoder
  private CameraManager cameraManager;
  private Handler cameraHandler;
  private CameraCaptureSession cameraCaptureSession;
  private boolean prepared = false;
  private int cameraId = -1;
  private Surface preview;
  private boolean isOpenGl = false;
  private boolean isFrontCamera = false;
  private CameraCharacteristics cameraCharacteristics;
  private CaptureRequest.Builder builderPreview;
  private CaptureRequest.Builder builderInputSurface;
  private float fingerSpacing = 0;
  private int zoomLevel = 1;
  private Surface surfaceEncoder2;

  //Face detector
  public interface FaceDetectorCallback {
    void onGetFaces(Face[] faces);
  }

  private FaceDetectorCallback faceDetectorCallback;
  private boolean faceDetectionEnabled = false;
  private int faceDetectionMode;

  public Camera2ApiManager(Context context) {
    cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
  }

  public void prepareCamera(SurfaceView surfaceView, Surface surface) {
    this.surfaceView = surfaceView;
    this.surfaceEncoder = surface;
    prepared = true;
    isOpenGl = false;
  }

  public void prepareCamera(TextureView textureView, Surface surface) {
    this.textureView = textureView;
    this.surfaceEncoder = surface;
    prepared = true;
    isOpenGl = false;
  }

  public void prepareCamera(Surface surface) {
    this.surfaceEncoder = surface;
    prepared = true;
    isOpenGl = false;
  }

  public void prepareCamera(Surface surface, Surface surface2) {
    this.surfaceEncoder = surface;
    this.surfaceEncoder2 = surface2;
    prepared = true;
    isOpenGl = false;
  }

  public void prepareCamera(SurfaceTexture surfaceTexture, int width, int height) {
    surfaceTexture.setDefaultBufferSize(width, height);
    this.surfaceEncoder = new Surface(surfaceTexture);
    prepared = true;
    isOpenGl = true;
  }

  public boolean isPrepared() {
    return prepared;
  }

  private void startPreview(CameraDevice cameraDevice) {
    try {
      List<Surface> listSurfaces = new ArrayList<>();
      preview = addPreviewSurface();
      if (preview != null) {
        listSurfaces.add(preview);
      }
      if (surfaceEncoder != null) {
        listSurfaces.add(surfaceEncoder);
      }
      if (surfaceEncoder2 != null) {
        listSurfaces.add(surfaceEncoder2);
      }
      cameraDevice.createCaptureSession(listSurfaces, new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
          Camera2ApiManager.this.cameraCaptureSession = cameraCaptureSession;
          try {
            if (surfaceView != null || textureView != null) {
              cameraCaptureSession.setRepeatingBurst(
                  Arrays.asList(drawSurface(preview), drawSurface(surfaceEncoder)),
                  faceDetectionEnabled ? cb : null, cameraHandler);
            } else {
              cameraCaptureSession.setRepeatingBurst(
                  Arrays.asList(drawSurface(surfaceEncoder), drawSurface(surfaceEncoder2)),
                  faceDetectionEnabled ? cb : null, cameraHandler);
            }
            Log.i(TAG, "Camera configured");
          } catch (CameraAccessException | NullPointerException e) {
            Log.e(TAG, "Error", e);
          }
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
          cameraCaptureSession.close();
          Log.e(TAG, "Configuration failed");
        }
      }, null);
    } catch (CameraAccessException e) {
      Log.e(TAG, "Error", e);
    }
  }

  private Surface addPreviewSurface() {
    Surface surface = null;
    if (surfaceView != null) {
      surface = surfaceView.getHolder().getSurface();
    } else if (textureView != null) {
      final SurfaceTexture texture = textureView.getSurfaceTexture();
      surface = new Surface(texture);
    }
    return surface;
  }

  private CaptureRequest drawSurface(Surface surface) {
    try {
      builderInputSurface = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
      builderInputSurface.addTarget(surface);
      return builderInputSurface.build();
    } catch (CameraAccessException | IllegalStateException e) {
      Log.e(TAG, "Error", e);
      return null;
    }
  }

  public int getLevelSupported() {
    try {
      cameraCharacteristics = cameraManager.getCameraCharacteristics("0");
      return cameraCharacteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
    } catch (CameraAccessException | IllegalStateException e) {
      Log.e(TAG, "Error", e);
      return -1;
    }
  }

  public void openCamera() {
    openCameraBack();
  }

  public void openCameraBack() {
    openCameraFacing(CameraHelper.Facing.BACK);
  }

  public void openCameraFront() {
    openCameraFacing(CameraHelper.Facing.FRONT);
  }

  public void openLastCamera() {
    if (cameraId == -1) {
      openCameraBack();
    } else {
      openCameraId(cameraId);
    }
  }

  public Size[] getCameraResolutionsBack() {
    try {
      cameraCharacteristics = cameraManager.getCameraCharacteristics("0");
      if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING)
          != CameraCharacteristics.LENS_FACING_BACK) {
        cameraCharacteristics = cameraManager.getCameraCharacteristics("1");
      }
      StreamConfigurationMap streamConfigurationMap =
          cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
      return streamConfigurationMap.getOutputSizes(SurfaceTexture.class);
    } catch (CameraAccessException e) {
      Log.e(TAG, "Error", e);
      return new Size[0];
    }
  }

  public Size[] getCameraResolutionsFront() {
    try {
      cameraCharacteristics = cameraManager.getCameraCharacteristics("0");
      if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING)
          != CameraCharacteristics.LENS_FACING_FRONT) {
        cameraCharacteristics = cameraManager.getCameraCharacteristics("1");
      }
      StreamConfigurationMap streamConfigurationMap =
          cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
      return streamConfigurationMap.getOutputSizes(SurfaceTexture.class);
    } catch (CameraAccessException e) {
      Log.e(TAG, "Error", e);
      return new Size[0];
    }
  }

  /**
   * Select camera facing.
   *
   * @param cameraFacing {@link CameraHelper.Facing#BACK} or {@link CameraHelper.Facing#FRONT}
   */
  public void openCameraFacing(CameraHelper.Facing cameraFacing) {
    int facing = cameraFacing == CameraHelper.Facing.BACK ? CameraMetadata.LENS_FACING_BACK
        : CameraMetadata.LENS_FACING_FRONT;
    try {
      cameraCharacteristics = cameraManager.getCameraCharacteristics("0");
      if (cameraCharacteristics.get(CameraCharacteristics.LENS_FACING) == facing) {
        openCameraId(0);
      } else {
        openCameraId(cameraManager.getCameraIdList().length - 1);
      }
    } catch (CameraAccessException e) {
      Log.e(TAG, "Error", e);
    }
  }

  public void enableFaceDetection(FaceDetectorCallback faceDetectorCallback) {
    int[] fd = cameraCharacteristics.get(
        CameraCharacteristics.STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES);
    int maxFD = cameraCharacteristics.get(CameraCharacteristics.STATISTICS_INFO_MAX_FACE_COUNT);
    if (fd.length > 0) {
      List<Integer> fdList = new ArrayList<>();
      for (int FaceD : fd) {
        fdList.add(FaceD);
      }
      if (maxFD > 0) {
        this.faceDetectorCallback = faceDetectorCallback;
        faceDetectionEnabled = true;
        faceDetectionMode = Collections.max(fdList);
        if (builderPreview != null) setFaceDetect(builderPreview, faceDetectionMode);
        setFaceDetect(builderInputSurface, faceDetectionMode);
        prepareFaceDetectionCallback();
      } else {
        Log.e(TAG, "No face detection");
      }
    } else {
      Log.e(TAG, "No face detection");
    }
  }

  public void disableFaceDetection() {
    if (faceDetectionEnabled) {
      faceDetectorCallback = null;
      faceDetectionEnabled = false;
      faceDetectionMode = 0;
      prepareFaceDetectionCallback();
    }
  }

  public boolean isFaceDetectionEnabled() {
    return faceDetectorCallback != null;
  }

  private void setFaceDetect(CaptureRequest.Builder requestBuilder, int faceDetectMode) {
    if (faceDetectionEnabled) {
      requestBuilder.set(CaptureRequest.STATISTICS_FACE_DETECT_MODE, faceDetectMode);
    }
  }

  private void prepareFaceDetectionCallback() {
    try {
      cameraCaptureSession.stopRepeating();
      if (builderPreview != null) {
        cameraCaptureSession.setRepeatingRequest(builderPreview.build(),
            faceDetectionEnabled ? cb : null, null);
      }
      cameraCaptureSession.setRepeatingRequest(builderInputSurface.build(),
          faceDetectionEnabled ? cb : null, null);
    } catch (CameraAccessException e) {
      Log.e(TAG, "Error", e);
    }
  }

  private final CameraCaptureSession.CaptureCallback cb =
      new CameraCaptureSession.CaptureCallback() {

        @Override
        public void onCaptureCompleted(@NonNull CameraCaptureSession session,
            @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
          Face[] faces = result.get(CaptureResult.STATISTICS_FACES);
          if (faceDetectorCallback != null) {
            faceDetectorCallback.onGetFaces(faces);
          }
        }
      };

  @SuppressLint("MissingPermission")
  public void openCameraId(Integer cameraId) {
    this.cameraId = cameraId;
    if (prepared) {
      HandlerThread cameraHandlerThread = new HandlerThread(TAG + " Id = " + cameraId);
      cameraHandlerThread.start();
      cameraHandler = new Handler(cameraHandlerThread.getLooper());
      try {
        cameraManager.openCamera(cameraId.toString(), this, cameraHandler);
        cameraCharacteristics = cameraManager.getCameraCharacteristics(Integer.toString(cameraId));
        isFrontCamera =
            (LENS_FACING_FRONT == cameraCharacteristics.get(CameraCharacteristics.LENS_FACING));
      } catch (CameraAccessException | SecurityException e) {
        Log.e(TAG, "Error", e);
      }
    } else {
      Log.e(TAG, "Camera2ApiManager need be prepared, Camera2ApiManager not enabled");
    }
  }

  public void switchCamera() {
    if (cameraDevice != null) {
      int cameraId = Integer.parseInt(cameraDevice.getId()) == 1 ? 0 : 1;
      closeCamera(false);
      prepared = true;
      openCameraId(cameraId);
    }
  }

  public void setZoom(MotionEvent event) {
    try {
      float maxZoom =
          (cameraCharacteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM)) * 10;
      Rect m = cameraCharacteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
      float currentFingerSpacing;

      if (event.getPointerCount() > 1) {
        // Multi touch logic
        currentFingerSpacing = CameraHelper.getFingerSpacing(event);
        if (fingerSpacing != 0) {
          if (currentFingerSpacing > fingerSpacing && maxZoom > zoomLevel) {
            zoomLevel++;
          } else if (currentFingerSpacing < fingerSpacing && zoomLevel > 1) {
            zoomLevel--;
          }
          int minW = (int) (m.width() / maxZoom);
          int minH = (int) (m.height() / maxZoom);
          int difW = m.width() - minW;
          int difH = m.height() - minH;
          int cropW = difW / 100 * zoomLevel;
          int cropH = difH / 100 * zoomLevel;
          cropW -= cropW & 3;
          cropH -= cropH & 3;
          Rect zoom = new Rect(cropW, cropH, m.width() - cropW, m.height() - cropH);
          if (builderPreview != null) builderPreview.set(CaptureRequest.SCALER_CROP_REGION, zoom);
          builderInputSurface.set(CaptureRequest.SCALER_CROP_REGION, zoom);
        }
        fingerSpacing = currentFingerSpacing;
      }
      if (builderPreview != null) {
        cameraCaptureSession.setRepeatingRequest(builderPreview.build(),
            faceDetectionEnabled ? cb : null, null);
      }
      cameraCaptureSession.setRepeatingRequest(builderInputSurface.build(),
          faceDetectionEnabled ? cb : null, null);
    } catch (CameraAccessException e) {
      Log.e(TAG, "Error", e);
    }
  }

  public boolean isFrontCamera() {
    return isFrontCamera;
  }

  public void closeCamera(boolean reOpen) {
    if (reOpen) {
      try {
        cameraCaptureSession.stopRepeating();
        if (surfaceView != null || textureView != null) {
          cameraCaptureSession.setRepeatingBurst(Collections.singletonList(drawSurface(preview)),
              null, cameraHandler);
        } else if (surfaceEncoder != null && isOpenGl) {
          cameraCaptureSession.setRepeatingBurst(
              Collections.singletonList(drawSurface(surfaceEncoder)), null, cameraHandler);
        }
      } catch (Exception e) {
        Log.e(TAG, "Error", e);
      }
    } else {
      if (cameraCaptureSession != null) {
        cameraCaptureSession.close();
        cameraCaptureSession = null;
      }
      if (cameraDevice != null) {
        cameraDevice.close();
        cameraDevice = null;
      }
      if (cameraHandler != null) {
        cameraHandler.getLooper().quitSafely();
        cameraHandler = null;
      }
      prepared = false;
      builderPreview = null;
      builderInputSurface = null;
    }
  }

  @Override
  public void onOpened(@NonNull CameraDevice cameraDevice) {
    this.cameraDevice = cameraDevice;
    startPreview(cameraDevice);
    Log.i(TAG, "Camera opened");
  }

  @Override
  public void onDisconnected(@NonNull CameraDevice cameraDevice) {
    cameraDevice.close();
    Log.i(TAG, "Camera disconnected");
  }

  @Override
  public void onError(@NonNull CameraDevice cameraDevice, int i) {
    cameraDevice.close();
    Log.e(TAG, "Open failed");
  }
}
HraD commented 5 years ago

Thanks for the code, but I still get "Configuration failed" in Camera2ApiManager.java's startPreview when useOpengl is false.

pedroSG94 commented 5 years ago

This is working for me on a Nexus 5 and a Samsung S7.

If the problem persists there is not much more I can do; maybe the problem is hardware compatibility with the Camera2 API.

HraD commented 5 years ago

Yes, it works with your sample app. In your case streamConfigurationMap.isOutputSupportedFor(surfaceEncoder) returns true; in my case it returns false, and I don't know why...
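For reference, the check mentioned here can be reproduced outside the library by asking the StreamConfigurationMap whether the encoder's input Surface, and its exact resolution, is a supported output. A minimal sketch, as an illustrative helper rather than library code:

import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.MediaCodec;
import android.util.Size;
import android.view.Surface;

public class OutputSupportCheck {
  // Returns true if the camera advertises support for this Surface and for
  // the exact encoder resolution among MediaCodec-consumable output sizes.
  public static boolean canEncodeAt(CameraManager manager, String cameraId,
      Surface encoderSurface, int width, int height) throws CameraAccessException {
    StreamConfigurationMap map = manager.getCameraCharacteristics(cameraId)
        .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    if (map == null || !map.isOutputSupportedFor(encoderSurface)) return false;
    for (Size size : map.getOutputSizes(MediaCodec.class)) {
      if (size.getWidth() == width && size.getHeight() == height) return true;
    }
    return false;
  }
}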

floriangbh commented 4 years ago

Hello, I'm also trying to record at a different quality than the stream with Camera2 (for example, record in 720p and stream in 360p). If I understand correctly, it is not possible to do this without modifying the source code of this library myself? Thanks

pedroSG94 commented 4 years ago

Yes, you are right. You need to modify the library code. You have an example posted above, but maybe you will need to fix something. I'm not adding this implementation to the library because you need to create 2 VideoEncoders to do it, which results in worse performance and is more difficult to maintain.
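For orientation, this is roughly how the modified Camera2Base posted above would be driven from an app. It is a fragment, assuming the usual imports (java.io.IOException, android.util.Log) and an app Context; the URL and path are placeholders. Note that in the test code the second (720p) encoder's output is only logged, so wiring it into a MediaMuxer is still up to you:

RtmpCamera2 rtmpCamera2 = new RtmpCamera2(App.getContext(), false, connectCheckerRtmp);
// Primary encoder at stream quality; the second encoder is hardcoded to 1280x720.
boolean videoOk = rtmpCamera2.prepareVideo(640, 480, 30, 1200 * 1024, false, 2, 90);
boolean audioOk = rtmpCamera2.prepareAudio();
if (videoOk && audioOk) {
  rtmpCamera2.startStream("rtmp://192.168.1.1:1935/live/pedroSG94");
  try {
    rtmpCamera2.startRecord("/sdcard/test.mp4");
  } catch (IOException e) {
    Log.e("Sample", "startRecord failed", e);
  }
}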

adderly commented 3 years ago

@pedroSG94 I was trying the proposed examples, and streaming was breaking. Is there a way to achieve this with the OpenGlView? Creating two encoders and so on.

I have an idea to copy the surface in the meantime, but it needs too much modification of the current base code, and it could be really slow that way.

pedroSG94 commented 3 years ago

> @pedroSG94 I was trying the proposed examples, and streaming was breaking. Is there a way to achieve this with the OpenGlView? Creating two encoders and so on.
>
> I have an idea to copy the surface in the meantime, but it needs too much modification of the current base code, and it could be really slow that way.

No, because OpenGlView only supports 2 surfaces (preview and encoder). You need to modify SurfaceManager to make it work and create a way to scale the image correctly for both encoders.
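To make that direction concrete, the core of such a modification is one EGL context rendering the same camera frame into two encoder input Surfaces, one per VideoEncoder. A minimal sketch of the idea, assuming GLES 2.0 and MediaCodec input surfaces; the class and method names are illustrative and this is not the library's SurfaceManager:

import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.view.Surface;

public class DualSurfaceEglHelper {
  private static final int EGL_RECORDABLE_ANDROID = 0x3142; // needed for MediaCodec surfaces
  private EGLDisplay display = EGL14.EGL_NO_DISPLAY;
  private EGLContext context = EGL14.EGL_NO_CONTEXT;
  private EGLSurface streamSurface = EGL14.EGL_NO_SURFACE;
  private EGLSurface recordSurface = EGL14.EGL_NO_SURFACE;

  // Create one EGL context and one window surface per encoder input Surface.
  public void init(Surface streamInput, Surface recordInput) {
    display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
    int[] version = new int[2];
    EGL14.eglInitialize(display, version, 0, version, 1);
    int[] attribs = {
        EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8, EGL14.EGL_BLUE_SIZE, 8,
        EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
        EGL_RECORDABLE_ANDROID, 1,
        EGL14.EGL_NONE
    };
    EGLConfig[] configs = new EGLConfig[1];
    int[] numConfigs = new int[1];
    EGL14.eglChooseConfig(display, attribs, 0, configs, 0, 1, numConfigs, 0);
    int[] ctxAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
    context = EGL14.eglCreateContext(display, configs[0], EGL14.EGL_NO_CONTEXT, ctxAttribs, 0);
    int[] surfAttribs = { EGL14.EGL_NONE };
    streamSurface = EGL14.eglCreateWindowSurface(display, configs[0], streamInput, surfAttribs, 0);
    recordSurface = EGL14.eglCreateWindowSurface(display, configs[0], recordInput, surfAttribs, 0);
  }

  // Draw the current camera frame once per encoder surface. The caller's
  // Runnable issues the GLES draw calls, scaled to each surface's size.
  public void drawToBoth(Runnable drawFrame) {
    for (EGLSurface target : new EGLSurface[] { streamSurface, recordSurface }) {
      EGL14.eglMakeCurrent(display, target, target, context);
      drawFrame.run();
      EGL14.eglSwapBuffers(display, target);
    }
  }
}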