pedroSG94 / RTSP-Server

Plugin of rtmp-rtsp-stream-client-java to stream directly to RTSP player.
Apache License 2.0

Streaming using Iristick headset #115

Closed ronaldsampaio closed 9 months ago

ronaldsampaio commented 9 months ago

Hello! I have a proprietary hardware camera that comes with an SDK for working with the cameras. I'm working toward streaming the video feed as I did with a previous USB camera. The SDK documentation gives me this information:

Real-time streaming

Real-time streaming can be performed by using third-party libraries that allow the input video feed to be provided frame by frame. The SDK includes the STREAMING profile for dealing with this use case. You will want to add an output surface, such as an ImageReader, to receive each captured frame.

and this code example:

private ImageReader mImageReader;
private TextureView mPreview;

private HandlerThread mBackgroundThread;

@Override
public void onHeadsetAvailable(@NonNull IRIHeadset headset) {
    final IRICamera camera = headset.findCamera(IRICameraType.WIDE_ANGLE);
    if (camera != null) {
        if (mImageReader == null) {
            mImageReader = ImageReader.newInstance(1280, 720, ImageFormat.YV12, 2);
            mImageReader.setOnImageAvailableListener(this::onImageAvailable,
                                                     new Handler(mBackgroundThread.getLooper()));
        }
        camera.openSession(getLifecycle(), IRICameraProfile.STREAMING, it -> {
            it.addOutput(mImageReader.getSurface());
            it.addOutput(mPreview); // optional
            it.setFrameSize(1280, 720);
        });
    }
}

private void onImageAvailable(ImageReader reader) {
    try (Image image = reader.acquireLatestImage()) {
        if (image != null) {
            // TODO: do something with the image
        }
    }
}
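For reference, ImageFormat.YV12 is planar 4:2:0 YUV (a full-resolution Y plane followed by quarter-resolution V and U planes), so each frame the ImageReader above delivers carries roughly width × height × 1.5 bytes split across three planes. A quick sanity check of the expected buffer size, assuming the stride alignment documented for ImageFormat.YV12 (plain Java, no Android classes needed):

```java
// Buffer size for ImageFormat.YV12 per the documented layout:
//   stride   = ALIGN(width, 16)
//   y_size   = stride * height
//   c_stride = ALIGN(stride / 2, 16)
//   c_size   = c_stride * height / 2
//   size     = y_size + 2 * c_size   (Y plane + V plane + U plane)
public class Yv12Size {
    static int align(int x, int a) { return ((x + a - 1) / a) * a; }

    public static int bufferSize(int width, int height) {
        int yStride = align(width, 16);
        int ySize = yStride * height;
        int cStride = align(yStride / 2, 16);
        int cSize = cStride * height / 2;
        return ySize + 2 * cSize;
    }
}
```

At 1280x720 this works out to 1,382,400 bytes per frame (exactly width × height × 1.5, since 1280 is already 16-aligned).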

I created an IristickCameraSource:

class IristickCameraSource(private val camera: IRICamera) : VideoSource() {

    private lateinit var surface: Surface

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        return true
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        surface = Surface(surfaceTexture)
        camera.openSession(null, IRICameraProfile.STREAMING) {
            it.addOutput(surface)
            it.setFrameSize(1280, 720)
        }
    }
    }

    override fun isRunning(): Boolean {
        return true
    }

    override fun release() {
        surface.release()
    }

    override fun stop() {
    }

    fun getSurface() : Surface{
        return surface
    }

}

and used their SDK to set things up when the camera is connected:

    override fun onHeadsetAvailable(headset: IRIHeadset) {
        camera = headset.findCamera(IRICameraType.WIDE_ANGLE)!!
        if (camera!=null) Log.d("Streaming", "GOT CAMERA!")
        loadDevices()
    }
    fun loadDevices(){
        rtspServerStream = RtspServerStream(context,portNum,this,IristickCameraSource(camera),MicrophoneSource())
        val prepared = rtspServerStream.prepareVideo(1280, 720, 4000000) && rtspServerStream.prepareAudio(48000,false,128000)
        if (prepared){
            Log.d("Streaming","Server Prepared!")
            Log.d("Streaming","Trying to start STREAM...")
            rtspServerStream.startStream()

        }
        else Log.d("Streaming", "Server not prepared!")

        _isLoading.value = false
    }

I'm getting the usual Connection Failed: video info is null error, meaning that it is not receiving video data. What do you think would be the best approach here? Is there a way to use this ImageReader to send video?

pedroSG94 commented 9 months ago

Hello,

Maybe the problem is related to the lifecycle being set to null. If that lifecycle is a LifecycleOwner, you can add it like in CameraXSource: https://github.com/pedroSG94/RootEncoder/blob/master/app/src/main/java/com/pedro/streamer/rotation/CameraXSource.kt#L48

About using the ImageReader, you can try something like this:

@RequiresApi(Build.VERSION_CODES.KITKAT)
class ImageReaderSource(
  private val imageReader: ImageReader,
  context: Context
): VideoSource() {

  private val glStreamInterface = GlStreamInterface(context)

  override fun create(width: Int, height: Int, fps: Int): Boolean {
    this.width = width
    this.height = height
    glStreamInterface.addMediaCodecSurface(imageReader.surface)
    created = true
    return true
  }

  override fun start(surfaceTexture: SurfaceTexture) {
    glStreamInterface.attachPreview(Surface(surfaceTexture))
    glStreamInterface.setPreviewResolution(width, height)
    glStreamInterface.start()
  }

  override fun stop() {
    glStreamInterface.removeMediaCodecSurface()
    glStreamInterface.deAttachPreview()
    glStreamInterface.stop()
  }

  override fun release() {
  }

  override fun isRunning(): Boolean = glStreamInterface.running
}

Let me explain the idea. The idea is to capture frames from the ImageReader surface using GlStreamInterface and copy them to the SurfaceTexture provided by VideoSource. Also, you can try moving glStreamInterface.addMediaCodecSurface after glStreamInterface.setPreviewResolution.

This is untested, so I don't know if it will work. Let me know the result.

ronaldsampaio commented 9 months ago

OK, so I tried to implement it like this, with a MainPageViewModel that is initialized with the page:

@SuppressLint("StaticFieldLeak")
@KoinViewModel
class MainPageViewModel(private val context: Context) : ViewModel(),IRIListener, ConnectChecker, ClientListener{
    private lateinit var rtspServerStream: RtspServerStream
    private val portNum = 18550
    private lateinit var  camera : IRICamera
    private var imageReader : ImageReader = ImageReader.newInstance(1280, 720, ImageFormat.YV12, 2)

    private val _isLoading = MutableStateFlow(true)
    val isLoading = _isLoading.asStateFlow()

    init {
        IristickSDK.registerListener(null,this)
        Log.d("Streaming","CREATED MainPageViewModel!")
    }

    fun loadDevices(){
        rtspServerStream = RtspServerStream(context,portNum,this,IristickCameraSource(camera, context, imageReader),MicrophoneSource())
        val prepared = rtspServerStream.prepareVideo(1280, 720, 4000000) && rtspServerStream.prepareAudio(48000,false,128000)
        if (prepared){
            Log.d("Streaming","Server Prepared!")
            Log.d("Streaming","Trying to start STREAM...")
            rtspServerStream.startStream()

        }
        else Log.d("Streaming", "Server not prepared!")
        _isLoading.value = false
    }

    override fun onHeadsetAvailable(headset: IRIHeadset) {
        camera = headset.findCamera(IRICameraType.WIDE_ANGLE)!!
        if (camera!=null) Log.d("Streaming", "GOT CAMERA!")
        loadDevices()
    }

(...)

}

The IristickCameraSource:

class IristickCameraSource(
    private val camera: IRICamera,
    context: Context,
    private val imageReader: ImageReader
) : VideoSource() , LifecycleOwner{

    private lateinit var surface: Surface
    private val lifecycleRegistry = LifecycleRegistry(this)

    private val glStreamInterface = GlStreamInterface(context)

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        this.width = width
        this.height = height
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_CREATE)
        created = true
        return true
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
        Log.d("Streaming","Trying to Start GlStreamInterface... ")
        glStreamInterface.attachPreview(Surface(surfaceTexture))
        glStreamInterface.setPreviewResolution(width, height)
        glStreamInterface.addMediaCodecSurface(imageReader.surface)
        glStreamInterface.start()
        Log.d("Streaming","Started GlStreamInterface")
        camera.openSession(this.lifecycleRegistry, IRICameraProfile.STREAMING) {
            it.addOutput(imageReader.surface)
            it.setFrameSize(this.width,this.height)
        }
    }

    override fun isRunning(): Boolean = glStreamInterface.running

    override fun release() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
        surface.release()
    }

    override fun stop() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_STOP)
        glStreamInterface.removeMediaCodecSurface()
        glStreamInterface.deAttachPreview()
        glStreamInterface.stop()
    }

    fun getSurface() : Surface{
        return surface
    }

    override val lifecycle: Lifecycle
        get() = lifecycleRegistry

}

I'm getting the error shown in the attached screenshot.

pedroSG94 commented 9 months ago

Hello,

First, you can try your originally suggested way but using a LifecycleOwner as suggested (I implemented the LifecycleRegistry and fixed the isRunning method):

class IristickCameraSource(private val camera: IRICamera) : VideoSource(), LifecycleOwner {

    private lateinit var surface: Surface
    private val lifecycleRegistry = LifecycleRegistry(this)
    private var running = false

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        this.width = width
        this.height = height
        this.fps = fps
        created = true
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_CREATE)
        return true
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        surface  = Surface(surfaceTexture)
        camera.openSession(lifecycleRegistry, IRICameraProfile.STREAMING) {
            it.addOutput(surface)
            it.setFrameSize(width, height)
        }
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
        running = true;
    }

    override fun isRunning(): Boolean {
        return running
    }

    override fun release() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
    }

    override fun stop() {
        running = false
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_STOP)
    }

    fun getSurface() : Surface{
        return surface
    }

    override val lifecycle: Lifecycle
        get() = lifecycleRegistry

}

About your last implementation: try calling setEncoderSize on GlStreamInterface in the create method with that method's parameters, and move lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START) to the last line of the start method.

If it is not working, I will try experimenting with the ImageReader, because I'm not sure whether using it like that is possible. But I think a VideoSource without GlStreamInterface is better if it is possible, and it feels possible according to the code example using an external preview.

ronaldsampaio commented 9 months ago

Hey. So both implementations are now giving the Connection Failed: video info is null error. The first is the one I suggested but with your lifecycle modifications; the second is the same as my second implementation, but with glStreamInterface.setEncoderSize(width, height) inside the create method and lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START) moved to the end of the start method.

pedroSG94 commented 9 months ago

I think we should focus on the first solution, because if it works with an ImageReader, it should work with a SurfaceTexture. We can check whether it still works with an ImageReader inside the VideoSource we created, to rule out problems with this:

class IristickCameraSource(private val camera: IRICamera) : VideoSource(), LifecycleOwner {

  private val lifecycleRegistry = LifecycleRegistry(this)
  private var imageReader : ImageReader = ImageReader.newInstance(1280, 720, ImageFormat.YV12, 2)
  private var running = false

  init {
    // A background HandlerThread must be created and started for the listener callback
    val handlerThread = HandlerThread("reader").apply { start() }
    imageReader.setOnImageAvailableListener({
      Log.e("Test", "rendering????")
    }, Handler(handlerThread.looper))
  }
  override fun create(width: Int, height: Int, fps: Int): Boolean {
    this.width = width
    this.height = height
    this.fps = fps
    created = true
    lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_CREATE)
    return true
  }

  override fun start(surfaceTexture: SurfaceTexture) {
    //val surface  = Surface(surfaceTexture)
    camera.openSession(lifecycleRegistry, IRICameraProfile.STREAMING) {
      it.addOutput(imageReader.surface)
      it.setFrameSize(width, height)
    }
    lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
    running = true;
  }

  override fun isRunning(): Boolean {
    return running
  }

  override fun release() {
    lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
  }

  override fun stop() {
    running = false
    lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_STOP)
  }

  override val lifecycle: Lifecycle
    get() = lifecycleRegistry

}

According to the first post, this should work and write "rendering????" in logcat. If that is not the case, then the problem is not related to the SurfaceTexture itself. I have no access to the documentation, which might be useful for finding something related to these classes and another way to do it.

pedroSG94 commented 9 months ago

Another possible VideoSource to test is using Canvas:

class CanvasSource(imageReader: ImageReader): VideoSource() {

  private val paint = Paint()
  private val rect = Rect()
  private var running = false
  private var surface: Surface? = null
  private var bitmap: Bitmap? = null

  init {
    val handlerThread = HandlerThread("reader")
    handlerThread.start()
    imageReader.setOnImageAvailableListener({ reader ->
      var image: Image? = null
      try {
        image = reader.acquireLatestImage()
        if (image != null) {
          val buffer = image.planes[0].buffer
          render(buffer)
        }
      } finally {
        image?.close()
      }
    }, Handler(handlerThread.looper))
  }

  override fun create(width: Int, height: Int, fps: Int): Boolean {
    this.width = width
    this.height = height
    this.fps = fps
    bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    rect.set(0, 0, width, height)
    created = true
    return true
  }

  override fun start(surfaceTexture: SurfaceTexture) {
    surface = Surface(surfaceTexture)
    running = true
  }

  override fun stop() {
    running = false
    surface?.release()
  }

  override fun release() {
    bitmap?.recycle()
    bitmap = null
  }

  override fun isRunning(): Boolean = running

  /**
   * You can call this from outside, in the ImageReader listener from the first code example.
   * But remember to remove the listener inside this class if you want to call it from outside.
   */
  fun render(byteBuffer: ByteBuffer) {
    bitmap?.let { bitmap ->
      if (bitmap.isRecycled || !running) return
      byteBuffer.rewind()
      bitmap.copyPixelsFromBuffer(byteBuffer)

      val canvas = if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
        surface?.lockHardwareCanvas() ?: return
      } else {
        surface?.lockCanvas(null) ?: return
      }
      canvas.drawBitmap(bitmap, null, rect, paint)
      surface?.unlockCanvasAndPost(canvas)
    }
  }
}

The problem with this solution could be the performance. If the resolution is 1080p or less, a decent CPU should be fine.
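One caveat worth noting with the copyPixelsFromBuffer call above: Bitmap expects tightly packed rows, while an Image plane may be row-padded (rowStride greater than width × pixelStride), in which case the buffer has to be compacted first. A hypothetical helper sketching that compaction (plain ByteBuffers stand in for the Android plane here; PlaneCompactor is not part of the library):

```java
import java.nio.ByteBuffer;

// Copies a row-padded image plane into a tightly packed buffer, so it matches
// the layout Bitmap.copyPixelsFromBuffer expects. Works row by row: seek to the
// start of each source row (skipping padding) and append only the pixel bytes.
public class PlaneCompactor {
    public static ByteBuffer compact(ByteBuffer src, int width, int height,
                                     int pixelStride, int rowStride) {
        int rowBytes = width * pixelStride;
        ByteBuffer dst = ByteBuffer.allocate(rowBytes * height);
        byte[] row = new byte[rowBytes];
        for (int y = 0; y < height; y++) {
            src.position(y * rowStride); // jump past any per-row padding
            src.get(row, 0, rowBytes);
            dst.put(row);
        }
        dst.rewind();
        return dst;
    }
}
```

For scale on the copy cost: a 1280x720 ARGB_8888 frame is about 3.7 MB, so 30 fps means roughly 110 MB/s of copying, which is why this approach gets risky above 1080p.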

ronaldsampaio commented 9 months ago

> I think we should focus on the first solution [...] According to the first post, this should work and write "rendering????" in logcat.

So here "rendering????" is not being shown. For some reason, it looks like camera.openSession(...) is not working inside the IristickCameraSource, only outside of it. I mean, the camera isn't "turning on" correctly (no LED showing activity). I'm investigating and will try to open the session outside and test the GlStreamInterface solution. Thanks for the help; I'll give some feedback soon.

ronaldsampaio commented 9 months ago

I was able to make it happen! The problem really was with camera.openSession(...), which doesn't open when called from the start or create methods of IristickCameraSource. So I just opened the session outside the video source and used the Surface created from the surfaceTexture. Here is the solution:

@SuppressLint("StaticFieldLeak")
@KoinViewModel
class MainPageViewModel(private val context: Context) : ViewModel(),IRIListener, ConnectChecker, ClientListener{
    private lateinit var rtspServerStream: RtspServerStream
    private val portNum = 18550
    private lateinit var  camera : IRICamera

    private val _isLoading = MutableStateFlow(true)
    val isLoading = _isLoading.asStateFlow()

    init {
        IristickSDK.registerListener(null,this)
    }

    @OptIn(Experimental::class) private fun startServer(){
        val iristickCameraSource = IristickCameraSource()
        rtspServerStream = RtspServerStream(context,portNum,this,iristickCameraSource,MicrophoneSource())
        val prepared = rtspServerStream.prepareVideo(1280, 720, 4000000) && rtspServerStream.prepareAudio(48000,false,128000)
        if (prepared){
            Log.d("Streaming","Trying to start STREAM...")
            rtspServerStream.startStream()

        }
        else Log.d("Streaming", "Server not prepared!")
        camera.openSession(iristickCameraSource.lifecycle, IRICameraProfile.STREAMING) {
            it.addOutput(iristickCameraSource.getSurface())
            it.setFrameSize(iristickCameraSource.width, iristickCameraSource.height)
            it.setFrameRate(iristickCameraSource.fps.toFloat())
            it.setOnReadyListener({Log.d("Streaming", "Streaming session opened")})
        }
        _isLoading.value = false
    }

    override fun onHeadsetAvailable(headset: IRIHeadset) {
        camera = headset.findCamera(IRICameraType.WIDE_ANGLE)!!
        if (camera!=null) startServer()
    }
}
class IristickCameraSource() : VideoSource(), LifecycleOwner {
    private val lifecycleRegistry = LifecycleRegistry(this)
    private var running = false
    private lateinit var surface : Surface

    fun getSurface() : Surface{
        return surface
    }

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        this.width = width
        this.height = height
        this.fps = fps
        created = true
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_CREATE)
        return true
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        this.surfaceTexture = surfaceTexture
        surface  = Surface(surfaceTexture)
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
        running = true;
    }

    override fun isRunning(): Boolean {
        return running
    }

    override fun release() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
    }

    override fun stop() {
        running = false
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_STOP)
    }

    override val lifecycle: Lifecycle
        get() = lifecycleRegistry

}

There are still some lifecycle problems, but I'll solve them later.