pedroSG94 / RTSP-Server

Plugin of rtmp-rtsp-stream-client-java to stream directly to RTSP player.
Apache License 2.0

I can't use this library #112

Closed yubinjin closed 8 months ago

yubinjin commented 9 months ago

I can't use this library at version 1.2.0. I am using JDK 1.8, NDK 21.0, and Gradle 6.9+, but the library is not resolved:
Cannot resolve symbol 'ClientListener'
Cannot resolve symbol 'RtspServerCamera1'
Cannot resolve symbol 'ServerClient'

import com.pedro.rtspserver.ClientListener;
import com.pedro.rtspserver.RtspServerCamera1;
import com.pedro.rtspserver.ServerClient;

I must use JDK 1.8 because I am using the library in the usb-otg-camera project.

pedroSG94 commented 9 months ago

Hello,

I checked the Gradle setup and it is working properly. I tested by creating a new project that targets Java 8, and the library compiles. Did you check that you added the JitPack Maven repository correctly to your project?

yubinjin commented 9 months ago

Yes, I added it... but I still can't use the library... thank you.

pedroSG94 commented 9 months ago

Hello,

Please, if you have an active issue with the same error, don't open another issue or I will close it as a duplicate. This is the post from the other issue:

I configured it two ways. I am using the USB camera library (com.github.jiangdongguo:AndroidUSBCamera:2.3.4) in a RootEncoder project, but a JVM error happened. How can I use the library with RootEncoder?
1. First way 
  compileOptions {
    sourceCompatibility = JavaVersion.VERSION_1_8
    targetCompatibility = JavaVersion.VERSION_1_8
  }
  kotlinOptions {
    jvmTarget = "8"
  }
  implementation("com.github.jiangdongguo:AndroidUSBCamera:2.3.4")
- settings.gradle:
@Suppress("UnstableApiUsage")
dependencyResolutionManagement {
  repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
  repositories {
    google()
    //jcenter()
    mavenCentral()
    maven { url = uri("https://raw.github.com/saki4510t/libcommon/master/repository/") }
    maven { url = uri("https://jitpack.io") }
  }
}
But this way an error happened:
-> Gradle JVM version incompatible.
This project is configured to use an older Gradle JVM that supports up to version 8 but the current AGP requires a Gradle JVM that supports version 17.
2. Second way
  compileOptions {
    sourceCompatibility = JavaVersion.VERSION_17
    targetCompatibility = JavaVersion.VERSION_17
  }
  kotlinOptions {
    jvmTarget = "17"
  }
This way a similar error happened:
-> Invalid Gradle JDK configuration found.
Use Embedded JDK (/opt/android-studio-2023.1.1/android-studio/jbr)
Change Gradle JDK location
How can I use the library?

This error is not related to the library compilation. The problem is the JDK version used by your Android Studio. You will have this error if you use a new version of AGP and Java 17 in your project, whether or not you add my library as a dependency, because your IDE is using Java 8.

Did you try changing the JDK used by your IDE to a JDK 17? Try to do it (if I'm not wrong, you can click on Use embedded JDK) or change it yourself: https://stackoverflow.com/a/30631386
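If changing the IDE setting is not convenient, Gradle can also be pointed at a specific JDK via gradle.properties. This is a sketch only; the path below is an example and depends entirely on where JDK 17 is installed on your machine:

```properties
# gradle.properties -- example only; adjust to your local JDK 17 install path
org.gradle.java.home=/usr/lib/jvm/jdk-17
```

With this set, Gradle builds use that JDK regardless of which JVM the IDE itself runs on.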

yubinjin commented 8 months ago

Thank you. I have a question: can I use the RootEncoder project with JDK 8?

yubinjin commented 8 months ago

I am trying to integrate the rootencoder project with the Android-USB-OTG-Camera project from https://github.com/quantum6/Android-USB-OTG-Camera. My idea is to transfer the Android-USB-OTG-Camera project into rootencoder. Despite trying various methods, I've encountered an issue where rootencoder uses JDK 17, while the Android-USB-OTG-Camera project uses JDK 8. Is there a way for me to add and use the Android-USB-OTG-Camera within the rootencoder project, considering the JDK version difference?

pedroSG94 commented 8 months ago

Hello,

Yes, you can do it (you can use a USB library targeting JDK 8 in a project with JDK 17). Again, the problem is that you are using JDK 8 in Android Studio, so you can't compile any project using JDK 17 until you change it.

About USB: recently another user was able to implement a USB camera. You have the post here: https://github.com/pedroSG94/RTSP-Server/issues/110. Read the post; there is a project example ready to use in a comment.

yubinjin commented 8 months ago

ohh thank you so much!!!!

yubinjin commented 8 months ago

I need to modify the functionality in https://github.com/ronaldsampaio/USBStreaming so that, instead of streaming to an RTSP server, I can input an RTSP endpoint URL directly in RootEncoder and stream to this URL. Additionally, the stream displayed by RootEncoder should come from a USB camera, not the built-in camera. I am in the process of writing code to implement these changes, but I keep encountering errors. Can you help me with this?

pedroSG94 commented 8 months ago

Hello,

Replacing RtspServerStream with RtspStream is really easy. You only need to replace the RtspServerStream class with the RtspStream class and add the URL in the startStream method.
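The URL argument could be composed with a tiny helper like the following. This is only a sketch: the class and method names are made up for illustration and are not part of the library.

```java
// Hypothetical helper: composes the RTSP endpoint URL that would be passed
// to startStream(url). Host, port and stream name here are placeholders.
public class RtspUrls {
    public static String buildRtspUrl(String host, int port, String streamName) {
        if (streamName == null || streamName.isEmpty()) {
            throw new IllegalArgumentException("stream name must not be empty");
        }
        // RTSP endpoint format: rtsp://<host>:<port>/<streamName>
        return "rtsp://" + host + ":" + port + "/" + streamName;
    }
}
```

For example, startStream(RtspUrls.buildRtspUrl("192.168.0.9", 8554, "mystream")) would connect to rtsp://192.168.0.9:8554/mystream.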

pedroSG94 commented 8 months ago

I can't help you like that... What is not working? (Is the preview not showing? Does the stream not start? Did you check the server side to see if something is received? Did you get a relevant error in logcat, or a crash? etc.) Can you describe the error? Describe everything in detail, not just "the stream is not working".

yubinjin commented 8 months ago

Thank you so much!! I succeeded!!!! I'm so pleased!

pedroSG94 commented 8 months ago

Closing as complete

yubinjin commented 8 months ago

Hello, it's been a while since my last visit. I came back because I have a question! With RootEncoder, the camera stream appears full screen on the phone, but with USB camera streaming it does not fill the entire screen. Additionally, when the screen is rotated to landscape, the streaming stops! I want it to appear full screen, and when I rotate the phone to landscape I want the camera streaming to continue showing full screen, similar to how RootEncoder works, instead of stopping. Do you know what might be causing these issues?

This is my code:

@SuppressLint("UnusedMaterial3ScaffoldPaddingParameter")
@OptIn(ExperimentalMaterial3Api::class, ExperimentalPermissionsApi::class)
@Composable
fun USBCameraScreen() {
    val cameraPermissionState: PermissionState = rememberPermissionState(Manifest.permission.CAMERA)
    val permissions = listOf(
        Manifest.permission.CAMERA,
        // Manifest.permission.WRITE_EXTERNAL_STORAGE,
        // Manifest.permission.READ_EXTERNAL_STORAGE,
        Manifest.permission.READ_MEDIA_AUDIO,
        Manifest.permission.READ_MEDIA_VIDEO,
        Manifest.permission.READ_MEDIA_IMAGES,
        Manifest.permission.ACCESS_NETWORK_STATE,
        Manifest.permission.ACCESS_WIFI_STATE,
        Manifest.permission.INTERNET,
        Manifest.permission.RECORD_AUDIO,
    )
    val multiplePermissionsState = rememberMultiplePermissionsState(permissions)
    if (multiplePermissionsState.allPermissionsGranted) {
        BuildScreen(rememberCameraClient(context = LocalContext.current))
    } else {
        NoPermissionScreen(multiplePermissionsState::launchMultiplePermissionRequest)
    }
}

@Composable
fun rememberCameraClient(context: Context): CameraClient = remember {
    CameraClient.newBuilder(context).apply {
        setEnableGLES(true)
        setCameraStrategy(CameraUvcStrategy(context))
        setRawImage(false)
        setCameraRequest(
            CameraRequest.Builder()
                .setFrontCamera(false)
                .setPreviewWidth(1280)
                .setPreviewHeight(720)
                .create()
        )
        openDebug(true)
    }.build()
}

@OptIn(ExperimentalMaterial3Api::class)
@SuppressLint("UnusedMaterial3ScaffoldPaddingParameter")
@Composable
fun BuildScreen(
    cameraClient: CameraClient,
    usbCameraViewModel: USBCameraViewModel = koinViewModel()
) {
    cameraClient.addPreviewDataCallBack(object : IPreviewDataCallBack {
        init {
            Log.d("camera_streaming", "INSIDE PREVIEW CONSTRUCTOR!")
        }

        override fun onPreviewData(
            data: ByteArray?,
            width: Int,
            height: Int,
            format: IPreviewDataCallBack.DataFormat
        ) {
            //Log.d("camera_streaming", "PREVIEW DATA CALLBACK")
        }
    })

Scaffold(
    modifier = Modifier.fillMaxSize(),
    topBar = {
        TopAppBar(
            title = { Text("USB_Streaming", color = Color.White) },
            colors = TopAppBarDefaults.topAppBarColors(
                containerColor = Color.Black // set the TopAppBar background color
            )
        )
    }
) { innerPadding ->
    Box(modifier = Modifier.fillMaxSize().padding(innerPadding)) {
        AndroidView(
            factory = { context ->
                TextureView(context).apply {
                    surfaceTextureListener = object : TextureView.SurfaceTextureListener {
                        val _tag = "camera_streaming"
                        override fun onSurfaceTextureAvailable(
                            surface: SurfaceTexture,
                            width: Int,
                            height: Int
                        ) {
                            usbCameraViewModel.startVideoStreaming(cameraClient, this@apply)
                        }

                        override fun onSurfaceTextureSizeChanged(
                            surface: SurfaceTexture,
                            width: Int,
                            height: Int
                        ) {
                            Log.d(_tag, "onSurfaceTextureSIZECHANGED")
                            cameraClient.setRenderSize(width, height) // Ensure full screen by using width and height from this callback
                        }

                        override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean {
                            Log.d(_tag, "onSurfaceTextureDESTROYED")
                            cameraClient.closeCamera()
                            usbCameraViewModel.stopVideoStreaming()
                            return true
                        }

                        override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {
                            //Log.d(_tag, "onSurfaceTextureUPDATED")
                        }
                    }
                }
            },
            modifier = Modifier.fillMaxSize() // Make sure the AndroidView fills the available space
        )
    }
}

}

fun captureImage(cameraClient: CameraClient, context: Context){

cameraClient.captureImage(object : ICaptureCallBack {
    override fun onBegin() {
        Toast.makeText(context, "onBegin", Toast.LENGTH_SHORT).show()
        Log.i("CameraClient", "onBegin")

    }

    override fun onError(error: String?) {
        Toast.makeText(context, "onError", Toast.LENGTH_SHORT).show()
        Log.i("CameraClient", "onError")
    }

    override fun onComplete(path: String?) {
        Toast.makeText(context, "onComplete", Toast.LENGTH_SHORT).show()
        ToastUtils.show("OnComplete")
        Log.i("CameraClient", "onComplete")
    }
})

}

(screenshot attached: IMG_1272)

pedroSG94 commented 8 months ago

Hello,

About the preview: it is because the Camera APIs and the USB camera library work differently. With a USB camera you can't do exactly the same as with a normal camera, because the image would be distorted. Your result is the equivalent, and it is working as expected. You can try streaming in landscape to get a more filled preview, because the aspect ratio is closer in landscape. Or you can try using Fill instead of Adjust here: https://github.com/ronaldsampaio/USBStreaming/blob/master/app/src/main/java/com/aliger/usbstreaming/StreamingController.kt#L29. If you don't like either of those, you can select None and use a custom TextureView as the preview, adjusting the view as you want. You have a custom TextureView here: https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/view/AutoFitTextureView.java

About streaming after rotating the screen: you can't do it. That is the way the USB camera library works. The USB camera is bound to the activity lifecycle, so this is the expected behavior. I recommend you lock the activity orientation or modify the library as you need (I can't help you with that last part).

yubinjin commented 8 months ago

Hmm... this is so difficult... I can't modify the screen size... But thank you for your advice!! Thank you so much!

yubinjin commented 6 months ago

Help me please... This is my code:

/*

package com.aliger.usbstreaming.camera;

import android.Manifest;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.telephony.TelephonyManager;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;
import android.webkit.WebView;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.ContextCompat;

import com.aliger.usbstreaming.MyFusedLocationProvider;
import com.aliger.usbstreaming.R;
import com.jiangdg.ausbc.CameraClient;
import com.jiangdg.ausbc.camera.CameraUvcStrategy;
import com.jiangdg.ausbc.camera.bean.CameraRequest;
import com.pedro.common.ConnectChecker;
import com.pedro.encoder.input.video.CameraOpenException;
import com.pedro.library.rtsp.RtspCamera1;

import org.jetbrains.annotations.NotNull;

import java.io.File;

/**

pedroSG94 commented 6 months ago

Hello. In this method: rtspCamera1.getStreamClient().reTry(50, reason, null)

You can set a URL to connect to instead of null. Null reconnects to the original URL; you can set a different URL if you want a backup server.
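The primary-then-backup idea can be sketched as a small pure helper that decides which URL to pass to reTry on each failed attempt. All names and the per-server retry budget below are made up for illustration; they are not part of the library.

```java
// Hypothetical retry policy: retry the primary server a few times,
// then the backup server; return null once both budgets are exhausted.
public class RetryPolicy {
    public static String pickRetryUrl(int failedAttempts, String primaryUrl,
                                      String backupUrl, int triesPerServer) {
        if (failedAttempts < triesPerServer) return primaryUrl;     // keep trying server 1
        if (failedAttempts < triesPerServer * 2) return backupUrl;  // fall back to server 2
        return null;                                                // give up: caller stops the stream
    }
}
```

In onConnectionFailed you would keep a failure counter, pass the chosen URL to reTry, and stop the stream when the helper returns null.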


yubinjin commented 6 months ago

I have one more question.

package com.aliger.usbstreaming

import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.telephony.TelephonyManager
import android.util.Log
import android.view.SurfaceView
import android.view.TextureView
import android.widget.Toast
import androidx.core.content.ContextCompat
import com.jiangdg.ausbc.CameraClient
import com.pedro.common.ConnectChecker
import com.pedro.encoder.utils.gl.AspectRatioMode
import com.pedro.library.rtsp.RtspStream
import com.pedro.library.util.sources.audio.MicrophoneSource
import com.pedro.library.view.OrientationForced
import com.pedro.rtspserver.ClientListener
import com.pedro.rtspserver.ServerClient
import org.koin.core.annotation.Factory
import java.nio.ByteBuffer

@Factory
class StreamingController(private val context: Context) : ClientListener, ConnectChecker {
    private lateinit var rtspStream: RtspStream
    private var myFusedLocationProvider: MyFusedLocationProvider = MyFusedLocationProvider(context)

/*private var portNum = 18554*/
private var prepared = false

// Keep the existing code and add a method to check streaming state

fun setUpServer(cameraClient: CameraClient) {
    rtspStream = RtspStream(
        context,
        this,
        USBCameraSource(context,cameraClient),
        MicrophoneSource()
    )
    rtspStream.getGlInterface().setAspectRatioMode(AspectRatioMode.Adjust)
    rtspStream.getGlInterface().forceOrientation(OrientationForced.LANDSCAPE)
    //rtspStream.getGlInterface().setPreviewResolution(200, 200)
    prepared = rtspStream.prepareVideo(1280, 720, 4000000) && rtspStream.prepareAudio(48000,false,128000)
    //prepared = rtspStream.prepareVideo(1920, 1080, 4000000) && rtspStream.prepareAudio(48000, false, 128000)

    //rtspStream.changeVideoSource(usbCameraSource)
}

fun startStream() {
    if (::rtspStream.isInitialized && prepared) {
        val telephonyManager = context.getSystemService(Context.TELEPHONY_SERVICE) as? TelephonyManager
        var streamName = getStreamName() // default stream name

        /*
        // Check the READ_PHONE_STATE permission
        if (ContextCompat.checkSelfPermission(context, Manifest.permission.READ_PHONE_STATE) == PackageManager.PERMISSION_GRANTED) {
            val phoneNumber = telephonyManager?.line1Number
            // If the phone number was retrieved successfully, set it as the stream name
            phoneNumber?.let {
                if (it.startsWith("+8210")) {
                    streamName = it.replace("+8210", "010") // convert the +82 country code to 010
                } else {
                    streamName = it
                }
            }
        }
        */

        // start the stream
        val url = "rtsp://192.168.0.9:8555/$streamName"
        rtspStream.startStream(url)
        Log.d("Streaming", "Stream started: $url")
        myFusedLocationProvider.setStreamName(streamName)

        myFusedLocationProvider.requestLocationUpdates()

        //myFusedLocationProvider.requestLocationUpdates()

    } else {
        Log.d("Streaming", "rtspStream has not been initialized or not prepared")
        //Toast.makeText(context, "rtspStream has not been initialized or not prepared", Toast.LENGTH_LONG).show()
    }
}
fun changeVideoSource(usbCameraSource: USBCameraSource) {
    //rtspStream.changeVideoSource(usbCameraSource)
    rtspStream.changeVideoSource(usbCameraSource)
}
private fun getStreamName(): String {
    val telephonyManager = context.getSystemService(Context.TELEPHONY_SERVICE) as? TelephonyManager
    if (ContextCompat.checkSelfPermission(context, Manifest.permission.READ_PHONE_STATE) == PackageManager.PERMISSION_GRANTED) {
        val phoneNumber = telephonyManager?.line1Number
        return phoneNumber?.let {
            if (it.startsWith("+8210")) it.replace("+8210", "010") else it
        } ?: "mystream2"  // Default stream name if no phone number
    }
    return "mystream2"  // Default stream name if permission not granted
}

private fun decodeSpsPpsFromBuffer(
    outputBuffer: ByteBuffer,
    length: Int
): Pair<ByteBuffer, ByteBuffer>? {
    val csd = ByteArray(length)
    outputBuffer[csd, 0, length]
    var i = 0
    var spsIndex = -1
    var ppsIndex = -1
    while (i < length - 4) {
        if (csd[i].toInt() == 0 && csd[i + 1].toInt() == 0 && csd[i + 2].toInt() == 0 && csd[i + 3].toInt() == 1) {
            if (spsIndex == -1) {
                spsIndex = i
            } else {
                ppsIndex = i
                break
            }
        }
        i++
    }
    if (spsIndex != -1 && ppsIndex != -1) {
        // copy only the SPS bytes (from spsIndex up to ppsIndex)
        val sps = ByteArray(ppsIndex - spsIndex)
        System.arraycopy(csd, spsIndex, sps, 0, ppsIndex - spsIndex)
        val pps = ByteArray(length - ppsIndex)
        System.arraycopy(csd, ppsIndex, pps, 0, length - ppsIndex)
        return Pair(ByteBuffer.wrap(sps), ByteBuffer.wrap(pps))
    }
    return null
}

fun startPreview(textureView: TextureView) {
    if (prepared && !rtspStream.isOnPreview) {
        rtspStream.startPreview(textureView)
        myFusedLocationProvider.requestLocationUpdates()
    }
}

fun startPreview(surfaceView: SurfaceView) {
    if (prepared && !rtspStream.isOnPreview) rtspStream.startPreview(surfaceView)
}

fun stopStream(){
    if (rtspStream.isStreaming) {
        rtspStream.stopStream()
        myFusedLocationProvider.removeLocationUpdates()
    }
}

fun stopPreview() {
    if (rtspStream.isOnPreview) rtspStream.stopPreview()
}

fun release() {
    rtspStream.release()
}
init {
    // create the MyFusedLocationProvider instance
}
//FROM ClientListener()
override fun onClientConnected(client: ServerClient) {
    Log.d("Streaming","Client Connected: ${client.name}")

}

override fun onClientDisconnected(client: ServerClient) {
    Log.d("Streaming","Client Disconnected: ${client.name}")
}

//FROM ConnectChecker()
override fun onAuthError() {
    Log.e("Streaming","Auth ERROR!")
}

override fun onAuthSuccess() {
    Log.d("Streaming","Auth Success")
}

override fun onConnectionFailed(reason: String) {
    Log.e("Streaming", "Connection failed: $reason")
    val streamName = getStreamName()
    if (rtspStream.getStreamClient()
            .reTry(5000, reason, "rtsp://192.168.0.9:8554/$streamName")
    ) {

        Toast.makeText(context, "Retrying with secondary server: $streamName", Toast.LENGTH_SHORT).show()
        myFusedLocationProvider.setStreamName(streamName)
        myFusedLocationProvider.requestLocationUpdates()
    } else {
        Toast.makeText(context, "Connection permanently failed: $reason", Toast.LENGTH_SHORT).show()
        myFusedLocationProvider.removeLocationUpdates()
        rtspStream.stopStream()
    }
}

override fun onConnectionStarted(url: String) {
    Log.d("Streaming","Connection Started with: $url")
}

override fun onConnectionSuccess() {
    Log.d("Streaming","Connection Successful!")
}

override fun onDisconnect() {
    Log.d("Streaming","onDisconnect!")
}

override fun onNewBitrate(bitrate: Long) {
}

}

Is this code doing the same thing too?

yubinjin commented 6 months ago

Please help me... I don't understand this...

yubinjin commented 6 months ago

I am currently streaming to Server 1. If the stream gets interrupted due to some issue, I want to retry connecting to Server 1 first, and if that doesn't work, I want to try connecting to Server 2. Additionally, I want the streaming to continue in the background without being destroyed by incoming calls or messages, even if I haven't pressed the stop-streaming button. How can I modify the code to achieve this?

/*

package com.aliger.usbstreaming.camera;

import android.Manifest;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.os.Handler;
import android.telephony.TelephonyManager;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;
import android.webkit.WebView;
import android.widget.Button;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.content.ContextCompat;

import com.aliger.usbstreaming.MyFusedLocationProvider;
import com.aliger.usbstreaming.R;
import com.jiangdg.ausbc.CameraClient;
import com.jiangdg.ausbc.camera.CameraUvcStrategy;
import com.jiangdg.ausbc.camera.bean.CameraRequest;
import com.pedro.common.ConnectChecker;
import com.pedro.encoder.input.video.CameraOpenException;
import com.pedro.library.rtsp.RtspCamera1;
import com.pedro.rtsp.rtsp.Protocol;

import org.jetbrains.annotations.NotNull;

import java.io.File;

/**

yubinjin commented 6 months ago

The second question is: can't I chain two rtspCamera1.getStreamClient().reTry calls like in the code above? Why does execution go through the if branch of rtspCamera1.getStreamClient().reTry and not the reTry call inside the else branch? Where does control go once reTry finishes and the connection is established? And where does it go when it can't connect?