With the above code, I only get:
```
11-28 23:19:54.021 19162-19162/com.codesilk.backgroundservicertmp D/cdsk: VideoStreamServiceCreated
11-28 23:19:59.301 19162-19162/com.codesilk.backgroundservicertmp D/cdsk: RTMP connecting
11-28 23:19:59.401 19162-19162/com.codesilk.backgroundservicertmp D/cdsk: RTMP connected
```
There is nothing streamed to the server :). The "RTMP connecting" and "RTMP connected" logs come from the `onRtmpConnecting` and `onRtmpConnected` callbacks. No other callbacks are invoked, for some reason.
Update: it turns out that the `mPublisher.setRecordHandler` call was missing from my code, so I have updated it as follows:
```java
// Preview started - we need to start streaming
mPublisher = new SrsPublisher(new SrsCameraView(this));
mPublisher.setEncodeHandler(new SrsEncodeHandler(this));
mPublisher.setRtmpHandler(new RtmpHandler(this));
mPublisher.setRecordHandler(new SrsRecordHandler(this));
// Set output resolution
mPublisher.setOutputResolution(320, 240);
mPublisher.setVideoSmoothMode();
mPublisher.startPublish("rtmp://192.168.1.56:1935/live/stream");
```
Now, I have removed the `setPreviewSize` call from `mPublisher`, because on a Samsung Galaxy S6 Edge the `setParameters` call to the camera in `SrsCameraView.java` was failing. I have bypassed this by adding the following in `SrsCameraView.java`:
```java
...
// Use a preview size the device actually supports; here I simply take the
// last entry of getSupportedPreviewSizes().
List<Camera.Size> mCameraSizes = params.getSupportedPreviewSizes();
Camera.Size mCameraSize = mCameraSizes.get(mCameraSizes.size() - 1);
params.setPreviewSize(mCameraSize.width, mCameraSize.height);
...
```
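Since the ordering of `getSupportedPreviewSizes()` is device-specific, taking the last entry can give a very different resolution from one phone to the next. A safer variant of the same workaround might be to pick the supported size closest to the requested one; here is a small sketch, where `closestPreviewSize` is a hypothetical helper, not part of yasea:

```java
import android.hardware.Camera;
import java.util.List;

// Pick the supported preview size whose pixel count is closest to the
// requested width x height; getSupportedPreviewSizes() ordering varies
// between devices, so never rely on a fixed index.
private static Camera.Size closestPreviewSize(Camera.Parameters params, int width, int height) {
    List<Camera.Size> sizes = params.getSupportedPreviewSizes();
    Camera.Size best = sizes.get(0);
    long bestDiff = Long.MAX_VALUE;
    for (Camera.Size size : sizes) {
        long diff = Math.abs((long) size.width * size.height - (long) width * height);
        if (diff < bestDiff) {
            bestDiff = diff;
            best = size;
        }
    }
    return best;
}
```

With that in place, `params.setPreviewSize(...)` would be called with the result of `closestPreviewSize(params, 320, 240)`, matching the output resolution set on the publisher.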
By making those changes, my background service runs pretty well and an RTMP stream is sent to the server. The problem now is that only audio is streamed; the video is not streamed at all, and at this point I am blocked 😞
EDIT: I have tried different indices into `mCameraSizes` above. As you can see in the code, I was taking the last camera size; it behaves the same with any index from the supported preview sizes.
As far as I can see in `SrsCameraView.java`, a visible surface is needed in the application in order to get the camera preview. I concluded this from the following lines of code:
```java
...
try {
    mCamera.setPreviewDisplay(getHolder());
} catch (IOException e) {
    e.printStackTrace();
}
mCamera.startPreview();
...
```
From my research on getting a camera preview in the background, it seems that the only working solution is to use a `SurfaceTexture` instead of a `SurfaceHolder`.
Does this mean that for a background service I would need to change the entire `SrsCameraView.java` to use a `SurfaceTexture` instead of a `SurfaceHolder`?
I am asking because I am a web developer, not an Android developer, and I would like to know if there is an easier solution. If not, the `SurfaceTexture` seems to be the way to go for me :)
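For context, the usual trick for grabbing preview frames without any visible view is to bind the camera to an off-screen `SurfaceTexture`. Below is a minimal sketch of that idea, assuming the deprecated `android.hardware.Camera` API that this version of yasea uses; the class name `OffscreenPreview` is a placeholder, and the GL texture name passed to the `SurfaceTexture` constructor is arbitrary because nothing ever renders it:

```java
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import java.io.IOException;

public class OffscreenPreview {
    private Camera mCamera;
    private SurfaceTexture mDummyTexture;

    // Attach the camera to an off-screen SurfaceTexture so preview frames
    // arrive in the callback without any visible SurfaceView.
    public void start(final Camera.PreviewCallback frameSink) throws IOException {
        mCamera = Camera.open();
        Camera.Parameters params = mCamera.getParameters();
        params.setPreviewFormat(ImageFormat.NV21); // YUV frames, as yasea's encoder consumes
        mCamera.setParameters(params);

        mDummyTexture = new SurfaceTexture(10); // texture name is arbitrary, never drawn
        mCamera.setPreviewTexture(mDummyTexture);

        // Pre-allocate one NV21 buffer: width * height * 3 / 2 bytes.
        Camera.Size size = mCamera.getParameters().getPreviewSize();
        mCamera.addCallbackBuffer(new byte[size.width * size.height * 3 / 2]);
        mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                frameSink.onPreviewFrame(data, camera); // hand the frame on
                camera.addCallbackBuffer(data);         // recycle the buffer
            }
        });
        mCamera.startPreview();
    }
}
```

The frames delivered to `frameSink` could then be fed to the encoder the same way `SrsCameraView` feeds them today.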
In fact, `SurfaceTexture` is used as the camera display holder in the gpuimage branch.
Even so, I do not think you can stream video from a background service, since the `onPause` method in `MainActivity` will have been invoked. I suggest you look at the Activity lifecycle.
I will check that branch and see if it works. I know that the surface holder gets removed when the activity is paused or killed, but it seems that people do get the camera preview in background services for recording by using a surface texture.
I will give it a try and let you know.
Thanks
@begeekmyfriend, I have managed to do the following so far. I changed the class declaration to:

```java
public class SrsCameraView extends SurfaceTexture implements Camera.PreviewCallback {
```

and in the `startCamera` function I added the following code to bind the camera to the `SurfaceTexture`:
```java
...
try {
    Log.d("cdsk", "setting preview texture");
    mCamera.setPreviewTexture(this);
    mCamera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] bytes, Camera camera) {
            Log.d("cdsk", "OnPreviewFrame called from texture");
            mPrevCb.onGetYuvFrame(bytes);
            camera.addCallbackBuffer(mYuvPreviewFrame);
        }
    });
} catch (IOException e) {
    e.printStackTrace();
}
Log.d("cdsk", "starting cam preview");
mCamera.startPreview();
...
```
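One caveat about the snippet above: `Camera.addCallbackBuffer` only has an effect when the callback is registered with `setPreviewCallbackWithBuffer`; with plain `setPreviewCallback` the queued buffers are ignored, and handing it a buffer that was never allocated (a null `mYuvPreviewFrame`) is exactly what produces the `E/Camera-JNI: Null byte array!` warning that shows up below.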
This seems to be good: the `onGetYuvFrame` callback on the `SrsPublisher` object now receives data. The only problem is that I now get a segmentation fault:
```
11-29 16:09:55.077 31398-31398/com.codesilk.backgroundservicertmp A/libc: Fatal signal 11 (SIGSEGV), code 1, fault addr 0xdf29c000 in tid 31398 (oundservicertmp)
```
And also a message that says `E/Camera-JNI: Null byte array!`. Do you have any idea about this?
UPDATE: Here is some more info from the monitor:
```
11-29 16:09:55.127 27579-27579/? A/DEBUG: pid: 31398, tid: 31398, name: oundservicertmp >>> com.codesilk.backgroundservicertmp <<<
11-29 16:09:55.177 27579-27579/? A/DEBUG: #00 pc 00021920 /data/app/com.codesilk.backgroundservicertmp-1/lib/arm/libyuv.so (TransposeWx8_NEON+47)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #01 pc 00013839 /data/app/com.codesilk.backgroundservicertmp-1/lib/arm/libyuv.so (TransposePlane+108)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #02 pc 00013ee5 /data/app/com.codesilk.backgroundservicertmp-1/lib/arm/libyuv.so (NV12ToI420Rotate+252)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #03 pc 00010659 /data/app/com.codesilk.backgroundservicertmp-1/lib/arm/libyuv.so (ConvertToI420+1710)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #04 pc 00008d99 /data/app/com.codesilk.backgroundservicertmp-1/lib/arm/libenc.so
11-29 16:09:55.177 27579-27579/? A/DEBUG: #05 pc 000087c1 /data/app/com.codesilk.backgroundservicertmp-1/lib/arm/libenc.so
11-29 16:09:55.177 27579-27579/? A/DEBUG: #06 pc 0054b50f /data/app/com.codesilk.backgroundservicertmp-1/oat/arm/base.odex (offset 0x39c000) (byte[] net.ossrs.yasea.SrsEncoder.NV21ToNV12(byte[], int, int, boolean, int)+130)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #07 pc 0054c97d /data/app/com.codesilk.backgroundservicertmp-1/oat/arm/base.odex (offset 0x39c000) (byte[] net.ossrs.yasea.SrsEncoder.hwPortraitYuvFrame(byte[])+440)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #08 pc 0054e1b3 /data/app/com.codesilk.backgroundservicertmp-1/oat/arm/base.odex (offset 0x39c000) (void net.ossrs.yasea.SrsEncoder.onGetYuvFrame(byte[])+630)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #09 pc 006a10cd /data/app/com.codesilk.backgroundservicertmp-1/oat/arm/base.odex (offset 0x39c000) (void net.ossrs.yasea.SrsPublisher$1.onGetYuvFrame(byte[])+792)
11-29 16:09:55.177 27579-27579/? A/DEBUG: #10 pc 0054903d /data/app/com.codesilk.backgroundservicertmp-1/oat/arm/base.odex (offset 0x39c000) (void net.ossrs.yasea.SrsCameraView$1.onPreviewFrame(byte[], android.hardware.Camera)+240)
```
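The stack trace shows the crash inside libyuv's `ConvertToI420`, i.e. the native converter reads past the end of the frame buffer it was handed, which usually means the frame is smaller than the width and height the encoder was configured with. A plausible fix, sketched below using the field names from the snippet above (this is a guess at the cause, not a confirmed yasea fix), is to allocate `mYuvPreviewFrame` to match the actual preview size (NV21 is 12 bits per pixel, so width * height * 3 / 2 bytes) and to register the callback with `setPreviewCallbackWithBuffer`:

```java
// Size the callback buffer to the real preview size so the NV21 -> I420
// conversion in libyuv never reads past the end of the frame.
Camera.Size preview = mCamera.getParameters().getPreviewSize();
mYuvPreviewFrame = new byte[preview.width * preview.height * 3 / 2];
mCamera.addCallbackBuffer(mYuvPreviewFrame);
mCamera.setPreviewCallbackWithBuffer(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] bytes, Camera camera) {
        mPrevCb.onGetYuvFrame(bytes);    // hand the frame to the encoder
        camera.addCallbackBuffer(bytes); // return the buffer for reuse
    }
});
```

The preview size the camera actually delivers also has to match the resolution the encoder expects, otherwise the same overrun can happen on the native side.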
Why use `setPreviewCallback`? Why not use the gpuimage branch directly? I think you had better read the Android SDK manuals before showing me what you did.
I will say it again: I do not think it is worth spending time on this. Have you searched the Internet for the feasibility of your idea?
I have indeed searched, and I saw that people were able to record video in services. Also, I have an app on my phone that can do an RTMP stream in a background service. That is why I am sure it works somehow :)
The app is called BitStream. I open the app, start streaming, and then close all the apps on the phone. The service keeps running (showing a notification in the status bar) and continues to stream. I want to achieve the same thing; the only difference is that I don't want to start the activity, but only start a service at device boot using a BroadcastReceiver for the boot intent.
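For what it's worth, the boot part of that plan is straightforward. Here is a minimal sketch of such a receiver, assuming the manifest requests the `RECEIVE_BOOT_COMPLETED` permission, registers the receiver for `android.intent.action.BOOT_COMPLETED`, and declares a streaming service (here called `StreamService`, a placeholder name):

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// Starts the (hypothetical) StreamService once the device finishes booting.
public class BootReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_BOOT_COMPLETED.equals(intent.getAction())) {
            context.startService(new Intent(context, StreamService.class));
        }
    }
}
```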
All right. But I would like to remind you that YUV processing has been removed from this branch in commit https://github.com/begeekmyfriend/yasea/commit/3700fc7fbe557c6a4c5961ac92b82001eebf52b4 since only the RGBA format is needed. Moreover, the camera preview callback has also been removed in commit https://github.com/begeekmyfriend/yasea/commit/28ec5f2a85c25b7ea1d9cebb408afe7fb4820c03
I see. Well, I finally understood that I cannot go the way I wanted, so what I did instead is start an activity from the background service, where I get the `SurfaceHolder` needed to do the stream.
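For anyone attempting the same workaround: starting an activity from a non-activity context such as a `Service` requires the `FLAG_ACTIVITY_NEW_TASK` flag. A rough sketch, where `StreamActivity` is a placeholder for the activity hosting the `SrsCameraView`:

```java
// Inside the Service: launch the activity that hosts the camera view.
// FLAG_ACTIVITY_NEW_TASK is mandatory when starting an activity from a
// non-activity context.
Intent intent = new Intent(this, StreamActivity.class);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
startActivity(intent);
```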
@laalex Can you please share your working sample code? I am also a web developer and I am figuring out a way to stream video from a background service. I am using the Live Hidden Camera library, which combines yasea and the Android hidden camera library, but it is not working properly, its author is not replying to the issue, and no matter how hard I try I can only stream audio. I get this in my logcat: `01-06 05:28:19.178 6738-6807/? E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010`.
I am trying to stream from a background service. What I currently find hard is to have an `SrsCameraView` displayed somehow. I know I might be able to get the camera contents from a `SurfaceTexture`, but I don't know how to stream from there with your library.
My implementation so far (which doesn't work) is:
Can you please give me a hint, or tell me whether this is possible from a background service?