adelhoss closed this issue 7 years ago
Can I replace my SurfaceView with the camera SurfaceView? Is that possible?
Do you want to stream a surface that is drawn for you? You have two options: 1) Send NV21 frames continuously to the VideoEncoder (buffer-to-buffer encoding mode).
2) Render into the input surface created by the VideoEncoder (surface-to-buffer encoding mode).
The Surface from a SurfaceView is only for visual purposes. Camera1Base with a SurfaceView or TextureView uses buffer to buffer, getting NV21 frames from the Camera1 API callback. Camera2Base with a SurfaceView or TextureView uses surface to buffer, because the Camera2 API renders both the encoder's input surface and the surface from the SurfaceView or TextureView (the latter is only for the preview, which is why you can also stream without a preview in this library if you pass a Context instead of a view to the constructor).
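For reference on the buffer-to-buffer path: NV21 is a 12-bit-per-pixel format, a full-resolution Y plane followed by an interleaved V/U plane at quarter resolution. A minimal plain-Java sketch (independent of this library; the class and method names are mine) of the buffer arithmetic involved when handing camera frames to an encoder:

```java
public class Nv21Layout {
    // NV21: Y plane of width*height bytes, then interleaved V/U pairs
    // covering width*height/2 bytes, i.e. 12 bits per pixel in total.
    public static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Offset where the interleaved VU plane starts inside an NV21 buffer.
    public static int vuPlaneOffset(int width, int height) {
        return width * height;
    }

    public static void main(String[] args) {
        // 640x480 is the default resolution used by the library's prepareVideo().
        System.out.println(frameSize(640, 480));      // 460800
        System.out.println(vuPlaneOffset(640, 480));  // 307200
    }
}
```

Every NV21 byte array delivered by the Camera1 preview callback for a 640x480 frame has exactly this size, which is why a mismatch between the requested preview size and the encoder size corrupts the image.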
Thank you very much. These pictures show my project structure: I have one SurfaceView in an Activity and I want to stream it directly with your library. I also tried your RtspDisplay class to record the whole activity and send the stream to a Wowza server, but the application throws an exception and crashes immediately after it runs. My code only uses your RTSP display example. Picture 1 (rtspdisplay project): http://www.mediafire.com/convkey/02f3/8k5sgjm40jkh63izg.jpg?size_id=6 Picture 2 (project structure): http://www.mediafire.com/convkey/ada6/ub44ir5ib43n9efzg.jpg Link to my code: https://www.mediafire.com/file/klp87dfuc6r2jwd/rtspdisplay.txt Please answer with an Android code example. My goal: send an MJPEG stream from an IP camera to Wowza using your library. Two questions: Q1: Can I stream a SurfaceView directly to the server with your library? Q2: Can I receive an MJPEG stream from an IP camera and restream it to a Wowza server with your library?
How can I send a SurfaceView (the default Android SurfaceView) to Camera1Base, get NV21 frames, and then stream them to the server? Please show an example of getting NV21 frames from a SurfaceView instead of from the camera.
You are right: I do not need the preview. I only want to send the MJPEG received from the IP camera to the server. First, I receive the IP camera stream directly in a SurfaceView; second, I stream that SurfaceView to the server. I know I am making my work harder, but I really need to use your library to stream to the server. Thank you very much.
This is the status of my Wowza server: http://www.mediafire.com/convkey/15a7/p5cls3k86khow16zg.jpg There is no connection of any type of stream.
With display mode you will send the whole phone screen, not just a SurfaceView. Anyway, if you want to test display mode, first use my app to make sure everything is OK, and then post me the log if something crashes or fails.
Your RTSP URL is malformed:
rtsp://http://192.168.1.3:8088/live/android_test
Should be:
rtsp://192.168.1.3:8088/live/android_test
Wowza uses port 1935 by default, not 8088, so that may be wrong too. In that case:
rtsp://192.168.1.3:1935/live/android_test
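A small helper sketching the URL fix above (plain Java; normalizeRtspUrl is a hypothetical name of mine, not part of the library). It strips an accidentally embedded http:// scheme and falls back to Wowza's default port 1935 when no port is given; it assumes the input already starts with rtsp://:

```java
public class RtspUrlFixer {
    // Remove an accidental "http://" pasted after the rtsp scheme and
    // append Wowza's default port 1935 if the URL has no explicit port.
    public static String normalizeRtspUrl(String url) {
        String fixed = url.replace("rtsp://http://", "rtsp://");
        String rest = fixed.substring("rtsp://".length());
        String hostPart = rest.contains("/") ? rest.substring(0, rest.indexOf('/')) : rest;
        if (!hostPart.contains(":")) {
            fixed = "rtsp://" + hostPart + ":1935" + rest.substring(hostPart.length());
        }
        return fixed;
    }

    public static void main(String[] args) {
        System.out.println(normalizeRtspUrl("rtsp://http://192.168.1.3:8088/live/android_test"));
        // -> rtsp://192.168.1.3:8088/live/android_test
        System.out.println(normalizeRtspUrl("rtsp://192.168.1.3/live/android_test"));
        // -> rtsp://192.168.1.3:1935/live/android_test
    }
}
```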
One question: can the URL you open with MjpegView also be opened in the VLC app?
Thanks. I changed the port to 1935, but RTSP display still fails. I will send the Android Studio error report in my next post.
Yes, the IP camera URL http://192.168.43.227/media/?action=stream works and I get the stream without any problem. The second goal is to send this stream to Wowza with your library, but before that I need to record the whole screen and send it to Wowza.
I uploaded the Android logcat from when I used your library for RTSP display: https://www.mediafire.com/file/tmzzopk4hk45dar/android%20logcat.txt I did not add any code to your example, only copy-pasted your code into my project without any change.
I changed the RTSP URL and port, but it still errors and crashes.
It looks like you cancelled the window permission required to record (the dialog that pops up). Replace:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
  if (requestCode == REQUEST_CODE) {
    if (rtspDisplay.prepareAudio() && rtspDisplay.prepareVideo()) {
      if (Build.VERSION.SDK_INT >= 21) {
        rtspDisplay.setAuthorization("adel", "eng123456");
        rtspDisplay.startStream("rtsp://http://192.168.1.3:8088/live/android_test", resultCode, data);
      }
    } else {
      Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT)
          .show();
    }
  }
}
To:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
  if (resultCode != Activity.RESULT_CANCELED) {
    if (requestCode == REQUEST_CODE) {
      if (rtspDisplay.prepareAudio() && rtspDisplay.prepareVideo()) {
        if (Build.VERSION.SDK_INT >= 21) {
          rtspDisplay.setAuthorization("adel", "eng123456");
          rtspDisplay.startStream("rtsp://192.168.1.3:1935/live/android_test", resultCode, data);
        }
      } else {
        Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT)
            .show();
      }
    }
  } else {
    Toast.makeText(this, "Permissions denied, you need to accept permissions to stream", Toast.LENGTH_SHORT).show();
  }
}
If you see that toast, it is because you cancelled the pop-up. If you can't see any pop-up, uninstall the app completely before running again.
No, I allowed the app to record the screen. I tested on my Galaxy A5 phone and on the emulator. I will retest with this code.
I replaced my onActivityResult method with yours, but the crashes did not change. After the popup shows and I click "Start now", the app crashes.
I cleared my phone cache and retested: no change. I uninstalled the app, restarted my phone, and reinstalled it, BUT NO CHANGE: when I click "Start now" the app crashes with the same error I sent in the previous post. I do not see any popup.
Do you get the same error with my app? Did you try another phone?
Please wait while I compile your example and test it.
If my app fails, try with this activity:
package com.pedro.rtmpstreamer;

import android.app.Activity;
import android.content.Intent;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.Toast;

public class Main2Activity extends AppCompatActivity {

  private MediaProjectionManager mediaProjectionManager;
  private SurfaceView surfaceView;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main2);
    mediaProjectionManager = (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
    surfaceView = (SurfaceView) findViewById(R.id.sv);
    surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
      @Override
      public void surfaceCreated(SurfaceHolder surfaceHolder) {
        startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), 179);
      }

      @Override
      public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
      }

      @Override
      public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
      }
    });
  }

  @Override
  public void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == 179 && resultCode == Activity.RESULT_OK) {
      MediaProjection mediaProjection = mediaProjectionManager.getMediaProjection(resultCode, data);
      mediaProjection.createVirtualDisplay("Stream Display", surfaceView.getWidth(),
          surfaceView.getHeight(), 320, 0, surfaceView.getHolder().getSurface(), null, null);
    } else {
      Toast.makeText(this, "Permissions not available", Toast.LENGTH_SHORT).show();
    }
  }
}
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:padding="16dp"
    tools:context="com.pedro.rtmpstreamer.Main2Activity">

  <SurfaceView
      android:id="@+id/sv"
      android:layout_width="match_parent"
      android:layout_height="match_parent" />

</RelativeLayout>
Yes, same error with your app. Step by step: I open your project in Android Studio http://www.mediafire.com/convkey/2cb0/5mw36wt3eidpud1zg.jpg compile and build the APK http://www.mediafire.com/convkey/6d1e/rck6ug1mvyuzk6vzg.jpg run your app in the emulator http://www.mediafire.com/convkey/1284/zcjdheiyvwy1e75zg.jpg click on the RTSP display button http://www.mediafire.com/convkey/22fe/zkn49bmbaudccxezg.jpg then enter my RTSP URL, and the app crashes http://www.mediafire.com/convkey/3b52/ccdvtkmlq1m4bm6zg.jpg
That activity WORKS, but it shows windows inside windows inside windows... Here is a screenshot of this activity: http://www.mediafire.com/convkey/a456/9bqvzf4d1uiazj5zg.jpg Q: Why windows inside windows? I need only one window, which should then be STREAMED TO THE SERVER WITH YOUR LIBRARY.
If that activity works, then I need to know which variable passed to mediaProjection.createVirtualDisplay is null. Can you go to the startStream method and debug that for me?
mediaProjection = mediaProjectionManager.getMediaProjection(resultCode, data);
mediaProjection.createVirtualDisplay("Stream Display", videoEncoder.getWidth(),
videoEncoder.getHeight(), dpi, 0, videoEncoder.getInputSurface(), null, null);
Check whether mediaProjection is null before the second line, and check all variables passed to both lines. The reason for the windows effect is that you draw the whole screen, including the SurfaceView itself, into the surface, which causes a recursive mirror effect.
Yes, sure. I will now check which variable is null. How do I disable the recursive windows effect? Must I disable it first, or will the streamed video contain the effect?
No, because you never draw into a SurfaceView. This stream runs totally in the background, without a preview.
Good. This picture shows the startStream method; the DisplayBase constructor initializes mediaProjection: http://www.mediafire.com/convkey/06a5/i9gnihq9i97saxizg.jpg This picture shows startStream modified for debugging (checked with an if): http://www.mediafire.com/convkey/2fd0/by8nc6tzvp830iizg.jpg And mediaProjection is NULL: http://www.mediafire.com/convkey/2759/bea9cejco7f97bhzg.jpg Finally the app crashes again; even guarding the code with a boolean check, the app crashes again.
Is it possible to use that last working activity with your library, to capture the screen and send it to the server?
Replace the activity in my project with:
package com.pedro.rtmpstreamer.displayexample;
import android.app.Activity;
import android.content.Intent;
import android.os.Build;
import android.os.Bundle;
import android.support.annotation.RequiresApi;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Toast;
import com.pedro.rtplibrary.rtsp.RtspDisplay;
import com.pedro.rtmpstreamer.R;
import com.pedro.rtmpstreamer.constants.Constants;
import com.pedro.rtsp.rtsp.Protocol;
import com.pedro.rtsp.utils.ConnectCheckerRtsp;
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public class DisplayRtspActivity extends AppCompatActivity
implements ConnectCheckerRtsp, View.OnClickListener {
private RtspDisplay rtspDisplay;
private Button button;
private EditText etUrl;
private final int REQUEST_CODE = 179;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
setContentView(R.layout.activity_example);
button = (Button) findViewById(R.id.b_start_stop);
button.setOnClickListener(this);
etUrl = (EditText) findViewById(R.id.et_rtp_url);
etUrl.setHint(R.string.hint_rtsp);
rtspDisplay = new RtspDisplay(this, Protocol.TCP, this);
}
@Override
public void onConnectionSuccessRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(DisplayRtspActivity.this, "Connection success", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onConnectionFailedRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(DisplayRtspActivity.this, "Connection failed", Toast.LENGTH_SHORT).show();
rtspDisplay.stopStream();
button.setText(R.string.start_button);
}
});
}
@Override
public void onDisconnectRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(DisplayRtspActivity.this, "Disconnected", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onAuthErrorRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(DisplayRtspActivity.this, "Auth error", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onAuthSuccessRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(DisplayRtspActivity.this, "Auth success", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == REQUEST_CODE && resultCode == Activity.RESULT_OK) {
if (rtspDisplay.prepareAudio() && rtspDisplay.prepareVideo()) {
if (Build.VERSION.SDK_INT >= 21) {
rtspDisplay.startStream(etUrl.getText().toString(), resultCode, data);
}
} else {
Toast.makeText(this, "Error preparing stream, This device cant do it", Toast.LENGTH_SHORT)
.show();
}
} else {
Toast.makeText(this, "Permissions no available", Toast.LENGTH_SHORT).show();
}
}
@Override
public void onClick(View view) {
if (!rtspDisplay.isStreaming()) {
button.setText(R.string.stop_button);
startActivityForResult(rtspDisplay.sendIntent(), REQUEST_CODE);
} else {
button.setText(R.string.start_button);
rtspDisplay.stopStream();
}
}
}
I will try to emulate the last activity changes and find the error.
After replacing the previous activity with this one, the app no longer crashes, BUT when I touch "Start now" a toast shows "Permissions not available". Pic one: http://www.mediafire.com/convkey/8d6d/n2vn344sq5l4vr6zg.jpg Pic two: http://www.mediafire.com/convkey/8c77/2fhh1lc4eoq4hl3zg.jpg
I tested on the emulator and on my phone; I touch "Start now" but the toast shows "Permissions not available".
I created an emulator with Android Studio: Nexus 5X image, API 24, x86 architecture. The app works. Create that emulator and test. If the app crashes, let me know which variable in the conditional is not correct (requestCode or resultCode) and its value. Also let me know the specification of your emulator and the program you use to emulate.
I use an API 22 emulator, and my phone is a Galaxy A5 2017 with Android 6.0.1. The app does not crash; it only shows "Permissions not available".
When you say the app works for you, do you mean the screen-recorder icon shows in the notification bar?
For me it only shows "Permissions not available" and no icon is added to the notification bar.
I did a clean build first and then built the APK in Android Studio. Now the app does not crash and no longer shows "Permissions not available", but it does nothing: I waited 5 minutes and it never connects to the RTSP server.
Can you see a toast with the text "Connection success" or any other toast? Post me the log, please. I need the device model that you emulate and its architecture to reproduce the error. Anyway, Android can't encode with hardware in an emulator below API 23 or 24 (I'm not sure which), so it will crash there.
Can your library only be used on API 23 or 24? No, I can't see any toast or connection status: http://www.mediafire.com/convkey/dc99/vd9wljs6dlwnkc7zg.jpg There is no error in logcat, nothing happens, and no toast is shown. I tested on Android 6.0.1, and on a Google Nexus emulator with Android 6.0: nothing happens.
These are the emulator settings: http://www.mediafire.com/convkey/899a/0b0qdyq4oj7wipxzg.jpg The app does not crash; it just does nothing.
It does not even show "Auth error".
Please help me send a SurfaceView to the server with your library, because recording the display does not work :( How can I encode a SurfaceView and send the stream to the server? Thanks. I need both ways: RtspDisplay, and encoding a SurfaceView and sending it to the server.
I do not have much time; I present my project in 2 days.
Does your Wowza require credentials? Post me the logcat.
Try streaming with ffmpeg from a PC. Command:
ffmpeg -i pathtomp4file -f rtsp rtsp://ip:port/appname/streamname
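If you want to drive that ffmpeg command from Java for testing, a minimal sketch (the file path and URL below are placeholders of mine; actually running it requires ffmpeg on the PATH):

```java
import java.util.Arrays;
import java.util.List;

public class FfmpegRtspCommand {
    // Build the ffmpeg argument list for pushing a local MP4 file to an
    // RTSP server, mirroring the command above.
    public static List<String> buildCommand(String inputFile, String rtspUrl) {
        return Arrays.asList("ffmpeg", "-i", inputFile, "-f", "rtsp", rtspUrl);
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand("test.mp4", "rtsp://192.168.1.3:1935/live/android_test");
        System.out.println(String.join(" ", cmd));
        // To actually launch it (needs ffmpeg installed):
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```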
Yes, Wowza needs credentials. OK, I will test with a PC and send the logcat; wait 5 minutes.
If you need credentials, use the method to set credentials...
Why does it not show me "Auth error"? OK, now testing with the setAuthorization method to set the user and password...
In your library you use a callback to check the connection result and show a toast message to the user. Why does it not show "Auth error" to me?
It should show you the message; it is working for me. Try with ffmpeg please and tell me the result.
It shows NOTHING for me, the same on the phone and the emulator. Wait, I will send the logcat to you.
I changed my code and added the setAuthorization method; nothing changed: http://www.mediafire.com/convkey/ad3d/8oz95773jaaon7tzg.jpg I tested my Wowza server with ffmpeg and used a player to view my stream. Screenshot of Wowza playing the stream: http://www.mediafire.com/convkey/fa20/5qc8r5y23y28j6vzg.jpg And this is my Android Monitor logcat: https://www.mediafire.com/file/6zns7z6o9i1bvcl/Latest%20LOGCAT.txt WITHOUT ANY ERROR, but no toast is shown and nothing happens.
Here is another modification, using the activity that worked. Remember to change the RTSP URL in the code, and the auth if you need it. Don't use an emulator. DisplayBase class:
package com.pedro.rtplibrary.base;
import android.content.Context;
import android.graphics.ImageFormat;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;
import android.support.annotation.RequiresApi;
import android.view.Surface;
import android.view.SurfaceView;
import com.pedro.encoder.audio.AudioEncoder;
import com.pedro.encoder.audio.GetAacData;
import com.pedro.encoder.input.audio.GetMicrophoneData;
import com.pedro.encoder.input.audio.MicrophoneManager;
import com.pedro.encoder.input.video.GetCameraData;
import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetH264Data;
import com.pedro.encoder.video.VideoEncoder;
import java.io.IOException;
import java.nio.ByteBuffer;
/**
* Created by pedro on 9/08/17.
*/
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
public abstract class DisplayBase
implements GetAacData, GetCameraData, GetH264Data, GetMicrophoneData {
protected Context context;
protected VideoEncoder videoEncoder;
protected MicrophoneManager microphoneManager;
protected AudioEncoder audioEncoder;
private boolean streaming;
protected SurfaceView surfaceView;
private boolean videoEnabled = true;
//record
private MediaMuxer mediaMuxer;
private int videoTrack = -1;
private int audioTrack = -1;
private boolean recording = false;
private boolean canRecord = false;
private MediaFormat videoFormat;
private MediaFormat audioFormat;
private int dpi = 320;
public DisplayBase(Context context) {
this.context = context;
this.surfaceView = null;
videoEncoder = new VideoEncoder(this);
microphoneManager = new MicrophoneManager(this);
audioEncoder = new AudioEncoder(this);
streaming = false;
}
public abstract void setAuthorization(String user, String password);
public boolean prepareVideo(int width, int height, int fps, int bitrate, boolean hardwareRotation,
int rotation, int dpi) {
this.dpi = dpi;
int imageFormat = ImageFormat.NV21; //supported nv21 and yv12
videoEncoder.setImageFormat(imageFormat);
boolean result =
videoEncoder.prepareVideoEncoder(width, height, fps, bitrate, rotation, hardwareRotation,
FormatVideoEncoder.SURFACE);
return result;
}
protected abstract void prepareAudioRtp(boolean isStereo, int sampleRate);
public boolean prepareAudio(int bitrate, int sampleRate, boolean isStereo, boolean echoCanceler,
boolean noiseSuppressor) {
microphoneManager.createMicrophone(sampleRate, isStereo, echoCanceler, noiseSuppressor);
prepareAudioRtp(isStereo, sampleRate);
return audioEncoder.prepareAudioEncoder(bitrate, sampleRate, isStereo);
}
public boolean prepareVideo() {
return videoEncoder.prepareVideoEncoder(640, 480, 30, 1200 * 1024, 0, true,
FormatVideoEncoder.SURFACE);
}
public abstract boolean prepareAudio();
/* Must be called while streaming */
public void startRecord(String path) throws IOException {
if (streaming) {
mediaMuxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
videoTrack = mediaMuxer.addTrack(videoFormat);
audioTrack = mediaMuxer.addTrack(audioFormat);
mediaMuxer.start();
recording = true;
} else {
throw new IOException("Need be called while stream");
}
}
public void stopRecord() {
recording = false;
canRecord = false;
if (mediaMuxer != null) {
mediaMuxer.stop();
mediaMuxer.release();
mediaMuxer = null;
}
videoTrack = -1;
audioTrack = -1;
}
protected abstract void startStreamRtp(String url);
public void startStream(String url) {
videoEncoder.start();
audioEncoder.start();
microphoneManager.start();
streaming = true;
startStreamRtp(url);
}
public int getStreamWidth() {
return videoEncoder.getWidth();
}
public int getStreamHeight() {
return videoEncoder.getHeight();
}
public Surface getStreamSurface() {
return videoEncoder.getInputSurface();
}
protected abstract void stopStreamRtp();
public void stopStream() {
microphoneManager.stop();
stopStreamRtp();
videoEncoder.stop();
audioEncoder.stop();
streaming = false;
}
public void disableAudio() {
microphoneManager.mute();
}
public void enableAudio() {
microphoneManager.unMute();
}
public boolean isAudioMuted() {
return microphoneManager.isMuted();
}
public boolean isVideoEnabled() {
return videoEnabled;
}
public void disableVideo() {
videoEncoder.startSendBlackImage();
videoEnabled = false;
}
public void enableVideo() {
videoEncoder.stopSendBlackImage();
videoEnabled = true;
}
/** need min API 19 */
public void setVideoBitrateOnFly(int bitrate) {
if (Build.VERSION.SDK_INT >= 19) {
videoEncoder.setVideoBitrateOnFly(bitrate);
}
}
public boolean isStreaming() {
return streaming;
}
protected abstract void getAacDataRtp(ByteBuffer aacBuffer, MediaCodec.BufferInfo info);
@Override
public void getAacData(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
if (recording && canRecord) {
mediaMuxer.writeSampleData(audioTrack, aacBuffer, info);
}
getAacDataRtp(aacBuffer, info);
}
protected abstract void onSPSandPPSRtp(ByteBuffer sps, ByteBuffer pps);
@Override
public void onSPSandPPS(ByteBuffer sps, ByteBuffer pps) {
onSPSandPPSRtp(sps, pps);
}
protected abstract void getH264DataRtp(ByteBuffer h264Buffer, MediaCodec.BufferInfo info);
@Override
public void getH264Data(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
if (recording) {
if (info.flags == MediaCodec.BUFFER_FLAG_KEY_FRAME) canRecord = true;
if (canRecord) {
mediaMuxer.writeSampleData(videoTrack, h264Buffer, info);
}
}
getH264DataRtp(h264Buffer, info);
}
@Override
public void inputPcmData(byte[] buffer, int size) {
audioEncoder.inputPcmData(buffer, size);
}
@Override
public void inputYv12Data(byte[] buffer) {
videoEncoder.inputYv12Data(buffer);
}
@Override
public void inputNv21Data(byte[] buffer) {
videoEncoder.inputNv21Data(buffer);
}
@Override
public void onVideoFormat(MediaFormat mediaFormat) {
videoFormat = mediaFormat;
}
@Override
public void onAudioFormat(MediaFormat mediaFormat) {
audioFormat = mediaFormat;
}
}
Main2Activity class:
package com.pedro.rtmpstreamer;
import android.app.Activity;
import android.content.Intent;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.widget.Toast;
import com.pedro.rtplibrary.rtsp.RtspDisplay;
import com.pedro.rtsp.rtsp.Protocol;
import com.pedro.rtsp.utils.ConnectCheckerRtsp;
public class Main2Activity extends AppCompatActivity implements ConnectCheckerRtsp {
private MediaProjectionManager mediaProjectionManager;
private SurfaceView surfaceView;
private RtspDisplay rtspDisplay;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main2);
rtspDisplay = new RtspDisplay(this, Protocol.TCP, this);
mediaProjectionManager = (MediaProjectionManager) getSystemService(MEDIA_PROJECTION_SERVICE);
surfaceView = (SurfaceView) findViewById(R.id.sv);
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
@Override
public void surfaceCreated(SurfaceHolder surfaceHolder) {
startActivityForResult(mediaProjectionManager.createScreenCaptureIntent(), 179);
}
@Override
public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
}
@Override
public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
}
});
}
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
if (requestCode == 179 && resultCode == Activity.RESULT_OK) {
if (rtspDisplay.prepareAudio() && rtspDisplay.prepareVideo()) {
rtspDisplay.startStream("writeyourserverrtsp");
int width = rtspDisplay.getStreamWidth();
int height = rtspDisplay.getStreamHeight();
Log.e("Pedro", "size: " + width + "X" + height);
Surface surface = rtspDisplay.getStreamSurface();
Log.e("Pedro", "valid surface?? " + (surface != null));
MediaProjection mediaProjection =
mediaProjectionManager.getMediaProjection(resultCode, data);
mediaProjection.createVirtualDisplay("Stream Display", width,
height, 320, 0, surface, null, null);
}
} else {
Toast.makeText(this, "Permissions no available", Toast.LENGTH_SHORT).show();
}
}
@Override
public void onConnectionSuccessRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Main2Activity.this, "Connection success", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onConnectionFailedRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Main2Activity.this, "Connection failed", Toast.LENGTH_SHORT).show();
rtspDisplay.stopStream();
}
});
}
@Override
public void onDisconnectRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Main2Activity.this, "Disconnected", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onAuthErrorRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Main2Activity.this, "Auth error", Toast.LENGTH_SHORT).show();
}
});
}
@Override
public void onAuthSuccessRtsp() {
runOnUiThread(new Runnable() {
@Override
public void run() {
Toast.makeText(Main2Activity.this, "Auth success", Toast.LENGTH_SHORT).show();
}
});
}
}
I will test now and send you the result.
I modified the DisplayBase class and found two errors: 1) a comma instead of the && operator between the audio and video checks in onActivityResult; 2) the callbacks were overridden with Rtmp names while the class implements the Rtsp interface. After fixing those I ran the program; when I touch START NOW the app crashes. Logcat of the error: https://www.mediafire.com/file/0e7b3dftma0degd/lcat.txt
Hi, thank you for creating this powerful streaming library. How can I stream to a Wowza server directly from a SurfaceView? I created a project that gets a video stream from hardware and shows it on a SurfaceView; now I need to stream it to a server. Your library is very good, but it only supports camera, file, and display sources.