Closed DanielnetoDotCom closed 3 years ago
I need more information to figure this out.
Thanks for your support. I am running a lot of tests; let me give you more details.
If I use the WebRTC player from a desktop browser it works perfectly, but from mobile I get the same failing result.
Tested on Chrome and Safari on iOS, and Chrome on Android.
Are you trying to stream over WebRTC from iPhone?
Yes; I have also tried from Android and got the same result.
Tested from https://demo.ovenplayer.com/demo_input.html?
No, I downloaded OvenPlayer and I am hosting it myself.
Are you trying to get input over WebRTC and push to another server with RTMP?
Correct
What does crash mean? Is the server dead? Or is playback interrupted?
Only the RTMPPush. When I push, it connects to the Nginx server, transmits for about 5 seconds, then disconnects. I am pretty sure it is something in the OME ffmpeg command.
If the server dies, compile in debug mode and reproduce the crash, and then provide us with a crash dump to solve the problem quickly. (It will exist as crash_20210721.dump in the directory where the ome binary is located.)
It does not die; only the RTMPPush is interrupted.
What does the pipeline look like: iPhone -> OME -> (???)? And if you play directly from OME over HLS or WebRTC, does it work?
Yes, it does work
What version of OME are you using?
Docker latest
Upload full log. If you truncate the log and upload it, I can't figure out the problem.
The truncated log is on this query https://github.com/AirenSoft/OvenMediaEngine/issues/446#issue-949033267
Upload Server.xml.
<?xml version="1.0" encoding="UTF-8"?>
<Server version="8">
<Name>OvenMediaEngine</Name>
<!-- Host type (origin/edge) -->
<Type>origin</Type>
<!-- Specify IP address to bind (* means all IPs) -->
<IP>*</IP>
<!--
To get the public IP address (the STUN-mapped address) of the local server.
This is useful when OME cannot obtain a public IP from an interface, such as in AWS or Docker environments.
If this is successful, you can use ${PublicIP} in your settings.
-->
<StunServer>stun.l.google.com:19302</StunServer>
<!-- Settings for the ports to bind -->
<Bind>
<!-- Enable this configuration if you want to use API Server -->
<Managers>
<API>
<Port>${env:OME_API_PORT:48081}</Port>
<WorkerCount>1</WorkerCount>
</API>
</Managers>
<Providers>
<RTMP>
<Port>${env:OME_RTMP_PROV_PORT:41935}</Port>
</RTMP>
<SRT>
<Port>${env:OME_SRT_PROV_PORT:9999}</Port>
</SRT>
<MPEGTS>
<Port>${env:OME_MPEGTS_PROV_PORT:4000-4003,4004,4005/udp}</Port>
</MPEGTS>
<WebRTC>
<Signalling>
<Port>${env:OME_SIGNALLING_PORT:3333}</Port>
<!-- If you want to use TLS, specify the TLS port -->
<TLSPort>${env:OME_SOCKET_PORT:3334}</TLSPort>
</Signalling>
<IceCandidates>
<TcpRelay>${env:OME_TCP_RELAY_ADDRESS:*:3478}</TcpRelay>
<IceCandidate>${env:OME_ICE_CANDIDATES:*:10006-10010/udp}</IceCandidate>
</IceCandidates>
</WebRTC>
</Providers>
<Publishers>
<!-- OVT is the protocol used between ORIGIN and EDGE -->
<OVT>
<Port>${env:OME_ORIGIN_PORT:9000}</Port>
</OVT>
<HLS>
<Port>${env:OME_HLS_STREAM_PORT:8080}</Port>
<!-- If you want to use TLS, specify the TLS port -->
<TLSPort>${env:OME_STREAM_PORT_TLS:8443}</TLSPort>
</HLS>
<DASH>
<Port>${env:OME_DASH_STREAM_PORT:8080}</Port>
<!-- If you want to use TLS, specify the TLS port -->
<TLSPort>${env:OME_STREAM_PORT_TLS:8443}</TLSPort>
</DASH>
<WebRTC>
<Signalling>
<Port>${env:OME_SIGNALLING_PORT:3333}</Port>
<!-- If you want to use TLS, specify the TLS port -->
<TLSPort>${env:OME_SOCKET_PORT:3334}</TLSPort>
</Signalling>
<IceCandidates>
<TcpRelay>${env:OME_TCP_RELAY_ADDRESS:*:3478}</TcpRelay>
<IceCandidate>${env:OME_ICE_CANDIDATES:*:10006-10010/udp}</IceCandidate>
</IceCandidates>
</WebRTC>
</Publishers>
</Bind>
<!-- P2P works only in WebRTC -->
<!--
<P2P>
<MaxClientPeersPerHostPeer>2</MaxClientPeersPerHostPeer>
</P2P>
-->
<!--
Enable this configuration if you want to use API Server
<AccessToken> is a token for authentication, and when you invoke the API, you must put "Basic base64encode(<AccessToken>)" in the "Authorization" header of HTTP request.
For example, if you set <AccessToken> to "ome-access-token", you must set "Basic b21lLWFjY2Vzcy10b2tlbg==" in the "Authorization" header.
-->
<!--
<Managers>
<Host>
<Names>
<Name>*</Name>
</Names>
<TLS>
<CertPath>path/to/file.crt</CertPath>
<KeyPath>path/to/file.key</KeyPath>
<ChainCertPath>path/to/file.crt</ChainCertPath>
</TLS>
</Host>
<API>
<AccessToken>ome-access-token</AccessToken>
</API>
</Managers>
-->
<VirtualHosts>
<!-- You can use wildcard like this to include multiple XMLs -->
<VirtualHost include="VHost*.xml" />
<VirtualHost>
<Name>default</Name>
<!-- Settings for multi ip/domain and TLS -->
<Host>
<Names>
<Name>{ServerHost}</Name>
</Names>
<TLS>
<CertPath>/cert/CertPath.pem</CertPath>
<KeyPath>/cert/KeyPath.pem</KeyPath>
<ChainCertPath>/cert/ChainCertPath.pem</ChainCertPath>
</TLS>
</Host>
<!--
Refer https://airensoft.gitbook.io/ovenmediaengine/signedpolicy
<SignedPolicy>
<PolicyQueryKeyName>policy</PolicyQueryKeyName>
<SignatureQueryKeyName>signature</SignatureQueryKeyName>
<SecretKey>aKq#1kj</SecretKey>
<Enables>
<Providers>rtmp,webrtc,srt</Providers>
<Publishers>webrtc,hls,dash,lldash</Publishers>
</Enables>
</SignedPolicy>
-->
<!--
<AdmissionWebhooks>
<TargetUrl></TargetUrl>
<SecretKey></SecretKey>
<Timeout>3000</Timeout>
<Enables>
<Providers>rtmp,webrtc,srt</Providers>
<Publishers>webrtc,hls,dash,lldash</Publishers>
</Enables>
</AdmissionWebhooks>
-->
<!--
<Origins>
<Origin>
<Location>/app/stream</Location>
<Pass>
<Scheme>ovt</Scheme>
<Urls><Url>origin.com:9000/app/stream_720p</Url></Urls>
</Pass>
</Origin>
<Origin>
<Location>/app/</Location>
<Pass>
<Scheme>ovt</Scheme>
<Urls><Url>origin.com:9000/app/</Url></Urls>
</Pass>
</Origin>
<Origin>
<Location>/edge/</Location>
<Pass>
<Scheme>ovt</Scheme>
<Urls><Url>origin.com:9000/app/</Url></Urls>
</Pass>
</Origin>
</Origins>
-->
<!-- Settings for applications -->
<Applications>
<Application>
<Name>app</Name>
<!-- Application type (live/vod) -->
<Type>live</Type>
<OutputProfiles>
<!-- Enable this configuration if you want hardware acceleration using a GPU -->
<!--
<HardwareAcceleration>false</HardwareAcceleration>
-->
<OutputProfile>
<Name>bypass_stream</Name>
<OutputStreamName>${OriginStreamName}</OutputStreamName>
<Encodes>
<Audio>
<Bypass>false</Bypass>
<Codec>aac</Codec>
<Bitrate>128000</Bitrate>
<Samplerate>48000</Samplerate>
<Channel>2</Channel>
</Audio>
<Video>
<Bypass>true</Bypass>
<Codec>vp8</Codec>
<Width>1280</Width>
<Height>720</Height>
<Bitrate>2000000</Bitrate>
<Framerate>30.0</Framerate>
</Video>
<Audio>
<Codec>opus</Codec>
<Bitrate>128000</Bitrate>
<Samplerate>48000</Samplerate>
<Channel>2</Channel>
</Audio>
<!--
<Video>
<Codec>vp8</Codec>
<Bitrate>1024000</Bitrate>
<Framerate>30</Framerate>
<Width>1280</Width>
<Height>720</Height>
</Video>
-->
</Encodes>
</OutputProfile>
</OutputProfiles>
<Providers>
<OVT />
<WebRTC />
<RTMP />
<SRT />
<MPEGTS>
<StreamMap>
<!--
Set the stream name of the client connected to the port to "stream_${Port}"
For example, if a client connects to port 4000, OME creates a "stream_4000" stream
-->
<Stream>
<Name>stream_${Port}</Name>
<Port>4000,4001-4004</Port>
</Stream>
<Stream>
<Name>stream_4005</Name>
<Port>4005</Port>
</Stream>
</StreamMap>
</MPEGTS>
<RTSPPull />
<WebRTC>
<Timeout>30000</Timeout>
</WebRTC>
</Providers>
<Publishers>
<SessionLoadBalancingThreadCount>8</SessionLoadBalancingThreadCount>
<OVT />
<WebRTC>
<Timeout>30000</Timeout>
<Rtx>true</Rtx>
<Ulpfec>true</Ulpfec>
</WebRTC>
<HLS>
<SegmentDuration>5</SegmentDuration>
<SegmentCount>3</SegmentCount>
<CrossDomains>
<Url>*</Url>
</CrossDomains>
</HLS>
<DASH>
<SegmentDuration>5</SegmentDuration>
<SegmentCount>3</SegmentCount>
<CrossDomains>
<Url>*</Url>
</CrossDomains>
<!--
Enable DASH player to obtain UTCTiming from OME using /time?iso&ms API
-->
<UTCTiming>
<Scheme>urn:mpeg:dash:utc:http-xsdate:2014</Scheme>
<Value>/time?iso&amp;ms</Value>
</UTCTiming>
</DASH>
<LLDASH>
<SegmentDuration>5</SegmentDuration>
<CrossDomains>
<Url>*</Url>
</CrossDomains>
<!--
Use default options for UTCTiming
- scheme: urn:mpeg:dash:utc:http-xsdate:2014
- value: /time?iso&ms
-->
<UTCTiming />
</LLDASH>
<RTMPPush></RTMPPush>
</Publishers>
</Application>
</Applications>
</VirtualHost>
</VirtualHosts>
<Managers>
<Host>
<Names>
<Name>*</Name>
</Names>
<!--
If you want to set up TLS, set it up by referring to the following:
<TLS>
<CertPath>airensoft_com.crt</CertPath>
<KeyPath>airensoft_com.key</KeyPath>
<ChainCertPath>airensoft_com_chain.crt</ChainCertPath>
</TLS>
-->
</Host>
<API>
<AccessToken>{AccessToken}</AccessToken>
</API>
</Managers>
</Server>
Is the server's performance or network performance fast enough?
Yes, it is a 500 Mbps server.
Another piece of information: with WebRTC from a PC the audio is awesome, but with WebRTC from mobile the audio is unlistenable; it sounds like a robot.
I am not sure if there is something I can do about it.
This may be due to the network speed of your mobile.
Here is a demo service that allows you to test WebRTC broadcast and playback at the same time. It is an AWS instance installed in the Seoul region, and it works great on iPhone, Android, and PC. Test it there and let us know the results.
Thanks for your response @getroot
WebRTC ingest and WebRTC playback work fine.
The problem is when I use WebRTC ingest and consume the video using HLS or RTMP pull; the audio is not good when I use a mobile device.
Do you have an HLS link from your page so I can test it?
getroot@airensoft.com
I'll test the WebRTC input/HLS output with the same settings as yours.
I am preparing a server that you can use just in case
I just sent you an email with credentials you can use to test; feel free to install/uninstall anything you want.
Unbelievable: I just set up a new server for you and everything is working as expected. The old server is even better than this one (more CPU, more bandwidth, and SSD drives), but the new server has only OME installed; maybe it has something to do with the other apps, I do not know what to say. I will run more tests. Thanks for your support.
If so, it is possible that the AAC codec is installed incorrectly. Anyway, congratulations; I wish your project success!
If you send us your use case after the project is complete (or when you're not busy), and you allow us to write an article about it on our blog, that would be a big contribution to us.
Hi,
Good morning, and thanks for your support.
I am using Docker; how can I make sure the AAC codec is installed correctly? Should I check this on the host server?
Regarding the use case, it will be a pleasure. For your information, I am adapting your server for use in this project: https://github.com/WWBN/AVideo
It is 90% complete, and I will use your server to feed our live stream service. I have thousands of websites that will run your server.
Regarding the adaptation, I will ask some questions (I can open one issue for each if you prefer).
I am currently using a restream tool I developed. I had some problems with the pushRTMP, which you said was in beta, but that was on the old server, so I am not sure whether it will happen on the new one. The restream tool seems more stable, but I guess it will increase latency, so pushRTMP is still an option for me. I just need a way to check whether the pushRTMP is active or not; the API does not provide that information. If I list the pushes they show as active even when they are not. If I could detect a problem with the pushRTMP, I could fall back to the restream feature.
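As a stopgap I am polling the push list and treating anything that does not report an actively-pushing state as dead. This is only a sketch: the `response`, `id`, and `state` fields are assumptions based on the OME REST API docs and should be verified against the actual response on your version.

```python
import json

# Hypothetical helper: decide from an OME push-list API response whether a
# given push is really running. The "response", "id", and "state" fields are
# assumptions based on the REST API docs, not verified against this setup.
def is_push_active(push_list_json: str, push_id: str) -> bool:
    pushes = json.loads(push_list_json).get("response", [])
    for push in pushes:
        if push.get("id") == push_id:
            # Anything other than an actively-pushing state counts as down,
            # so the caller can fall back to the restream tool.
            return push.get("state") in ("pushing", "started")
    return False

# Example response shaped the way the docs suggest:
sample = json.dumps({"response": [{"id": "push_01", "state": "stopped"}]})
print(is_push_active(sample, "push_01"))  # False -> fall back to restream
```

If the API really does report stale pushes as active, this check alone will not help; in that case comparing the push's byte/time counters between two polls may be a better liveness signal.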
Currently I do not need all the Providers and Publishers; I guess disabling them in Server.xml will save some resources. So my idea is:
For providers, I will only need/enable WebRTC, and I will only use the TLS port (3334). If I use only TLS, do I need the regular port (3333) open, or can I disable that too?
For publishers, I need something an external FFmpeg can read, so I guess I can use HLS, DASH, or LLDASH. Of those, which is the fastest/best?
I am not sure whether FFmpeg will be able to read the WebRTC socket and then restream it; do you have any idea if that is possible?
I have a good environment for testing, and it would be a pleasure to give you access to it if you want; the environment has the AVideo installation + Nginx RTMP servers + OME.
Again, congratulations; your project is awesome!
I am using Docker; how can I make sure the AAC codec is installed correctly? Should I check this on the host server?
I'm not sure about this either; at least I've never experienced it. OME uses FDKAAC_VERSION=0.1.5.
I had some problems with the pushRTMP
If there is a problem with RTMP push on the new server, please create a new issue.
I will also only use the TLS port (3334); if I use only TLS, do I need the regular port (3333) open, or can I disable that too?
You can close the regular port 3333.
Of those, which is the fastest/best?
HLS is the most reliable. FFmpeg cannot read WebRTC from OME.
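For reference, pulling OME's HLS output into an external FFmpeg and restreaming it would look roughly like this. This is a sketch, not a verified command for this setup: the hostnames, app/stream names, and stream key are placeholders, and the `/{app}/{stream}/playlist.m3u8` path follows OME's usual HLS URL layout, which you should confirm for your version.

```shell
# Pull OME's HLS output (TLS port 8443 from the Server.xml above) and
# re-push it over RTMP without re-encoding. All hosts/names are placeholders.
ffmpeg -re \
  -i "https://your-ome-host:8443/app/stream/playlist.m3u8" \
  -c copy \
  -f flv "rtmp://your-nginx-host/live/your-stream-key"
```

`-c copy` avoids a second transcode, but note that HLS adds latency on the order of SegmentDuration × SegmentCount (roughly 15 s with the 5 s × 3 settings above), which is the trade-off for its reliability.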
I have a good environment for testing, and it would be a pleasure to give you access to it if you want; the environment has the AVideo installation + Nginx RTMP servers + OME.
Thank you for providing a test environment. I'll ask you later when I need it.
This issue has been closed since it has been inactive for quite some time. If you want to continue discussing this issue, please feel free to reopen it.
Hi,
Whatever configuration I try, WebRTC from an iPhone is unstable (not tested on Android yet).
The live stream works for a few seconds, then crashes.
Here are my logs; I guess it has something to do with:
Application provided invalid, non monotonically increasing dts to muxer in stream
Any help is appreciated.
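For context, that FFmpeg message means the timestamps handed to the muxer go backwards (or repeat) instead of strictly increasing, which makes the muxer reject packets and drop the stream; a jittery WebRTC source clock can cause this. Conceptually, a monotonic-DTS guard looks like the following illustrative sketch (not OME's actual code):

```python
# Illustrative sketch: keep a stream's DTS strictly increasing before muxing.
# Muxers such as FFmpeg's flv muxer reject a packet whose DTS is <= the
# previous packet's DTS, which is exactly what the log line reports.
def enforce_monotonic_dts(dts_values, min_step=1):
    fixed = []
    last = None
    for dts in dts_values:
        if last is not None and dts <= last:
            # Clamp a backwards/repeated timestamp just past the previous one.
            dts = last + min_step
        fixed.append(dts)
        last = dts
    return fixed

# A WebRTC source with clock jitter can produce a sequence like this:
print(enforce_monotonic_dts([0, 33, 66, 50, 99]))  # [0, 33, 66, 67, 99]
```

This does not fix the root cause (the source timestamps themselves); it only shows what "non monotonically increasing dts" means, which may help when reading the full log.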