sonroyaalmerol / m3u-stream-merger-proxy

A lightweight HTTP proxy server dockerized for consolidating and streaming content from multiple IPTV M3U playlists, acting as a load balancer between provided sources.
https://hub.docker.com/r/sonroyaalmerol/m3u-stream-merger-proxy

Using Development build - Encountered this error. #32

Open shorty789 opened 3 months ago

shorty789 commented 3 months ago

One thing that looks abnormal to me is that the 4th line from the bottom is much longer than all the others.

After encountering this issue, playback completely stopped until I deleted the data and recreated the container.

2024/03/11 20:56:40 Error copying MP4 stream to response: read tcp :49528->:8080: use of closed network connection
2024/03/11 20:56:40 Closed connection for :43244
2024/03/11 20:57:07 Received request from :41782 for URL: /stream/VVNBIC0gSUdOIFRWIEhE.mp4
2024/03/11 20:57:07 Current concurrent connections for M3U_1: 1
2024/03/11 20:57:07 Concurrency limit reached (1): http://
2024/03/11 20:57:07 Current concurrent connections for M3U_2: 0
2024/03/11 20:57:32 Received request from :43790 for URL: /stream/VVNBIC0gSUdOIFRWIEhE.mp4
2024/03/11 20:57:32 Current concurrent connections for M3U_1: 1
2024/03/11 20:57:32 Concurrency limit reached (1): http://
2024/03/11 20:57:32 Current concurrent connections for M3U_2: 0
2024/03/11 20:57:37 Error fetching MP4 stream (concurrency check mode): Get "http:///offline.ts": dial tcp :80: i/o timeout
2024/03/11 20:57:37 Current concurrent connections for M3U_3: 0
2024/03/11 20:58:07 Error fetching MP4 stream (concurrency check mode): Get "http:///offline.ts": dial tcp :80: i/o timeout
2024/03/11 20:58:07 Current concurrent connections for M3U_4: 0
2024/03/11 20:58:08 Proxying :43790 to http://
2024/03/11 20:58:08 Sent MP4 stream to :43790
2024/03/11 20:58:08 Current concurrent connections for M3U_2: 1
2024/03/11 20:58:08 Client disconnected after fetching MP4 stream
2024/03/11 20:58:08 Error copying MP4 stream to response: write tcp :8080->:43790: write: connection reset by peer
2024/03/11 20:58:08 Closed connection for :43790
2024/03/11 20:58:08 Current concurrent connections for M3U_2: 0
2024/03/11 20:58:33 Received request from :56386 for URL: /stream/RCsgKFVLKSBFdmVudHMgNTk6IEJpYXRobG9uIHwgU29sZGllciBIb2xsb3cgfCBXb21lbmBzIFB1cnN1aXQgfCBXb3JsZCBDdXAgfCBTdW4gMTAgTWFyIDE2OjQ1.mp4
2024/03/11 20:58:33 Current concurrent connections for M3U_1: 1
2024/03/11 20:58:33 Concurrency limit reached (1): http://
2024/03/11 20:58:33 Current concurrent connections for M3U_4: 0

sonroyaalmerol commented 3 months ago

The long mp4 string suggests a parsing error. Does the problem persist even after recreating the container? Does it only happen on specific channels?

shorty789 commented 3 months ago

It seems OK again since deleting the data and recreating the container. It had been running since Saturday and at some points had streamed smoothly for 8+ hours. There hasn't been a specific pattern regarding channels either.

sonroyaalmerol commented 3 months ago

I can't seem to recreate this issue. I'll leave this one open; let me know if it happens again. I can't really fix an issue without knowing what exactly is triggering it.

shorty789 commented 3 months ago

No problem, I will do. When it happened I was watching a stream with no problems at all, then it went to a black screen. It was after attempting to load the channel back up that it tried to load the long string.

I have just encountered another problem that I am planning to try and capture a bit better after a rebuild and further testing. I don't know if the two may be related, but I was watching a rather choppy stream that kept disconnecting, and after a while I noticed the same stream was apparently occupying 3/4 of my playlists. I think the concurrency counter doesn't necessarily clear up after disconnecting in such a manner; the client just tries to connect again and ends up moving over to a different playlist. I will let you know when I have some more info.
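For reference, my understanding is that the usual way to avoid that kind of leak in Go is to tie the decrement to the request itself with a defer, so it runs no matter how the stream ends. A rough sketch of what I mean (all the names here are made up; this is not this project's actual code):

```go
package proxy

import (
	"io"
	"net/http"
	"sync/atomic"
)

// Rough sketch only, with hypothetical names: the per-playlist connection
// count is released by a defer tied to the request lifetime, so a client
// that drops mid-stream cannot leave the counter stuck at its old value.
var m3uConnections int64

func streamHandler(client *http.Client, upstreamURL string) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		atomic.AddInt64(&m3uConnections, 1)
		// Runs on every exit path: upstream errors, client disconnects, normal end.
		defer atomic.AddInt64(&m3uConnections, -1)

		resp, err := client.Get(upstreamURL)
		if err != nil {
			http.Error(w, "upstream unavailable", http.StatusBadGateway)
			return
		}
		defer resp.Body.Close()

		// io.Copy returns as soon as either side drops the connection.
		_, _ = io.Copy(w, resp.Body)
	}
}
```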

sonroyaalmerol commented 3 months ago

I added a BUFFER_MB env var in both the dev and latest releases. You may try playing around with that for choppy streams. I would suggest using low buffer sizes (maybe 1 or 2).
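A simplified sketch of what a megabyte-sized buffer typically means for a stream copy (the helper below is hypothetical, not the exact code in the repo; only the BUFFER_MB name comes from the release notes):

```go
package proxy

import (
	"io"
	"os"
	"strconv"
)

// Hypothetical helper, not the repo's actual implementation: a BUFFER_MB-style
// setting typically just sizes the chunk used when relaying the upstream
// stream to the client.
func copyWithEnvBuffer(dst io.Writer, src io.Reader) (int64, error) {
	mb, err := strconv.Atoi(os.Getenv("BUFFER_MB"))
	if err != nil || mb <= 0 {
		mb = 1 // fall back to a small buffer, as suggested above
	}
	buf := make([]byte, mb*1024*1024)
	return io.CopyBuffer(dst, src, buf)
}
```

Smaller buffers hand data to the player sooner at the cost of more read/write calls, which is why 1 or 2 MB is a sensible starting point.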

sonroyaalmerol commented 3 months ago

I might've found the issue. I'm guessing both issues might have something to do with connections never timing out even when they're not receiving any response or data. Apparently, this is the default behavior of Go's HTTP client.

I've specified a timeout for the HTTP connections, so they should disconnect and the concurrency counter should decrease as soon as that time is reached. The fix should be on the dev tag.
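For context, Go's zero-value http.Client never times out on its own; the change is essentially along these lines (simplified, and the duration shown is only illustrative):

```go
package proxy

import (
	"net/http"
	"time"
)

// Simplified illustration of the change described above. A zero Timeout (the
// default) means a request never times out on its own; setting one aborts
// the whole request, including the body read, once the duration elapses,
// which lets the stream loop exit and the concurrency counter drop.
var upstreamClient = &http.Client{
	Timeout: 10 * time.Second, // illustrative value
}
```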

Let me know if the problem persists.

shorty789 commented 3 months ago

Seems like it's just throwing me off after 10 seconds at the moment.

shorty789 commented 3 months ago

Would it be worth attempting to reconnect when it fails initially to try and keep the stream going?

sonroyaalmerol commented 3 months ago

Seems like it's just throwing me off after 10 seconds at the moment.

My bad. It seems like it forcibly stops long-lived connections this way. I'm reverting it for now until I find a better solution. It should be deployed back to dev now.
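For the record, the reason that approach breaks long streams is that http.Client.Timeout covers the entire request, including the time spent reading the body. One possible direction (just a sketch, values illustrative, not what's in the repo) is to bound only the phases that should finish quickly and leave the body read open-ended:

```go
package proxy

import (
	"net"
	"net/http"
	"time"
)

// Sketch only: bound dialing, TLS, and waiting for response headers, but set
// no overall Client.Timeout, so an active stream can keep running for hours.
var streamClient = &http.Client{
	Transport: &http.Transport{
		DialContext: (&net.Dialer{
			Timeout: 10 * time.Second, // give up quickly on dead upstreams
		}).DialContext,
		TLSHandshakeTimeout:   10 * time.Second,
		ResponseHeaderTimeout: 15 * time.Second, // upstream must start replying
	},
}
```

A transport like this still wouldn't notice a stream that stalls mid-body, which is the harder part of the original problem.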

The buffer feature might help. You can try that for now.

sonroyaalmerol commented 3 months ago

I've implemented a very experimental fix in the :dev build.

Let me know how it goes. In case the problem gets worse, you can revert back to the :latest build.