mhdzumair / MediaFusion

Universal Add-On for Stremio & Kodi
https://mhdzumair.github.io/MediaFusion/
MIT License
349 stars · 41 forks

Public Instance: Duplicate Torrents being added in RealDebrid #243

Closed · jishnurk closed this 1 month ago

jishnurk commented 4 months ago

Opening a stream leads to duplicate torrents (the same torrent added more than once) in RealDebrid.

The issue occurs only in the Public/Elfhosted instance.

jishnurk commented 4 months ago

@mhdzumair Is it possible for you to check the logs? Since it is not happening in my local instance, I am unable to debug it.

mhdzumair commented 4 months ago

This issue was handled some time back by caching and locking the streaming_provider_endpoint function. Are you still facing it?

jishnurk commented 4 months ago

Yes, I am. Attached a screenshot:

Screenshot 2024-07-20 153333

mhdzumair commented 4 months ago

Stremio sends two requests initially:

  1. From the player.
  2. From the Stremio server, to calculate the video's OpenSubtitles hash.

Both requests fire nearly simultaneously, so we need to ensure the Redis locking is working correctly. Apart from that, Public Elfhosted runs 10 pods, so the two requests might be handled by two different pods at the same time. However, all pods share a single Redis instance, so a Redis-based lock can still coordinate them.
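
To make the race concrete, here is a minimal self-contained demo (hypothetical names, not the actual MediaFusion code) of how two near-simultaneous requests can each pass the "already added?" check before either one adds the torrent:

```python
import asyncio

added = []  # stands in for the torrents already added to the RealDebrid account

async def streaming_endpoint(info_hash: str) -> None:
    # Step 1: check whether the torrent is already added (cache/DB lookup).
    already_added = info_hash in added
    await asyncio.sleep(0.1)  # simulated latency between the check and the add
    # Step 2: add if missing. Both requests pass the check before either adds.
    if not already_added:
        added.append(info_hash)

async def main() -> None:
    # The player request and the OpenSubtitles-hash request arrive together.
    await asyncio.gather(streaming_endpoint("abc123"), streaming_endpoint("abc123"))
    print(added)  # ['abc123', 'abc123'] -> the duplicate

asyncio.run(main())
```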

@coderabbitai would you chime in to find a solution or fixes? The streaming provider code is in streaming_providers/routes.py.

coderabbitai[bot] commented 4 months ago

[!TIP] For best results, initiate chat on the files or code changes.

@mhdzumair, I've reviewed the streaming_provider_endpoint function. The redis locking mechanism is in place and uses the acquire_redis_lock and release_redis_lock functions. However, to ensure it handles simultaneous requests from different pods correctly, consider the following improvements (a sketch follows the list):

  1. Increase Lock Timeout: Ensure the lock acquisition timeout is sufficient to handle simultaneous requests. You can adjust the timeout parameter in the acquire_redis_lock function.

  2. Unique Lock Key: Verify that the lock key (cached_stream_url_key_locked) is unique enough to prevent collisions. It currently includes the user IP, secret string, info hash, season, and episode, which should be unique for each request.

  3. Double-Check Lock Release: Ensure that the lock is always released, even if an exception occurs. This is already handled in the try...except block, but double-check for any edge cases.

  4. Redis Configuration: Ensure that the Redis instance is correctly configured to handle high concurrency and multiple connections from different pods.
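
As a rough sketch of how points 1–3 fit together, here is a cross-pod lock using redis-py's asyncio client with an atomic SET NX EX acquire and a token-checked release; the actual acquire_redis_lock/release_redis_lock helpers in streaming_providers/routes.py may differ in names and signatures:

```python
import logging
import secrets

from redis import asyncio as aioredis

logger = logging.getLogger(__name__)

# Release only if we still own the lock, so one pod cannot delete a lock
# that has expired and been re-acquired by another pod.
RELEASE_SCRIPT = """
if redis.call("GET", KEYS[1]) == ARGV[1] then
    return redis.call("DEL", KEYS[1])
end
return 0
"""

def lock_key(user_ip: str, secret: str, info_hash: str, season, episode) -> str:
    # Point 2: the key includes everything that identifies one logical request.
    return f"lock:{user_ip}:{secret}:{info_hash}:{season}:{episode}"

async def acquire_lock(redis: aioredis.Redis, key: str, ttl: int = 60) -> str | None:
    """SET NX EX is atomic, so only one pod can create the key (point 1: ttl is the timeout)."""
    token = secrets.token_hex(16)
    if await redis.set(key, token, nx=True, ex=ttl):
        logger.debug("acquired lock %s", key)
        return token
    logger.debug("lock %s is already held", key)
    return None

async def release_lock(redis: aioredis.Redis, key: str, token: str) -> None:
    """Point 3: call this from a finally block; deletes only if the token still matches."""
    await redis.eval(RELEASE_SCRIPT, 1, key, token)
```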

If the issue persists, you might want to add more detailed logging around the lock acquisition and release to diagnose any potential race conditions or timeouts.

Let me know if you need further assistance!

Analysis chain

Scripts executed

The following scripts were executed for the analysis:

```shell
#!/bin/bash
# Locate the `streaming_provider_endpoint` function in `streaming_providers/routes.py` and review its redis locking mechanism.
ast-grep --lang python --pattern $'def streaming_provider_endpoint($_, $_):\n $$$' streaming_providers/routes.py
```

Length of output: 115

```shell
#!/bin/bash
# Use `rg` to search for the `streaming_provider_endpoint` function in the `streaming_providers/routes.py` file.
rg 'def streaming_provider_endpoint' -A 20 streaming_providers/routes.py
```

Length of output: 753

```shell
#!/bin/bash
# Search for redis-related operations within the `streaming_provider_endpoint` function in the `streaming_providers/routes.py` file.
rg 'redis' -A 10 -B 10 streaming_providers/routes.py
```

Length of output: 5313

Learnings added

```
Learnt from: mhdzumair
URL: https://github.com/mhdzumair/MediaFusion/issues/243
Timestamp: 2024-07-20T13:59:01.902Z
Learning: The `streaming_provider_endpoint` function in `streaming_providers/routes.py` includes redis operations for caching URLs and acquiring/releasing redis locks. The lock is acquired using the `acquire_redis_lock` function and released using the `release_redis_lock` function.
```

> Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure [review instructions](https://docs.coderabbit.ai/guides/review-instructions). You can manage existing learnings in the [app](https://app.coderabbit.ai/learnings).

mhdzumair commented 2 months ago

The issue has been identified and resolved in the latest code. The root cause: for Prowlarr data, the info hash stored in the DB is in capital letters, so it fails to match the incoming info hash and the torrent gets added multiple times.
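
For illustration, a minimal sketch of the case-normalization fix (the helper name is hypothetical, not the actual MediaFusion code). BitTorrent info hashes are case-insensitive hex, so storing and comparing them in lowercase makes the uppercase Prowlarr hashes match:

```python
def normalize_info_hash(info_hash: str) -> str:
    """Info hashes are case-insensitive hex; store and compare in lowercase."""
    return info_hash.strip().lower()

# Ingestion (e.g., Prowlarr results): normalize before writing to the DB.
stored = normalize_info_hash("A1B2C3D4E5F60718293A4B5C6D7E8F9012345678")

# Lookup: normalize the incoming hash the same way so the match succeeds.
incoming = normalize_info_hash("a1b2c3d4e5f60718293a4b5c6d7e8f9012345678")
assert stored == incoming  # matched -> no duplicate add
```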

We need to do a cleanup of the torrent data on Elfhosted. Until then, I'll keep this open.

jishnurk commented 2 months ago

OK, cool! When do you plan to release the new version? Do you also plan to reset the database (at least the posters)?

mhdzumair commented 2 months ago

The release is probably set for this weekend. All the changes on my wishlist are almost complete, though a few more tasks need to be finished on the live TV side. As for the posters, they're stored in Redis, so cleaning up Redis will regenerate them.

Additionally, I announced the Alldebrid voucher giveaway for contributors. Please post your email so I can send you the voucher code as a thank-you for your excellent contributions.