AlexTryHarder opened 1 month ago
I'd have to think about it. I see a risk that for rare downloads we may delete downloads that depend on a given user coming online.
Can you paste a screenshot of where in qBit you see the peers that have 100% (or 99.8%, for that matter)? If you can also find out which API call we can use to get that, that'd be helpful (check your network tab when you open the respective page).
Please also post the respective response for the API.
I'm not very good with APIs, but I hope this is the API call that should work.
The screenshot is just an example of a few series, but after downloading over 5 TB, I can confirm the majority of torrents are like that. It's just a waste of bandwidth and time to download something that will most likely never be available.
Open the network tab in your browser and refresh the page. You'll see all requests; then please paste the response of the relevant one.
```json
{ "peers": { "107.150.30.225:24604": { "dl_speed": 20486, "downloaded": 40651918 } }, "rid": 8 }
```
It also returns this:

```json
{"peers":{"107.150.30.225:24604":{"downloaded":40697856,"files":""}},"rid":10}
```
Another example:

```json
{ "peers": { "102.129.235.28:46817": { "client": "qBittorrent 4.5.5", "connection": "BT", "country": "United States", "country_code": "us", "dl_speed": 0, "downloaded": 0, "files": "", "flags": "? I H X", "flags_desc": "? = Not interested (peer) and unchoked (local)\nI = Incoming connection\nH = Peer from DHT\nX = Peer from PEX", "ip": "102.129.235.28", "peer_id_client": "-qB4550-", "port": 46817, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "107.150.30.225:24604": { "client": "qBittorrent/4.4.5", "connection": "μTP", "country": "United States", "country_code": "us", "dl_speed": 0, "downloaded": 294912, "files": "", "flags": "H X P", "flags_desc": "H = Peer from DHT\nX = Peer from PEX\nP = μTP", "ip": "107.150.30.225", "peer_id_client": "-qB4450-", "port": 24604, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "146.70.27.250:18455": { "client": "qBittorrent/4.4.3.1", "connection": "μTP", "country": "Canada", "country_code": "ca", "dl_speed": 0, "downloaded": 36177198, "files": "", "flags": "H X P", "flags_desc": "H = Peer from DHT\nX = Peer from PEX\nP = μTP", "ip": "146.70.27.250", "peer_id_client": "-qB4431-", "port": 18455, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "149.102.254.3:50321": { "client": "qBittorrent/4.6.5", "connection": "BT", "country": "United States", "country_code": "us", "dl_speed": 0, "downloaded": 0, "files": "", "flags": "? I X", "flags_desc": "? = Not interested (peer) and unchoked (local)\nI = Incoming connection\nX = Peer from PEX", "ip": "149.102.254.3", "peer_id_client": "-qB4650-", "port": 50321, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "149.40.50.108:30657": { "client": "Deluge/2.1.1 libtorrent/2.0.10.0", "connection": "μTP", "country": "United States", "country_code": "us", "dl_speed": 0, "downloaded": 59375616, "files": "", "flags": "H X P", "flags_desc": "H = Peer from DHT\nX = Peer from PEX\nP = μTP", "ip": "149.40.50.108", "peer_id_client": "-DE211s-", "port": 30657, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "174.197.3.121:7797": { "client": "Deluge/2.1.1 libtorrent/2.0.5.0", "connection": "μTP", "country": "United States", "country_code": "us", "dl_speed": 0, "downloaded": 23904256, "files": "", "flags": "H X P", "flags_desc": "H = Peer from DHT\nX = Peer from PEX\nP = μTP", "ip": "174.197.3.121", "peer_id_client": "-DE211s-", "port": 7797, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "5.182.32.12:6881": { "client": "qBittorrent 4.6.5", "connection": "μTP", "country": "United States", "country_code": "us", "dl_speed": 0, "downloaded": 2031616, "files": "", "flags": "H X P", "flags_desc": "H = Peer from DHT\nX = Peer from PEX\nP = μTP", "ip": "5.182.32.12", "peer_id_client": "-qB4650-", "port": 6881, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 }, "91.193.6.170:56781": { "client": "Deluge/2.1.1 libtorrent/2.0.10.0", "connection": "μTP", "country": "Canada", "country_code": "ca", "dl_speed": 0, "downloaded": 98304, "files": "", "flags": "H X P", "flags_desc": "H = Peer from DHT\nX = Peer from PEX\nP = μTP", "ip": "91.193.6.170", "peer_id_client": "-DE211s-", "port": 56781, "progress": 0.40771591663360596, "relevance": 0, "up_speed": 0, "uploaded": 0 } }, "peers_removed": [ "84.247.105.248:20859" ], "rid": 255 }
```
That's from the /info response?
The request URL looks like this: http://10.60.60.138:8080/api/v2/sync/maindata?rid=8601&lztkkpg5, if that's what you mean.
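For illustration, here is a minimal sketch of how the `/sync/torrentPeers` response pasted above could be checked for at least one complete peer. This is not decluttarr's actual code; the function names are made up, and it operates on the already-parsed JSON dict:

```python
def max_peer_progress(peers_response: dict) -> float:
    """Return the highest per-peer 'progress' value (0.0-1.0) in a
    /sync/torrentPeers response; 0.0 if there are no peers."""
    peers = peers_response.get("peers", {})
    return max((p.get("progress", 0.0) for p in peers.values()), default=0.0)


def has_complete_peer(peers_response: dict, threshold: float = 0.999) -> bool:
    """True if at least one connected peer reports (almost) 100% progress.
    The threshold sits slightly below 1.0 to tolerate rounding in the
    reported float (e.g. the 99.8% case mentioned in this thread)."""
    return max_peer_progress(peers_response) >= threshold
```

The `peers_response` dict would come from `GET /api/v2/sync/torrentPeers?hash=<hash>&rid=0`. Note that the truncated pastes above only show `dl_speed`/`downloaded`; the full response carries a `progress` field per peer, which is what this check relies on.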
Stupid question: they are all "Stalled". Why does the stall check of decluttarr not pick them up?
Yeah, I'm not sure. Decluttarr is picking some of them up (I can see them in the logs), e.g. a torrent added yesterday, 14/08/2024. It's set to check for stalled downloads every 5 min (3 tries), but it's still there today.
But that's a separate issue I guess
One interesting log entry:

```
2024-08-15T07:20:21.958413922Z [INFO]: >>> Detected stalled download too many times (25 out of 3 permitted times): The.First.48.S17E08.Cruel.Summer.HDTV.x264-CRiMSON[eztv]
```
Check the FAQ
OK, that would fix that issue. How about my initial question of not downloading when there is not at least one peer with 100%?
Is there any that is not also stalled? If so, can you please provide screenshots/API responses for those specific ones?
OK, one more example of a torrent which is NOT yet stalled, but as its best peer has only 96.3%, it will get stalled soon.
http://10.60.60.138:8080/api/v2/sync/maindata?rid=44392&lztkli7v

```json
{
  "rid": 44393,
  "server_state": {
    "alltime_dl": 17595000087675,
    "alltime_ul": 24082722020848,
    "dl_info_data": 1020875438429,
    "dl_info_speed": 1313488,
    "total_peer_connections": 10
  },
  "torrents": {
    "040c95a65cd6fc40e128a06be287456984770bb4": { "num_complete": 1, "num_incomplete": 2, "state": "metaDL", "time_active": 1 },
    "4a9be90cde0d555d81f7c95d144008d324a391dc": { "amount_left": 6535118848, "completed": 20372336874, "dlspeed": 1296527, "downloaded": 20371484714, "downloaded_session": 20372664254, "eta": 6408, "progress": 0.7571260948816958, "time_active": 39405 },
    "770c6de3a744dd359b37b083c8031be999623498": { "time_active": 443 },
    "a6ef30dfeb1ef53d5d7222a9d5e490af903a0f6a": { "time_active": 240 },
    "b6aa50132ac06aeef4a851801e7e7eb0f51a0c64": { "time_active": 2 },
    "f3e691ae16e7c5d1d394707bc4464eba10a873e7": { "amount_left": 362266624, "completed": 535067664, "dlspeed": 3633, "downloaded": 535147414, "downloaded_session": 535147414, "eta": 105478, "progress": 0.5962857668044442, "time_active": 88287 }
  }
}
```
http://10.60.60.138:8080/api/v2/sync/torrentPeers?rid=294&hash=4a9be90cde0d555d81f7c95d144008d324a391dc&lztkli9y

```json
{ "peers": { "67.193.109.165:6881": { "dl_speed": 924827, "downloaded": 18251529053 } }, "rid": 295 }
```
If it gets stalled, it will be caught. So I don't see the need for an additional check?
If it's known from the beginning that it will stall, wouldn't it be better to blacklist the torrent so as not to waste bandwidth and time? Some torrents stall only after downloading 30+ GB.
Another possible approach would be to set the missing files to "not download", as I noticed that while downloading some seasons, only a few files are missing, yet they cause the whole torrent to stall.
Example .py:

```python
import requests

qbittorrent_url = 'http://localhost:8080/api/v2/'  # adjust to your instance
login_data = {'username': 'admin', 'password': 'adminadmin'}  # your credentials
torrent_hash = 'your_torrent_hash'  # Replace with your torrent hash

session = requests.Session()
login_response = session.post(f'{qbittorrent_url}auth/login', data=login_data)
if login_response.status_code != 200:
    raise Exception('Failed to login to qBittorrent')

# Get torrent file info; each entry carries its own per-file 'availability'
file_info_response = session.get(f'{qbittorrent_url}torrents/files',
                                 params={'hash': torrent_hash})
file_info = file_info_response.json()

# Check each file and set priority to "Do Not Download" if its availability is below 100%
for index, file in enumerate(file_info):
    if file.get('availability', 0) < 1.0:
        payload = {
            'hash': torrent_hash,
            'id': index,
            'priority': 0  # 0 is "Do Not Download"
        }
        set_priority_response = session.post(f'{qbittorrent_url}torrents/filePrio',
                                             data=payload)
        if set_priority_response.status_code != 200:
            print(f'Failed to set priority for file: {file["name"]}')
        else:
            print(f'Set file to "Do Not Download": {file["name"]}')
```
I like the idea. Would you be willing to help testing if I added it to the dev-version?
Yeah, with pleasure.
On second thought... how would this work? If we set the files that won't complete to "don't download" but wait for the rest to complete, then once they are done, Sonarr will import and then add a new torrent, but it may be the same broken torrent.
We can't blocklist, because that would also stop the feasible files…
thoughts?
Very good point. A solution could be to keep an array of graylisted torrents. After the download is completed, if the torrent is added to BitTorrent again, blacklist it straight away. Not the prettiest solution, but it should be enough to cycle through the partial download and import.
Thought of that as well, but once imported by Sonarr, the item disappears from the queue. Only things on the queue can be blocklisted, and thus the graylist would point to nowhere…
Yes, and it can be blacklisted when Sonarr decides to add the broken torrent to the queue a second time.
At least in my logic I see it this way:
- torrent is added
- decluttarr detects that only some of its files are available
- those files are set to not download
- torrent is added to an array
- download completes and is imported
- Sonarr picks it up again and adds it to the queue
- decluttarr compares every torrent against the array
- if it's in the array, blacklist it straight away and remove it from the array, so the array doesn't grow without bound
The array could hold name+size, hashed.
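A minimal sketch of what that name+size graylist could look like (pure illustration; the class and function names are made up, and `name`/`size` are assumed to come from qBittorrent's `/torrents/info` response):

```python
import hashlib


def graylist_key(name: str, size: int) -> str:
    """Derive a stable identifier from a torrent's name and total size,
    so the same release can be recognized when Sonarr re-grabs it."""
    return hashlib.sha1(f"{name}|{size}".encode("utf-8")).hexdigest()


class Graylist:
    """In-memory set of suspicious torrents; entries are forgotten once
    matched, so the list cannot grow without bound."""

    def __init__(self) -> None:
        self._keys: set[str] = set()

    def add(self, name: str, size: int) -> None:
        self._keys.add(graylist_key(name, size))

    def pop_if_present(self, name: str, size: int) -> bool:
        """Return True (and forget the entry) if this torrent was graylisted,
        i.e. it should now be blacklisted on sight."""
        key = graylist_key(name, size)
        if key in self._keys:
            self._keys.discard(key)
            return True
        return False
```

On a real system the set would need to be persisted across decluttarr restarts (e.g. to a small JSON file), which this sketch omits.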
Feels too hacky for me, tbh. In my view, there should not be a "shadow" blocklist; there should only be one "truth", which is the Sonarr blocklist.
Items get added to the blocklist by the DELETE request sent to the /queue endpoint. At that point, the user has a few options, i.e., whether to remove the files from the download client, etc.
If there was an additional option "remove from queue" (obviously true by default, but it could be set to false), that would solve this problem, no? This would mean that an item on the queue can already be sent to the blocklist without being removed from the queue. We can then set the files that will never download to "Don't Download"; when the others finish, the queue item completes, but it would not re-add the same torrent.
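To make the idea concrete, here is a sketch of the query parameters such a call could carry. `blocklist` and `removeFromClient` exist on Sonarr's `DELETE /api/v3/queue/{id}` today; `removeFromQueue` is purely the proposed, not-yet-existing flag discussed here:

```python
def queue_delete_params(blocklist: bool = True,
                        remove_from_client: bool = False,
                        remove_from_queue: bool = True) -> dict:
    """Build query parameters for Sonarr's DELETE /api/v3/queue/{id}.
    remove_from_queue=False is the hypothetical new behavior: blocklist
    the release but leave the item on the queue so the partial download
    can still complete and import."""
    return {
        "blocklist": str(blocklist).lower(),
        "removeFromClient": str(remove_from_client).lower(),
        "removeFromQueue": str(remove_from_queue).lower(),  # proposed flag
    }
```

Usage would then be something like `session.delete(f"{sonarr_url}/api/v3/queue/{queue_id}", params=queue_delete_params(remove_from_queue=False), headers={"X-Api-Key": api_key})`, assuming the flag were added.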
What I haven't fully thought through is what happens when we blocklist while files 1-5 are still downloading and we mark 6-10 as don't download, and we trigger a "re-scan" (part of the blocklisting); could this potentially mean that we add another torrent for 1-5, too?
And the obvious one: this depends on the additional option being added to Sonarr...
Here's the API where that "removeFromQueue" could be added
If we can think of another way that is robust, that'd be preferred...
So I had a go and tried to code it up in Sonarr directly. Based on the draft PR, here's some additional thoughts:
When you currently mark items as "Do not download" in qBit (say, 3 out of 10), and the other 7 are at some point completed, making the entire download complete: what happens in Sonarr? Based on Markus' feedback, my take would be that even the 7 "survivors" would not be imported. Is that the case?
When you have marked 3/10 as "do not download" and then you go to Sonarr and you manually trigger a re-search for these items, i) does Sonarr actually add downloads for those items (or does it skip it, because the season pack is still in the queue)? ii) and if it does add downloads, does it happen to re-add the same torrent as it already had before (the season pack)?
https://github.com/Sonarr/Sonarr/pull/7115#issuecomment-2295382951
The only way you'll be able to handle this is manually. Sonarr won't know that you don't want specific episodes in the season pack and that it needs to find them elsewhere. At best you could wait for the downloaded episodes to import, and then tell Sonarr to remove the item and blocklist it, but the blocklisting wouldn't really matter since Sonarr wouldn't grab a season pack for less than the full season being needed.
Here's a thought: the above reads to me like season packs don't get grabbed when some episodes are already available. Additionally, imports into Sonarr are not started unless the entire download is complete. So if we set the unavailable files to "don't download"…
if you think it makes sense, could you please test the following:
So I tried my best to do some testing:
- I manually set files not to download
- Sonarr managed to import one grabbed file, and then proceeded to download the rest from other sources
(I couldn't catch it in the act, as it happened at night.) It's hard to test like this; if we could implement it as a dev release, it would be much easier to find any issues.
Thanks! Just to be 100% sure I understand you correctly:
The episode from the season pack was automatically imported? Yes.
The other episodes were searched for automatically? (And if yes, when were they searched for: when the season pack was imported, or already when you had marked the items as don't download?) Yes, the others seemed to be searched for automatically; however, I cannot pinpoint the exact time, as it finished at night. Some more testing will be required.
Just pushed it to the dev image. You need to turn on CANCEL_UNAVAILABLE_FILES. Thanks for testing!
First finding: "Do not download" is also set on seeding torrents. Not sure if it's an issue or not.
Also, if only junk files are available (.nfo), the download will be successful but will fail to import.
Additionally: if there is only one file and it is set to not download, it is still tried to import.
Obviously it's being imported as broken.
> First finding: "Do not download" is also set on seeding torrents. Not sure if it's an issue or not.
What's the API response on /torrents/files for one seeding torrent?
> Obviously it's being imported as broken.
Does Sonarr somehow tell you that it's broken? (Of course I understand it's broken, but I want to know if that is visible somewhere in Sonarr.)
And what happens to the overall torrent availability after the unavailable files are set to not download? Does it go to 100%, or does it still take into account those that were taken off?
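One way to answer that empirically would be to read the `availability` field from `/torrents/properties` before and after the `filePrio` change. A small sketch (the helper name is made up; whether qBittorrent excludes deselected files from this figure is exactly the open question):

```python
def is_fully_available(props: dict, threshold: float = 1.0) -> bool:
    """Inspect the 'availability' field of a /torrents/properties response.
    qBittorrent reports it as a float (1.0 means every piece is available
    at least once; -1 means unknown/not applicable)."""
    return props.get("availability", -1.0) >= threshold
```

Comparing the result before and after setting files to "Do Not Download" would show whether the deselected files still drag the number below 1.0.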
Can you please also try whether anything changes in Sonarr's behavior if you turn on the qBit setting "Keep unselected files in '.unwanted' folder"?
In theory, if the .unwanted is active, this PR made sure they are ignored for download: https://github.com/Sonarr/Sonarr/issues/2072
Ping?
Hey, sorry, on holidays now. Will do tests ASAP when back.
Would it be possible for decluttarr to check if at least one of the peers has 100% of the files? And if not, blacklist the torrent. 70% of files are getting stuck at 99.8%, which drives me crazy.
Thanks!