mdhiggins / sickbeard_mp4_automator

Automatically convert video files to a standardized format with metadata tagging to create a beautiful and uniform media library
MIT License

Error -- Sonarr returned path that does not match directory to be scanned #1396

Closed Rhainland closed 3 years ago

Rhainland commented 3 years ago

Hi, I'm encountering this odd error on only some items. It seems random, and it comes and goes.

ERROR - Sonarr returned path /mnt/download/sonarr/Soul.Mates.S01E01.1080p.WEB-DL.AAC2.0.H.264-CtrlHD that does not match the directory that was requested to be scanned /mnt/download/sonarr/Soul.Mates.S01E04.1080p.WEB-DL.AAC2.0.H.264-CtrlHD. Trying again in 10 seconds.

I'm not sure what's going on here; the base path is correct: /mnt/download/sonarr/

NZBGet tells Sonarr to scan Soul.Mates.S01E04, but Sonarr returns Soul.Mates.S01E01.

How does that work? What could be causing this?

mdhiggins commented 3 years ago

This is a Sonarr bug, and what you're seeing is my attempt to compensate for it.

For some reason, especially when batch processing, Sonarr's API occasionally just returns the wrong episode (1 instead of 4) and doesn't import correctly.

They have yet to fix it despite it being reported. I try to monitor this based on the API return and initiate a retry.

def accessAPI(url, payload, headers, log, requests, nzbGet, sleep=10, retry=0, maxretry=5):

You can try increasing the sleep time from 10 to see if that helps provide more consistent results, but ultimately this is a Sonarr API bug (the file is autoprocess/sonarr.py).
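The retry pattern being described can be sketched roughly like this (a simplified illustration, not the actual implementation — the `expected_path` comparison, the fake session object, and the response shape are assumptions):

```python
import time

def access_api(url, payload, headers, session, expected_path,
               sleep=10, retry=0, maxretry=5):
    """Rough sketch of the retry idea: if Sonarr's API reply doesn't
    reference the path we asked it to scan, wait and ask again."""
    response = session.post(url, json=payload, headers=headers)
    data = response.json()
    if data.get("path") != expected_path and retry < maxretry:
        time.sleep(sleep)  # bump this (e.g. 10 -> 20) on flakier setups
        return access_api(url, payload, headers, session, expected_path,
                          sleep=sleep, retry=retry + 1, maxretry=maxretry)
    return data
```

The key point is that the mismatch is detected by comparing the API's returned path against the directory that was requested, and the call is simply repeated after a delay.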

mdhiggins commented 3 years ago

Another option is to use completed download handling and have postSonarr handle all the conversion instead of using NZBGetPostProcess.py (don't use both, though; one or the other).

Rhainland commented 3 years ago

Another option is to use completed download handling and have postSonarr handle all the conversion instead of using NZBGetPostProcess.py (don't use both, though; one or the other).

Thank you for your reply.

The reason I use NZBGet to post-process is that I can run 6 jobs simultaneously here; if I used Sonarr to post-process, I would be limited to 1 job at a time.

That's a little too slow! :(

I'll try increasing the timeout. Thanks for your response.

mdhiggins commented 3 years ago

I haven't done much testing to see if this is fixed on Sonarr v3, but that may be another thing to try; it's still in beta, but from what other users have said it's quite stable.

Rhainland commented 3 years ago

I haven't done much testing to see if this is fixed on Sonarr v3, but that may be another thing to try; it's still in beta, but from what other users have said it's quite stable.

I use Sonarr v3, so I can definitely confirm it's not fixed :).

I'll say though, I had not noticed this problem on Windows, only after moving my Sonarr to Linux. Not sure if that's helpful, but I'll try to fiddle with it.

mdhiggins commented 3 years ago

Keep me posted; if you find a sleep time that seems to fix it, let me know and I can bump the default. 10 seemed to work in my testing, but it might vary depending on hardware.

Rhainland commented 3 years ago

Keep me posted; if you find a sleep time that seems to fix it, let me know and I can bump the default. 10 seemed to work in my testing, but it might vary depending on hardware.

It seems no matter what I do, it just fails sometimes. For some episodes it fails constantly and never corrects itself.

It seems that your script is what causes NZBGet/Sonarr to delete the completed download files at the end of the whole process. Is there a way to alter this, so that the failed files can be imported using Sonarr's manual file import?

mdhiggins commented 3 years ago

What makes you say the script is what causes the delete? Post some logs if you believe that to be the case, but there isn't any file-deletion code that should be running at that point on my end.

Rhainland commented 3 years ago

What makes you say the script is what causes the delete? Post some logs if you believe that to be the case, but there isn't any file-deletion code that should be running at that point on my end.

I think I worded that really terribly.

When your script fails to import, Sonarr detects the download as failed, which causes Sonarr to delete the file that was actually downloaded and converted.

This puts Sonarr in a loop of re-downloading until eventually all possible releases are blacklisted, and no import is done in the end.

mdhiggins commented 3 years ago

Does changing the exit code result in better behavior? Without completed download handling enabled, I'm surprised you're seeing that behavior.

sys.exit(POSTPROCESS_ERROR)

and change it to sys.exit(POSTPROCESS_NONE) or sys.exit(POSTPROCESS_SUCCESS)
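For context, NZBGet post-processing scripts report their result to NZBGet via special exit codes. A minimal sketch of the suggested change (the `finish` helper is hypothetical; only the three constants come from NZBGet's post-processing script convention):

```python
import sys

# NZBGet post-processing script exit codes
POSTPROCESS_SUCCESS = 93  # processing succeeded
POSTPROCESS_ERROR = 94    # processing failed; Sonarr may treat the grab as failed
POSTPROCESS_NONE = 95     # no status reported; avoids failed-download handling

def finish(converted_ok):
    """Hypothetical helper: exit NONE instead of ERROR so a failed import
    doesn't trigger Sonarr's delete/re-download loop."""
    sys.exit(POSTPROCESS_SUCCESS if converted_ok else POSTPROCESS_NONE)
```

Exiting with NONE instead of ERROR is what keeps Sonarr from marking the download as failed.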

Rhainland commented 3 years ago

Does changing the exit code result in better behavior? Without completed download handling enabled, I'm surprised you're seeing that behavior.

sys.exit(POSTPROCESS_ERROR)

change to sys.exit(POSTPROCESS_NONE) or sys.exit(POSTPROCESS_SUCCESS)

In which file should I make this change?

Rhainland commented 3 years ago

I made the change in the Sonarr section of the NZBGetPostProcess script; I'll test it out and let you know! :)

Rhainland commented 3 years ago

This seems to have resolved it. The files that fail to import are not re-attempted by Sonarr; instead I can use manual import to bring them in.

mdhiggins commented 3 years ago

46a1e80e6f75ab6fe89a3f1d40aae51152cb407a

Updated the official repo to include that change.

Rhainland commented 3 years ago

@mdhiggins Hey mate.

Not trying to necro a thread here, but I may have some additional info on this problem for you. I don't know whether you initially implemented this change before my post, due to another user or yourself; however:

I ran into this issue because I set up your script wrong, and I only realised it the other day because I was annoyed that Sonarr wasn't instantly moving files after download to a folder on the same mount; instead it took some time.

I had the post-process script enabled in NZBGet, but also in Sonarr. Once NZBGet told Sonarr to scan and import, the import fired off the post-process script in Sonarr, and this caused the "ERROR - Sonarr returned path" error to happen sometimes.

I realised the mistake, as NZBGet obviously runs the script, so I disabled it in Sonarr. I've fetched a few hundred episodes since and not once run into this error.

Perhaps this will be of use if someone else asks in the future.

mdhiggins commented 3 years ago

You should be using the script in both locations, assuming you're using the correct scripts (postSonarr.py for Sonarr and NZBGetPostProcess.py for NZBGet). The NZBGet script is designed to be used without completed download handling enabled and allows the conversion to take place before Sonarr moves the file, but final optimizations and tagging are not performed by this script by design; postSonarr.py handles those. It's OK to use postSonarr.py by itself in combination with completed download handling enabled as an alternative, but then the bulk of the conversion is performed in the final location.

Rhainland commented 3 years ago

What kind of "final optimizations" am I losing by not running postSonarr? The fact that Sonarr can only run 1 file at a time significantly slows down the process of bulk-adding items; NZBGet allows 6 post-processes to run at once. So what am I missing out on by only running it in NZBGet?

And of course, I don't have to deal with the issue of some content not importing.

mdhiggins commented 3 years ago

Tagging and qtfaststart. It also doesn't really make sense that postSonarr would impact importing, since it's only executed after the files are already imported and renamed.

If you have some logs for the error you're seeing, I'd appreciate taking a look.

Rhainland commented 3 years ago

Apart from the error this thread is about in the first place, I'm not actually having errors here. However, with postSonarr running in Sonarr, each episode imported can take anywhere between 10-60 seconds per file, most likely due to qtfaststart; I'll see if that's enabled.

With postSonarr disabled, episodes import instantly. Perhaps the issue isn't per se importing, but that importing of the next episode doesn't happen until post-processing of the first is finished. My filesystem is pretty busy, so the ideal situation for me would be for the file to be completely post-processed BEFORE being moved to the filesystem.

I realise this is probably a little off-topic; I appreciate your response.

mdhiggins commented 3 years ago

Yeah, that's probably what it is; tagging is a quick process, but qtfaststart is slower and essentially has to reorder the entire file.

You can use post-opts = -movflags, faststart to force ffmpeg to do the qtfaststart process during its initial conversion (this all assumes you're converting to mp4 containers; it does not apply to mkv). The only reason this isn't the standard behavior is that with older versions of ffmpeg and mutagen, tagging after moving the atoms would result in some playback issues, though in more recent testing I haven't been able to reproduce this, so it's mostly there for backwards safety/compatibility.

qtfaststart is run dynamically, so if you run it at the ffmpeg step as above, it won't run again. This could allow you to run both scripts and keep tagging, which should be a comparatively very fast process.
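As a sketch, the suggested setting would look something like this in autoProcess.ini (the section name here is an assumption and may differ between script versions; check your own file for where post-opts lives):

```ini
[Converter]
; force ffmpeg to relocate the moov atom during the initial conversion
; (mp4 containers only) so the separate qtfaststart pass is skipped later
post-opts = -movflags, faststart
```

Note the comma between the two tokens, which comes up again below.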

mdhiggins commented 3 years ago

Can you post the ffmpeg command being generated? You didn't forget the comma, did you?

Rhainland commented 3 years ago

I did have a syntax error; it's resolved now, though.

With qtfaststart it's still pretty slow, perhaps just because my filesystem is very busy. In this current test, from the time it began tagging one file until it started tagging the next, 2 minutes elapsed.

With faststart fixed in NZBGet now, I think I can just disable post-processing in Sonarr and not tag. For Plex, I don't think it's required to tag tvdb/tmdb IDs in metadata, as long as the files are named correctly.

I hope.

mdhiggins commented 3 years ago

2 minutes seems like a long time; I'm a bit surprised by that. Check the log files and see if postSonarr is doing anything else; it's possible that with the right combination of settings, things may be getting duplicated (force-convert, for example, will run the file through ffmpeg again for no reason in your setup).

Rhainland commented 3 years ago

Actually it looks like perhaps a timeout in the script is causing the slowdown.

When tagging finishes, it seems the script causes Sonarr to re-scan the episode, which after some time fails with:

21-3-7 21:04:15.0|Debug|postSonarr.sh|2021-03-07 21:04:15 - SonarrPostProcess - INFO - Final state: queued.
21-3-7 21:04:15.0|Debug|postSonarr.sh|2021-03-07 21:04:15 - SonarrPostProcess - ERROR - Rescan command timed out
21-3-7 21:04:15.0|Debug|postSonarr.sh|2021-03-07 21:04:15 - SonarrPostProcess - ERROR - Sonarr monitor status update failed.

This doesn't always happen; it seems to be a little random, but when it does, it seems to cause a 2-minute delay before the next process.

I'm testing with another bunch of files now, but it seems this may be some problem on my end.

The tagging that's done in the Sonarr post-process, is that only metadata like tmdb and tvdb info? No tagging of audio stream languages or other stuff?

mdhiggins commented 3 years ago

Stream data has to be written at the time of file creation and doesn't depend on metadata from external APIs, so yeah, that's done during the ffmpeg sweep.

Metadata pulled from external APIs is done as the final step

And that timeout you're posting is Sonarr's API timing out for some reason. The script waits for it to report 'completed' on the rescan request, and that appears to be timing out, so something is holding up your Sonarr rescan queue, which should be a quick process.
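The wait loop being described can be sketched like this (a simplified illustration with an assumed endpoint shape and parameter names, not the script's actual code):

```python
import time

def wait_for_rescan(session, base_url, api_key, command_id,
                    timeout=60, poll_interval=10):
    """Poll Sonarr's command endpoint until the rescan reports
    'completed', or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        r = session.get("%s/api/command/%s" % (base_url, command_id),
                        params={"apikey": api_key})
        if r.json().get("state") == "completed":
            return True
        time.sleep(poll_interval)
    # mirrors the "Rescan command timed out" error in the logs above
    return False
```

If the queued command never leaves the 'queued' state (because something else is blocking Sonarr's command queue), the loop runs out the full timeout before reporting failure, which would produce exactly the delay described.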

Rhainland commented 3 years ago

It's pretty odd, because nothing seems to be happening between the post-process and the failure; you can see the logs below.

But I think I'll deal with no tagging for now.

21-3-7 21:26:53.4|Debug|postSonarr.sh|2021-03-07 21:26:53 - resources.metadata - INFO - Tagging file: Snowfall 2017 - S02E07 - The World Is Yours[WEBDL-1080p.AAC.h264.NTb].mp4.
21-3-7 21:26:54.1|Debug|Api|[GET] /api/series?apikey=(removed) 200.OK (2004 ms)
21-3-7 21:26:58.7|Debug|postSonarr.sh|2021-03-07 21:26:58 - resources.metadata - INFO - Trying to write tags.
21-3-7 21:27:04.0|Debug|postSonarr.sh|2021-03-07 21:27:04 - resources.metadata - INFO - Trying to write tags.
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - resources.metadata - INFO - Tags written successfully using mutagen.
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - resources.mediaprocessor - INFO - Relocating MOOV atom to start of file.
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - qtfaststart - ERROR - This file appears to already be setup for streaming!
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - resources.mediaprocessor - WARNING - QT FastStart did not run - perhaps moov atom was at the start already or file is in the wrong format.
21-3-7 21:27:12.6|Debug|Api|[POST] /api/command: 201.Created (21 ms)
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - SonarrPostProcess - INFO - Sonarr response RescanSeries command: ID 200191 queued.
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - SonarrPostProcess - INFO - Requesting episode information from Sonarr for series ID 1823.
21-3-7 21:27:12.6|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:12.6|Debug|postSonarr.sh|2021-03-07 21:27:12 - SonarrPostProcess - INFO - State: queued.
21-3-7 21:27:16.9|Debug|Api|[GET] /api/wanted/missing?apikey=(removed) 200.OK (53 ms)
21-3-7 21:27:17.0|Debug|Api|[GET] /api/queue?apikey=(removed) 200.OK (1 ms)
21-3-7 21:27:17.9|Debug|Api|[GET] /api/wanted/missing?apikey=(removed) 200.OK (58 ms)
21-3-7 21:27:18.0|Debug|Api|[GET] /api/queue?apikey=(removed) 200.OK (2 ms)
21-3-7 21:27:19.7|Debug|Api|[POST] /api/command?apikey=(removed) 201.Created (2 ms)
21-3-7 21:27:22.6|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:22.6|Debug|postSonarr.sh|2021-03-07 21:27:22 - SonarrPostProcess - INFO - State: queued.
21-3-7 21:27:31.1|Debug|Api|[POST] /api/command: 201.Created (2 ms)
21-3-7 21:27:31.1|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:32.7|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:32.7|Debug|postSonarr.sh|2021-03-07 21:27:32 - SonarrPostProcess - INFO - State: queued.
21-3-7 21:27:41.1|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:42.7|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:42.7|Debug|postSonarr.sh|2021-03-07 21:27:42 - SonarrPostProcess - INFO - State: queued.
21-3-7 21:27:49.0|Debug|Api|[GET] /api/wanted/missing?apikey=(removed) 200.OK (59 ms)
21-3-7 21:27:49.1|Debug|Api|[GET] /api/queue?apikey=(removed) 200.OK (2 ms)
21-3-7 21:27:50.0|Debug|Api|[GET] /api/wanted/missing?apikey=(removed) 200.OK (72 ms)
21-3-7 21:27:50.0|Debug|Api|[GET] /api/queue?apikey=(removed) 200.OK (1 ms)
21-3-7 21:27:51.2|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:51.5|Debug|Api|[GET] /api/system/status?apikey=(removed) 200.OK (1 ms)
21-3-7 21:27:51.7|Debug|Api|[GET] /api/system/status?apikey=(removed) 200.OK (1 ms)
21-3-7 21:27:51.8|Debug|Api|[GET] /api/v3/languageprofile?apikey=(removed) 200.OK (1 ms)
21-3-7 21:27:52.0|Debug|Api|[GET] /api/tag?apikey=(removed) 200.OK (1 ms)
21-3-7 21:27:52.7|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:27:52.7|Debug|postSonarr.sh|2021-03-07 21:27:52 - SonarrPostProcess - INFO - State: queued.
21-3-7 21:27:54.0|Debug|Api|[GET] /api/series?apikey=(removed) 200.OK (1866 ms)
21-3-7 21:28:01.2|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:28:02.7|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:28:02.7|Debug|postSonarr.sh|2021-03-07 21:28:02 - SonarrPostProcess - INFO - State: queued.
21-3-7 21:28:11.2|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:28:12.7|Debug|Api|[GET] /api/command/200191: 200.OK (0 ms)
21-3-7 21:28:12.7|Debug|postSonarr.sh|2021-03-07 21:28:12 - SonarrPostProcess - INFO - Final state: queued.
21-3-7 21:28:12.7|Debug|postSonarr.sh|2021-03-07 21:28:12 - SonarrPostProcess - ERROR - Rescan command timed out
21-3-7 21:28:12.7|Debug|postSonarr.sh|2021-03-07 21:28:12 - SonarrPostProcess - ERROR - Sonarr monitor status update failed.

mdhiggins commented 3 years ago

d7c8f00c2eee50e08ab6ab4a678c2db6d10d1fd2

Added a rescan option in autoProcess.ini that avoids these extra API calls (which are really more useful for people not converting before the file is passed back to Sonarr/Radarr, so not important in your use case). It should allow you to keep tagging without getting stuck waiting for the queue. Just set rescan = False.

You'll need to run the script once to have the new options added to your autoProcess.ini file.

EDIT: For clarification, these are in the Sonarr/Radarr sections of the configuration.
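Per the clarification above, the new option goes in the Sonarr/Radarr sections of autoProcess.ini, e.g. (existing options in each section left as-is):

```ini
[Sonarr]
rescan = False

[Radarr]
rescan = False
```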

Rhainland commented 3 years ago

Is this also pushed to your Radarr and Sonarr Docker images? That's what I'm using. If I recall, the script files are located within the image and not in the config location that you mount as a volume for the container.

mdhiggins commented 3 years ago

Not yet, but if you set the environment variable SMA_UPDATE=true for your Docker image and restart the container, it will update from GitHub as part of the startup routine without needing a new image to be built.

Alternatively, you can run docker exec sonarr git -C /usr/local/sma pull origin master, assuming your container name is sonarr.

mdhiggins commented 3 years ago

Also, was it getting stuck at that rescan step for every job? I'm almost wondering if the API command that NZBGetPostProcess creates (DownloadedEpisodesScan) to initiate the scan is the one that's holding up the RescanSeries command (since the RescanSeries command happens inside that NZBGet DownloadedEpisodesScan command), so it will remain queued until the script exits, since one won't start until the other finishes.

Rhainland commented 3 years ago

Yes, it gets stuck at the rescan on every job. What you say is in line with what I've been seeing in the UI the last few weeks: Sonarr will have "Downloaded Episodes Scan" running for a minute or more, and only once that completes does Sonarr actually proceed to the next item in line.

And so even when it doesn't time out, I'm still looking at probably 1 minute per file.

I'll update my containers tomorrow and give it a go! I appreciate all your help on this!

mdhiggins commented 3 years ago

Alright, I did a little more testing and digging.

f65f27d8760ba7808a6c8542d1b2bce08891da1e

While I can't see all the information, if I look at all the commands in process and compare the import directory of the current job to the created postSonarr/Radarr job, and those two directories are the same, I can assume it's the same job and disable waiting for the queue, but still add a rescan and rename to the final queue to ensure the Radarr/Sonarr databases are updated with the final metadata while avoiding the timeout.

Tested it on my end, and it appears to be working with rescan = True. I'll leave that option in, just in case people want to disable it entirely, but this technique should be more dynamic and the best of both worlds.

Rhainland commented 3 years ago

Alright, I did a little more testing and digging.

f65f27d

While I can't see all the information, if I look at all the commands in process and compare the import directory of the current job to the created postSonarr/Radarr job, and those two directories are the same, I can assume it's the same job and disable waiting for the queue, but still add a rescan and rename to the final queue to ensure the Radarr/Sonarr databases are updated with the final metadata while avoiding the timeout.

Tested it on my end, and it appears to be working with rescan = True. I'll leave that option in, just in case people want to disable it entirely, but this technique should be more dynamic and the best of both worlds.

Hi mate. Sorry for the late reply.

I updated Sonarr tonight with the fix, rescan = true, and tested with 10 episodes; no timeouts, so that seems good.

For the last week I ran with qtfaststart at the NZBGet post-process step, so through ffmpeg, and I've had multiple items processed with stuttering video, so I wouldn't recommend anyone else try that.

I'll play around with the build a little more tomorrow, but so far so good: good speed improvements, and hopefully not a huge chance of files getting locked in the end directory now that it doesn't take too long!