Thealexbarney / VGAudio

A library for encoding, decoding, and manipulating audio files from video games.
MIT License

System.ArgumentNullException on large batch processing sets #98

Open sharrken opened 6 years ago

sharrken commented 6 years ago

When processing large numbers of files with the batch process option, I'm having repeatable failures which halt processing.

Error as follows:

[#-------------------] 28525/532893 5.4% \System.ArgumentNullException: Value cannot be null.
   at System.Threading.Monitor.Enter(Object obj)
   at VGAudio.Cli.ProgressBar.TimerHandler(Object state)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.TimerQueueTimer.CallCallback()
   at System.Threading.TimerQueueTimer.Fire()
   at System.Threading.TimerQueue.FireNextTimers()

Command:

VGAudioCli.exe -b -r -i "C:\in\" -o "C:\out\" --out-format wav

OS: Win 10 Pro Workstation, 1803, 17134.81

Data is ~550,000 ATRAC9 files being converted to 16-bit PCM WAVs. At lower batch sizes - around 50,000 - I still get the exact same error, but inconsistently: some of the time the run completes successfully, sometimes it fails. On the full set I usually hit the error ~3-5% into the set; on a 50,000-file subset I have had errors as late as ~60% in, but also clean runs.
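The Monitor.Enter(Object) frame at the top of the trace means the lock object passed to it was null, which usually indicates a race between a timer callback and teardown of the object it locks on. VGAudio's actual ProgressBar code isn't shown here, so this is only a guess at the mechanism; a minimal Java analogue of that suspected pattern (all class, field, and method names are invented for illustration - Java's synchronized(null) throws NullPointerException the way .NET's Monitor.Enter(null) throws ArgumentNullException):

```java
import java.util.Timer;
import java.util.TimerTask;

// Hypothetical sketch of the suspected race, not VGAudio's real code:
// a periodic timer callback locks on a field that teardown sets to null.
// If a callback fires (or is already running) after dispose(), it reads
// the nulled field and the lock attempt throws.
public class ProgressBarSketch {
    private Object redrawLock = new Object();
    private final Timer timer = new Timer(true);

    // Called from the timer thread on every tick.
    void tick() {
        synchronized (redrawLock) { // throws NullPointerException if nulled
            // redraw the progress bar here
        }
    }

    void start() {
        timer.scheduleAtFixedRate(new TimerTask() {
            @Override public void run() { tick(); }
        }, 0, 100);
    }

    void dispose() {
        timer.cancel();    // does NOT wait for an in-flight callback
        redrawLock = null; // BUG: a callback can still observe this as null
    }
}
```

Under this hypothesis the crash rate scales with how many times the progress bar is torn down while the timer is live, which would fit it surfacing mainly on huge batches.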

Thealexbarney commented 6 years ago

If I'm reading this correctly, the error occurs on the full set 100% of the time?

sharrken commented 6 years ago

In my experience yes, but statistically probably not.

Basically, it seems that on a large batch this exception has some probability of happening. With the 50,000-file sub-sets I can, by repeating a set multiple times, get all of its files to process successfully without the exception occurring - though on some sub-sets this takes ~5-10 runs, with failures anywhere from 0.9% in to the high 80%s. This may mean the exception can happen even on small batches, but with a probability so low you would never encounter it at smaller batch sizes, or it may only come into play once a batch exceeds a certain size.

Obviously, with the way probabilities work, I could presumably get the full 550,000 set to complete if I gave it enough runs, but the odds are long enough that it's more practical to run the sub-sets even at a 1:10 success rate than to run the full 550,000 at 1:100 or 1:1000 or whatever it is.
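That intuition can be made concrete. Assuming, purely for illustration, that each processed file independently trips the crash with some small probability p (a hypothetical per-file hazard, not a measured one), a batch of n files completes cleanly with probability (1 - p)^n, so failure odds climb steeply with batch size:

```java
public class BatchOdds {
    // Probability that a batch of n files completes with no failure, assuming
    // each file independently triggers the crash with probability p.
    static double successProbability(double p, long n) {
        return Math.pow(1.0 - p, n);
    }

    public static void main(String[] args) {
        // Assumed hazard, not measured: chosen so a 50,000-file batch
        // succeeds roughly 1 run in 10, matching the reported experience.
        double p = 4.6e-5;
        System.out.printf("50k batch success:  %.4f%n",
                successProbability(p, 50_000));
        System.out.printf("550k batch success: %.2e%n",
                successProbability(p, 550_000));
    }
}
```

At that assumed rate the full 550,000-file set almost never completes, so retrying 50,000-file sub-sets really is the cheaper strategy, as the comment suggests.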