Closed: shinji257 closed this 7 years ago
So, just as a point of reference: I'm pretty sure that after about 5-10 episodes (or so; it depends) memory usage from Python would balloon to 4GB and downloads would stop entirely. After applying this one-line patch it has pulled close to 20 episodes and memory usage is holding flat at 30-40MB.
Worst case, this patch has no effect. It doesn't run until file cleanup is done at the end of the episode run, and if this leak didn't exist before then this line won't do anything.
I was wrong. Python's memory starts inflating when the process gets stuck downloading. When I break it off it throws the traceback below, but I'm not sure there is an easy workaround here.
Traceback (most recent call last):
File "C:\Users\paperspace\Desktop\Crunchyroll-XML-Decoder-master\crunchy-xml-decoder.py", line 369, in <module>
makechoise()
File "C:\Users\paperspace\Desktop\Crunchyroll-XML-Decoder-master\crunchy-xml-decoder.py", line 315, in makechoise
queueu('.\\queue.txt')
File "C:\Users\paperspace\Desktop\Crunchyroll-XML-Decoder-master\crunchy-xml-decoder.py", line 106, in queueu
ultimate.ultimate(line.rstrip('\n'), '', '')
File "crunchy-xml-decoder\ultimate.py", line 294, in ultimate
video_hls(filen, video_input)
File "crunchy-xml-decoder\hls.py", line 112, in video_hls
fetch_streams(output, video)
File "crunchy-xml-decoder\hls.py", line 83, in fetch_streams
copy_with_decrypt(raw, output, video.keys[0], video.media_sequence + n)
File "crunchy-xml-decoder\hls.py", line 69, in copy_with_decrypt
data = input.read(blocksize)
File "crunchy-xml-decoder\hls.py", line 46, in read
data = self.stream.read(min(n, self.file_size - self.offset))
File "C:\Python27\lib\socket.py", line 340, in read
def read(self, size=-1):
KeyboardInterrupt
Terminate batch job (Y/N)? y
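The traceback shows the hang is inside `stream.read()` in hls.py, blocking on the socket. One possible mitigation (this is an assumption, not code from the project) is to set a default socket timeout so a stalled download raises `socket.timeout` instead of blocking forever, and then treat a timeout as end-of-stream:

```python
import socket

# Assumption: giving new sockets a read timeout makes hls.py's
# stream.read() raise socket.timeout on a stalled download instead of
# hanging indefinitely. 30 seconds is an arbitrary illustrative value.
socket.setdefaulttimeout(30)

def read_with_timeout(stream, n):
    """Read up to n bytes; return b'' if the stream stalls (hypothetical helper)."""
    try:
        return stream.read(n)
    except socket.timeout:
        return b''
```

The caller would then see an empty read and could retry or abandon the segment rather than sitting in the blocked `read()` forever.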
I'm testing a possible workaround that restarts the process when the memory error happens, but I'll need it to happen again to test it. XD
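A restart-on-failure workaround could look something like the sketch below (the wrapper and command are hypothetical, not part of the repo): run the decoder in a child process and relaunch it whenever it dies with a nonzero exit code, e.g. after a MemoryError, so the queue keeps draining.

```python
import subprocess
import sys

def run_until_clean(cmd, max_restarts=5):
    """Rerun cmd until it exits 0 or we hit max_restarts.

    Returns the number of attempts made. A nonzero exit (such as a crash
    from a MemoryError) triggers a restart; a clean exit means the queue
    finished.
    """
    attempts = 0
    while attempts < max_restarts:
        attempts += 1
        if subprocess.call(cmd) == 0:
            break  # clean exit: nothing left to download
    return attempts

# Hypothetical usage; the script name comes from the traceback above:
# run_until_clean([sys.executable, "crunchy-xml-decoder.py"])
```

This doesn't fix the leak, it just bounds the damage: each fresh process starts back at 30-40MB.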
Add a gc.collect() call after file cleanup is done and before announcing that the episode is done. I observed that RAM would fill up after several episodes, so this is an attempt to prevent that by forcing a garbage collection after each episode finishes.