fenghuibao / dSort-Seq


Tensorflow OUT_OF_RANGE: End of Sequence error #1

Closed stevechen123 closed 1 week ago

stevechen123 commented 1 week ago

Hi,

First of all, thanks for creating this tool for analyzing Sort-Seq data. From the paper, it sounds like a wonderful tool for analyzing Sort-Seq data with precision, which I am excited to potentially apply towards my Sort-Seq experiments.

After downloading your script, I ran it on the provided example dataset. The run proceeds without issue until roughly iteration 850-870, where it throws an OUT_OF_RANGE: End of sequence error (see attached screenshot) and then terminates. Any guidance on how to resolve this error would be appreciated. Thanks in advance.

[Screenshot (2024-09-26): terminal output showing the OUT_OF_RANGE: End of sequence error]
fenghuibao commented 1 week ago

Hi Steve,

Please refer to this link: https://stackoverflow.com/questions/53930242/how-to-fix-a-outofrangeerror-end-of-sequence-error-when-training-a-cnn-with-t. In my case, I used 2,781,128 cytometry points (which becomes 5,562,256 for repeat 2). With 6,400 points per batch and 869 iterations, the script processes 5,561,600 points in total, which is just short of the dataset size, so the final batch is a partial one. I hadn't encountered this issue before because I had already set drop_remainder=True.
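The batch accounting above can be checked directly. Below is a plain-Python sketch (not the tool's actual code) that reproduces the arithmetic from this thread and illustrates what drop_remainder=True does: it yields only complete batches and silently discards the trailing partial one, instead of letting the iterator run off the end of the data.

```python
# Batch accounting for the run described above (numbers from the thread).
total_points = 5_562_256  # cytometry points for repeat 2
batch_size = 6_400

full_batches = total_points // batch_size   # 869 complete batches
leftover = total_points % batch_size        # 656 points left in the partial batch
processed = full_batches * batch_size       # 5,561,600 points actually consumed

# Toy illustration of drop_remainder semantics on a small sequence:
def batched(seq, size, drop_remainder=True):
    batches = [seq[i:i + size] for i in range(0, len(seq), size)]
    if drop_remainder and batches and len(batches[-1]) < size:
        batches.pop()  # discard the short final batch instead of yielding it
    return batches

# 10 items in batches of 4 -> [0..3], [4..7]; the partial [8, 9] is dropped.
assert len(batched(list(range(10)), 4)) == 2
```

With drop_remainder=False (the default in tf.data.Dataset.batch), the short final batch is still yielded, and the iteration count no longer matches total_points // batch_size, which is the mismatch behind the error reported here.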

Additionally, although the program terminates at the last batch, the results with the smallest dist_loss should already be saved, so this shouldn't impact any further analysis.

I'll try running it again on my computer and will adjust the script as needed. Thanks for pointing it out!

Best, Huibao

stevechen123 commented 1 week ago


Hi Huibao,

Thank you for the prompt reply.

Yes, I had read the Stack Overflow page before posting the issue. The original script I ran already contained the drop_remainder=True flag but threw the error nevertheless, which led me to open this issue.

Your explanation in terms of the number of cytometry data points was very helpful and makes a lot of sense. Since the completed iterations are sufficient for the script's intended analysis, I will close this issue.

Once again, thank you for your help! Steve