Closed · weinberz closed this issue 4 years ago
Hello @weinberz,
Thank you for reporting this issue. Unfortunately we are not able to reproduce it.
Does the issue come up if you run one of our examples or is it a completely different run? Are you currently working on the latest master or on the latest release?
Thank you for your help.
This was happening for me both on the latest release and on a fresh install from the latest master commit.
This was NOT on one of the examples - I substituted my own TIFs. Thinking about it, it could be an issue with the bit-depth of the files used; I'll test this and get back to you.
I get the exact same error with my own TIFs.
What are the dimensions of the TIFs you are using? Would it be possible to get such a TIF or maybe even a minimal example showcasing the bug?
Hope we can figure out where this issue is coming from! 🤞
Hello, I could reproduce this with your data and a new installation following the instructions.
Environment details after following the instructions: conda_list_env.txt
Here is a renamed notebook file with the error trace: 01_training_rename.txt
Should the problem lie on my side, please let me know.
Edit: The workaround above does the trick
I've solved this issue by downgrading keras from 2.3.1 to 2.2.5. I will do some more debugging and update the master branch as needed.
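For anyone hitting the same error, the downgrade described above can be done with pip (a sketch; version numbers are the ones from this thread, your environment may differ):

```shell
# Replace keras 2.3.1 with the last version reported to work here
pip install "keras==2.2.5"
```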
Will be pushing updated version 0.1.9 to pip. No code changes, just an updated README and a failsafe setup.py.
New version released.
My tests with the new version are successful. Thank you.
I generated a fresh N2V installation following the instructions in the readme, for TensorFlow 1.14 and Python 3.6.
When training, right after the first epoch, I would get an AttributeError ("'float' object has no attribute 'item'") from line 302 of n2v_standard.py. Changing that line to:
summary_value.simple_value = value
in a local copy of N2V fixed the issue. Not sure if this is due to my setup or something else but thought I'd report.
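The crash and the one-line fix can be reproduced in isolation: the summary-writing code assumes the logged value is a NumPy scalar with an `.item()` method, but some Keras versions hand it a plain Python float. A defensive sketch that handles both (the `to_python_float` helper is hypothetical, not part of N2V):

```python
import numpy as np

def to_python_float(value):
    # NumPy scalars expose .item(); plain Python floats do not,
    # which is exactly what raises the reported AttributeError.
    return value.item() if hasattr(value, "item") else float(value)

# Both the old (NumPy scalar) and new (plain float) cases now work:
print(to_python_float(np.float32(0.25)))  # NumPy scalar -> 0.25
print(to_python_float(0.25))              # plain float  -> 0.25
```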
Please let me know if other info about my setup or N2V config would be helpful!