Closed WilliamCancino closed 3 years ago
Did you not have any errors reported along the way in Google Colab? I don't recall us trying to run the code in Colab, only on a local machine, following the instructions in the README.
Yes, I did get some errors but they were not significant.
These were the modifications I made per file:
nn.py
import tensorflow as tf
was changed to import tensorflow.compat.v1 as tf
nn_evaluate.py
hdf5 = hdf5_handler("./data/abide.hdf5", "a")
was changed to hdf5 = hdf5_handler(bytes("./data/abide.hdf5", encoding="utf8"), 'a')
results.append(nn_results(hdf5, experiment, code_size_1, code_size_2))
was changed to results.append(nn_results(hdf5, experiment[0], code_size_1, code_size_2))
print(df)
With these changes you can run the code without errors.
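For context on the nn_evaluate.py change: the repository's hdf5_handler apparently expects the file path as bytes rather than str (inferred from the fix above, not from its documentation). The conversion itself is plain Python and can be checked in isolation:

```python
# hdf5_handler (from this repository) is called with a bytes path, not a str.
# Only the str -> bytes conversion is shown here; it needs no h5py or TensorFlow.
path = "./data/abide.hdf5"
path_bytes = bytes(path, encoding="utf8")   # same as path.encode("utf8")

assert path_bytes == b"./data/abide.hdf5"
assert isinstance(path_bytes, bytes)

# hdf5 = hdf5_handler(path_bytes, "a")      # as in the modified nn_evaluate.py
```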
@anibalsolon did you ever try using this in Google Colab?
Not really, but that's an odd behavior... I guess it depends on how the code is being run, were you able to fully train the model? The model is stored in Tensorflow checkpoints btw, not in the hdf5 file.
Yes, I managed to fully train it on Google Colab. I also tested the code on Ubuntu with the same modifications mentioned above, and there it works properly, i.e. it does not predict all cases as ASD (0). So the problem is specific to Google Colab.
William, alas, we do not have much time to diagnose why Colab is giving you grief. If you manage to sort this problem out, we would be very grateful for a pull request with the fix, but we do not have the bandwidth to look into it ourselves.
Hi,
I am writing because I have reviewed and tested your code using the "cc200" derivative with 10 folds on the entire dataset. However, at evaluation time all cases are predicted as ASD (0). I have also tried other configurations with the same result, and I don't understand why this is happening. It seems that the final weights and biases of the autoencoders are not being saved. I have run all the tests from Google Colaboratory. I would appreciate any instructions or suggestions on this.
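For reference, the degenerate behavior described above can be confirmed by counting the predicted labels; a minimal sketch using a hypothetical predictions list (in practice these would come from the evaluation step in nn_evaluate.py, with 0 = ASD as in the output I observed):

```python
from collections import Counter

# Hypothetical predicted labels for illustration only.
preds = [0, 0, 0, 0, 0, 0]

counts = Counter(preds)
print(counts)                 # a degenerate run shows a single key, e.g. Counter({0: 6})
collapsed = len(counts) == 1  # True when every case gets the same class
```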
Thank you very much!