I'm trying to run the test code test_wddgan.py and have run into a problem. I tested the pretrained checkpoint netG_475.pth (celeba_256), and it reported the error below when loading celebahq_stat.npy. I hit the same problem when computing the FID score with the pretrained celeba_512 checkpoint.
array = pickle.load(fp, **pickle_kwargs)
_pickle.UnpicklingError: pickle data was truncated
Then I tried to recompute the npz file for celeba_256 using NVAE's scripts/precompute_fid_statistics.py and tested netG_475.pth again. The npz file could be loaded, but the resulting FID score was abnormally large. How should I deal with this error? Thanks!
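For what it's worth, a "pickle data was truncated" error from np.load usually means the stats file on disk is incomplete (e.g., an interrupted download), not a code bug. Below is a minimal sketch of a sanity check that tries to load a .npy file and reports whether it is readable; the file names and the simulated truncation are purely illustrative, not part of the original setup:

```python
import os
import pickle
import tempfile

import numpy as np


def check_npy(path):
    """Return True if the .npy file loads cleanly, False if truncated/corrupt."""
    try:
        np.load(path, allow_pickle=True)
        return True
    except (pickle.UnpicklingError, EOFError, ValueError, OSError):
        return False


# Demonstration: truncating a pickled-object .npy reproduces the same
# kind of failure seen when a download is cut off partway.
with tempfile.TemporaryDirectory() as d:
    good = os.path.join(d, "stat.npy")
    np.save(good, {"mu": np.zeros(3), "sigma": np.eye(3)}, allow_pickle=True)

    bad = os.path.join(d, "stat_truncated.npy")
    with open(good, "rb") as f:
        data = f.read()
    with open(bad, "wb") as f:
        f.write(data[: len(data) // 2])  # simulate an incomplete download

    print(check_npy(good))  # True
    print(check_npy(bad))   # False
```

If the check fails on your downloaded celebahq_stat.npy, re-downloading it (and comparing the on-disk size against the size reported by the host) is the first thing to try before recomputing statistics.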