Running `StatePred.train_net()` produces `RuntimeError: linalg_eig_backward: The eigenvectors in the complex case are specified up to multiplication by e^{i phi}. The specified loss function depends on this quantity, so it is ill-defined`.

Closed (souryadey closed this issue 2 years ago).
This happens due to numerical instabilities.
Possible cause: The dominant eigenvalue has magnitude significantly greater than 1.

Capturing this: The magnitude of the dominant eigenvalue is recorded in the log file after each epoch.
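As a quick sanity check outside the log file, the dominant eigenvalue magnitude can also be computed directly. A minimal sketch, assuming the learned Koopman matrix is available as a PyTorch tensor (the variable `K` here is a hypothetical stand-in, not the repo's actual attribute):

```python
import torch

K = torch.randn(10, 10)  # hypothetical stand-in for the learned Koopman matrix
eigvals = torch.linalg.eigvals(K)  # complex eigenvalues of K
dominant_magnitude = eigvals.abs().max().item()
print(f"Dominant eigenvalue magnitude = {dominant_magnitude:.4f}")
# A magnitude significantly greater than 1 means the linear dynamics
# grow exponentially over time steps, which can destabilize training.
```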
Possible cause: Training data is not normalized.

Solution: Set `normalize_Xdata = True` in `config.py`. This is the default.
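For reference, a sketch of what the relevant line in `config.py` would look like (the surrounding file contents are assumed, not shown here):

```python
# config.py: keep input data normalization enabled (this is the default)
normalize_Xdata = True  # normalize the X training data to avoid numerical blow-up
```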
Possible cause: `StatePred.rank` is too high.

Solution: Reduce `rank` so that `rank < encoded_size`. This is because `rank` is supposed to reduce the effective number of encoded states to a value lower than `encoded_size`. (Note that numerical issues can also occur if `rank` is higher than any of the numbers in `encoder_hidden_layers`.)
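A minimal sketch of validating these constraints before training; the example values are hypothetical, and only the parameter names (`rank`, `encoded_size`, `encoder_hidden_layers`) come from this repo:

```python
# Hypothetical hyperparameter values; only the names mirror the repo's config.
encoded_size = 50
encoder_hidden_layers = [100, 80]
rank = 40

# rank must be strictly less than encoded_size
assert rank < encoded_size, "rank must be < encoded_size"

# numerical issues can also occur if rank exceeds any encoder hidden layer size
assert all(rank <= h for h in encoder_hidden_layers), \
    "rank should not exceed any entry in encoder_hidden_layers"
```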