Open BrianMozy opened 2 weeks ago
I encountered a similar problem too, but during the main training, not at initialisation.
With an HMM, a poor initialisation can sometimes lead to some states having near-zero or zero posterior probabilities. One way to confirm this is to print the state fractional occupancies after training: if a state has zero or very low fractional occupancy, then this is what happened. You could also try reducing the `sequence_length`, since that can alleviate numerical underflow problems.
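As a quick sanity check, here is a minimal sketch (plain NumPy, not tied to any particular HMM library) of computing fractional occupancies from the posterior state probabilities, assuming you can get them as an array of shape `(n_timepoints, n_states)`:

```python
import numpy as np

def fractional_occupancy(gamma):
    """Fraction of timepoints at which each state is the most probable one.

    gamma: posterior state probabilities, shape (n_timepoints, n_states).
    """
    states = gamma.argmax(axis=1)          # hard state assignment per timepoint
    n_states = gamma.shape[1]
    return np.bincount(states, minlength=n_states) / len(states)

# Toy posteriors: state 2 never wins, so its occupancy is zero,
# which would flag a collapsed state.
gamma = np.array([[0.9, 0.05, 0.05],
                  [0.8, 0.15, 0.05],
                  [0.2, 0.75, 0.05]])
print(fractional_occupancy(gamma))
```

A state whose occupancy prints as exactly (or almost exactly) zero is the symptom described above.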
If you can try different random seeds and identify which one reproduces this error, we could debug it more easily.
It might be that the way the initial covariances are chosen makes you more sensitive to what Sungjun described.
Can you run a couple of tests:
I've hit a bug where, when training an HMM, the initialisation occasionally fails with an error like this:
However, with the same training config, everything is fine most of the time. Please find my HMM training script attached.
train_hmm.txt