@michaelbrownid I created this PR to address your issue, and also incorporated the other dependency and doc updates you had in your branch.
Can you try the changes out in your branch and let me know if your issues are resolved?
Thanks!
Fixing issue raised in #136 that was causing inference to fail
The root cause is that softmax was applied twice in the initial version. That was fixed in a previous PR, but it left the network outputting logits instead of probabilities, which broke quality score generation for consensus. This PR fixes that by making the softmax call in the network's final layer optional, so it can be enabled for inference only.
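For reference, a minimal sketch of the optional-softmax pattern (class and parameter names here are illustrative, not the actual code in this repo, and it assumes a PyTorch-style final layer):

```python
import torch.nn as nn
import torch.nn.functional as F


class ConsensusHead(nn.Module):
    """Final layer that can optionally apply softmax to its logits.

    `apply_softmax` is an illustrative flag: left False during training so a
    loss such as CrossEntropyLoss receives raw logits (avoiding a double
    softmax), and set True at inference so downstream quality-score
    computation receives probabilities.
    """

    def __init__(self, hidden_size, num_classes, apply_softmax=False):
        super().__init__()
        self.linear = nn.Linear(hidden_size, num_classes)
        self.apply_softmax = apply_softmax

    def forward(self, x):
        logits = self.linear(x)
        if self.apply_softmax:
            # Probabilities for inference / quality score generation.
            return F.softmax(logits, dim=-1)
        # Raw logits for training.
        return logits
```

With this shape, training constructs the head with `apply_softmax=False` so the loss sees logits, and inference toggles it to `True` so consensus quality scores are computed from proper probabilities.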