Closed: Veldhoen closed this issue 4 months ago
In https://github.com/beeldengeluid/dane-visual-feature-extraction-worker/blob/main/nn_models.py#L223 an extra component (`fc`, a linear layer) is added to the VisualNet. When reconstructing just the VisualNet, Torch complains that the checkpoint contains these values, which are not defined in the VisualNet definition (https://github.com/beeldengeluid/dane-visual-feature-extraction-worker/blob/main/nn_models.py#L13).
Not sure how to proceed: should I add the layer to the definition (which introduces backwards-compatibility issues)? Or discard it from the checkpoint (which also impacts backwards compatibility, as the model output would be different)?
The same goes for the final layers in AVNet: linear/relu/linear layers are added (lin1/relu/lin2). Should I keep these?
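For reference, the mismatch can be reproduced and inspected without touching the checkpoint at all: `load_state_dict(..., strict=False)` tolerates checkpoint keys that the model does not define and reports them instead of raising. This is a minimal sketch with stand-in module definitions (the real classes in nn_models.py are larger; only the extra `fc` matters here):

```python
import torch.nn as nn

# Stand-in for the plain VisualNet definition (assumption: the real class
# has more layers; only the key mismatch is relevant for this sketch).
class VisualNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)

# Stand-in for the variant that adds the extra fc layer before saving.
class VisualNetWithFC(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, 3)
        self.fc = nn.Linear(8, 4)

# A checkpoint produced from the fc-augmented variant.
checkpoint = VisualNetWithFC().state_dict()

model = VisualNet()
# strict=False: keys present in the checkpoint but absent from the model
# land in result.unexpected_keys instead of raising a RuntimeError.
result = model.load_state_dict(checkpoint, strict=False)
print(sorted(result.unexpected_keys))  # ['fc.bias', 'fc.weight']
```

With `strict=True` (the default) the same call raises, which is the error described above.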
The fc layer was never used in the forward pass, so I just leave it out of the checkpoint for VisualNet.