luigibonati / mlcolvar

A unified framework for machine learning collective variables for enhanced sampling simulations
MIT License

Make last layer activation of committor an explicit block #135

Closed · EnricoTrizio closed 3 months ago

EnricoTrizio commented 3 months ago

Description

To allow easy deactivation of the last-layer sigmoid activation, it would be better to treat it as an explicit block or as a post-processing step.

See also #134
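
For illustration, here is a minimal sketch of what exposing the activation as its own block could look like. The class and argument names (`LastLayerSigmoid`, `CommittorSketch`, `apply_sigmoid`) are hypothetical, not the actual mlcolvar API:

```python
import torch

class LastLayerSigmoid(torch.nn.Module):
    """Hypothetical standalone activation block for the committor output."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(x)

class CommittorSketch(torch.nn.Module):
    """Hypothetical committor wrapper with a switchable final activation."""
    def __init__(self, nn: torch.nn.Module, apply_sigmoid: bool = True):
        super().__init__()
        self.nn = nn
        # As a separate block, the activation can be swapped for Identity
        # to deactivate it without touching the underlying network.
        self.activation = LastLayerSigmoid() if apply_sigmoid else torch.nn.Identity()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.activation(self.nn(x))

# Usage: toggle the sigmoid off, e.g. to work directly in logit space.
model = CommittorSketch(torch.nn.Linear(2, 1), apply_sigmoid=False)
```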

Status

codecov[bot] commented 3 months ago

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 92.51%. Comparing base (f5598d5) to head (0d85fde).

EnricoTrizio commented 3 months ago

> Any reason to keep the underscore in Custom_Sigmoid and Shifted_Softplus?

Shifted_Softplus has always been named that way, so I just followed the existing convention.
We can also change it; it shouldn't be a problem. The name only appears in nn.utils and in the model summaries shown in some notebooks' output cells.

andrrizzi commented 3 months ago

I see, thanks. In principle, if Shifted_Softplus was already part of a previous release, we'd have to deprecate it first and remove it only when we release 2.0. If that's the case, we could just rename the classes and add something like

```python
import warnings

class Shifted_Softplus(ShiftedSoftplus):
    """Deprecated alias kept for backward compatibility."""
    def __init__(self, *args, **kwargs):
        warnings.warn("Shifted_Softplus is deprecated, use ShiftedSoftplus instead.", DeprecationWarning)
        super().__init__(*args, **kwargs)
```
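
As a quick sanity check (a sketch that assumes the alias above and a no-argument constructor, not the actual class signatures), instantiating the old name would emit the warning while still behaving like ShiftedSoftplus:

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # DeprecationWarning is hidden by default outside __main__
    act = Shifted_Softplus()  # the old name still constructs a working module...
assert any(issubclass(w.category, DeprecationWarning) for w in caught)  # ...and reports the deprecation
```

Using DeprecationWarning as the category (rather than the default UserWarning) also lets downstream code silence or surface it explicitly via warning filters.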