Closed bigbrother33 closed 5 years ago
The callback in spacecutter
applies the clip function to your PyTorch module after each training batch (i.e., after you have calculated the loss and stepped the optimizer).
You could implement the callback yourself by creating an ascension_callback
closure and manually applying the clip function:
```python
from spacecutter.models import LogisticCumulativeLink

def ascension_callback(margin=0.0, min_val=-1.0e6):
    def _clip(module):
        # Only clip the cutpoints of the ordinal link layer.
        if isinstance(module, LogisticCumulativeLink):
            cutpoints = module.cutpoints.data
            for i in range(cutpoints.shape[0] - 1):
                # Keep each cutpoint at least `margin` below its right neighbour.
                cutpoints[i].clamp_(
                    min_val, cutpoints[i + 1] - margin
                )
    return _clip

callback = ascension_callback()

# In your training loop, do the following:
for data in data_iterator:
    # Calculate loss
    # Step optimizer
    model.apply(callback)
```
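To see what the clip actually does to the cutpoints, here is a minimal pure-Python sketch of the same clamp logic on a plain list (no torch required). The function name `ascension_clip` is hypothetical; it mirrors the in-place `clamp_` above: each cutpoint is clamped between `min_val` and its right neighbour minus `margin`, which restores ascending order after an optimizer step breaks it.

```python
def ascension_clip(cutpoints, margin=0.0, min_val=-1.0e6):
    """Pure-Python stand-in for the _clip closure above."""
    for i in range(len(cutpoints) - 1):
        # Upper bound: the next cutpoint minus the margin.
        upper = cutpoints[i + 1] - margin
        cutpoints[i] = max(min_val, min(cutpoints[i], upper))
    return cutpoints

# A batch update pushed the second cutpoint above the third;
# the clip pulls it back down so the sequence is non-decreasing.
print(ascension_clip([-1.0, 2.5, 2.0, 3.0]))  # → [-1.0, 2.0, 2.0, 3.0]
```

Note that with `margin=0.0` adjacent cutpoints may end up tied; pass a small positive `margin` if you need strictly increasing cutpoints.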
Thanks! I have done it, and it works!!
Actually, I don't want to use skorch (I'd like more flexibility), so how should I change my training code? I don't know how to add a callback to my training code... Thanks very much!!!