jaideep11061982 opened this issue 4 years ago (status: Open)
Not sure why, but my loss keeps oscillating around a few values, so the model doesn't converge.
```python
self.link = LogisticCumulativeLink1(6, init_cutpoints='ordered')

def forward(self, x):
    x = self.enc(x)
    x = self.head(x)  # its output dim is 1
    x = self.link(x)
    return x

loss = CumulativeLinkLoss()
```
Sorry, I can't really help debug -- it could have something to do with the library, or it could be something else. Perhaps you can try training the same network as a regular regression model and see whether you can fit that. If you can, then the problem is likely specific to the ordinal-regression part.
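To make the suggested sanity check concrete, here is a hedged sketch of fitting the same encoder + head as plain regression on the ordinal labels treated as numbers. The layer sizes, data, and optimizer settings are all assumptions for illustration, not the original model; the point is only the shape of the experiment.

```python
import torch

torch.manual_seed(0)

# Assumed stand-ins for the user's self.enc and self.head
enc = torch.nn.Linear(4, 8)
head = torch.nn.Linear(8, 1)
model = torch.nn.Sequential(enc, torch.nn.ReLU(), head)

X = torch.randn(64, 4)
y = torch.randint(0, 6, (64, 1)).float()  # ordinal labels used as regression targets

opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = torch.nn.MSELoss()

init_loss = loss_fn(model(X), y)
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
# If the loss oscillates here too, the issue is upstream of the
# ordinal-regression machinery (data, encoder, learning rate, ...).
```

If this regression loss decreases cleanly but the cumulative-link version still oscillates, that points the debugging toward the link/loss rather than the encoder.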
Hi, could you help me understand the piece of code below? Why do we subtract adjacent elements in link_mat and then concatenate them? Isn't cutpoints - X alone sufficient?
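For what it's worth, here is a sketch (not the library's exact code) of why differencing is needed in a cumulative link model: sigmoid(cutpoint_k - x) gives the *cumulative* probability P(y <= k), so individual class probabilities come from subtracting adjacent cumulative columns and concatenating them with the two boundary columns. The cutpoint values below are assumptions for illustration.

```python
import torch

num_classes = 6
cutpoints = torch.arange(num_classes - 1, dtype=torch.float)  # assumed ascending
x = torch.tensor([[0.7], [2.3]])  # latent scores, shape (batch, 1)

cum_probs = torch.sigmoid(cutpoints - x)        # P(y <= k), shape (batch, 5)
first = cum_probs[:, :1]                        # P(y = 0)
middle = cum_probs[:, 1:] - cum_probs[:, :-1]   # P(y = k) for the middle classes
last = 1 - cum_probs[:, -1:]                    # P(y = num_classes - 1)
probs = torch.cat([first, middle, last], dim=1)

print(probs.sum(dim=1))  # each row sums to 1
```

So cutpoints - X only gets you the cumulative probabilities; the subtraction-and-concatenation step converts them into a proper per-class distribution.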
Also, when does this AscensionCallback get called: at the start or end of every batch, or every epoch?
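My understanding (hedged, since I haven't traced the library's callback wiring here) is that this kind of callback runs at the end of every batch, right after the optimizer step, so the cutpoints are clamped back into ascending order before the next forward pass. A minimal hand-rolled sketch of that idea, with all names assumed:

```python
import torch

def ascend_cutpoints(cutpoints: torch.Tensor, margin: float = 0.0) -> None:
    # Hypothetical stand-in for the callback's job: clamp each cutpoint
    # so it stays at least `margin` above its predecessor.
    with torch.no_grad():
        for i in range(1, cutpoints.numel()):
            cutpoints[i].clamp_(min=cutpoints[i - 1].item() + margin)

# A gradient step has left the cutpoints out of order:
cutpoints = torch.nn.Parameter(torch.tensor([0.0, -0.5, 1.0, 0.9, 2.0]))
ascend_cutpoints(cutpoints.data, margin=1e-3)  # "end of batch" fix-up
print(cutpoints.data)  # ascending after the clamp
```

In a training loop this would sit immediately after `optimizer.step()`, i.e. once per batch rather than once per epoch.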