EthanRosenthal / spacecutter

Ordinal regression models in PyTorch
https://www.ethanrosenthal.com/2018/12/06/spacecutter-ordinal-regression/
MIT License

Under OrdinalModule #5

Open jaideep11061982 opened 4 years ago

jaideep11061982 commented 4 years ago

Hi, could you help me understand the piece of code below? Why do we subtract the elements in link_mat and then concatenate them? Isn't `cutpoints - X` sufficient on its own?

```python
sigmoids = torch.sigmoid(cutpoints - X)
link_mat = sigmoids[:, 1:] - sigmoids[:, :-1]
link_mat = torch.cat((
    sigmoids[:, [0]],
    link_mat,
    (1 - sigmoids[:, [-1]])
), dim=1)
```
2) When does the AscensionCallback get called: at the start of every batch or epoch, or at the end of a batch or epoch?

EthanRosenthal commented 4 years ago

The link_mat comes from the middle line of this equation (from my blog post on this):

[image: the cumulative link probabilities from the blog post, i.e. P(y = 0 | X) = σ(c₀ − f(X)); P(y = k | X) = σ(c_k − f(X)) − σ(c_{k−1} − f(X)); P(y = K−1 | X) = 1 − σ(c_{K−2} − f(X))]

Adjacent cutpoints have to be subtracted from each other.
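To make the point concrete, here is a small self-contained sketch (with made-up values for `X` and `cutpoints`) of why `cutpoints - X` alone is not enough: the sigmoids of the cutpoints give *cumulative* probabilities P(y ≤ k | X), and only the differences of adjacent columns, plus the two boundary terms, give per-class probabilities that sum to 1.

```python
import torch

X = torch.tensor([[0.3], [-1.2]])           # latent predictions, shape (2, 1)
cutpoints = torch.tensor([-1.0, 0.0, 1.0])  # K-1 = 3 cutpoints -> K = 4 classes

# Cumulative probabilities P(y <= k | X), shape (2, 3) via broadcasting.
sigmoids = torch.sigmoid(cutpoints - X)

# Differencing adjacent columns converts cumulative probabilities into
# per-class probabilities; the concatenation adds the two boundary classes.
link_mat = torch.cat((
    sigmoids[:, [0]],                    # P(y = 0)   = sigma(c_0 - X)
    sigmoids[:, 1:] - sigmoids[:, :-1],  # P(y = k)   = sigma(c_k) - sigma(c_{k-1})
    1 - sigmoids[:, [-1]],               # P(y = K-1) = 1 - sigma(c_{K-2} - X)
), dim=1)

print(link_mat.sum(dim=1))  # each row sums to 1
```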

The AscensionCallback gets called at the end of every batch.

jaideep11061982 commented 4 years ago

Not sure why, but my loss keeps oscillating around a few values, so the model doesn't converge:

```python
self.link = LogisticCumulativeLink(6, init_cutpoints='ordered')

def forward(self, x):
    x = self.enc(x)
    x = self.head(x)  # its output dim is 1
    x = self.link(x)
    return x

loss = CumulativeLinkLoss()
```

EthanRosenthal commented 4 years ago

Sorry I can't really help debug -- it could have something to do with the library or be something else. Perhaps you can try training as a regular regression model to see if you can fit that. If you can, then maybe it's something to do with ordinal regression, specifically.
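For reference, here is a hedged sketch of that sanity check: train the same kind of backbone as a plain regression (no link layer, MSE loss on the labels treated as floats) and see whether the loss drops. The architecture and data below are illustrative stand-ins, not the poster's actual model.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for the poster's enc + head: features -> single latent value.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

X = torch.randn(64, 10)
y = torch.randint(0, 6, (64, 1)).float()  # ordinal labels 0..5 as floats

first_loss = None
for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    if first_loss is None:
        first_loss = loss.item()
    loss.backward()
    opt.step()

# If the loss drops substantially, the backbone can fit the data, and the
# problem is more likely in the ordinal link / loss setup.
print(first_loss, loss.item())
```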