12wang3 / rrl

The code of the NeurIPS 2021 paper "Scalable Rule-Based Representation Learning for Interpretable Classification" and the TPAMI paper "Learning Interpretable Rules for Scalable Data Representation and Classification"
MIT License

Can RRL be used for regression? #1

Closed KongMingxi closed 3 years ago

KongMingxi commented 3 years ago

Hi authors,

Thanks for this great work! I think it is very helpful for data analysis. I wonder whether it can also be used for regression and, if so, how to do that?

Best,

Mx

12wang3 commented 3 years ago

Hi KongMingxi,

RRL can be used for regression because the last layer is a linear (fully connected) layer. To do regression, you just need to change the loss function, i.e., remove the softmax and use a loss function designed for regression (e.g., MSE). Note that you also need to compute $\frac{\partial \mathrm{Loss}}{\partial \bar{Y}}$ manually, as in the following code:

https://github.com/12wang3/rrl/blob/656098b4d5d009f5bd236749f831529bf78dead2/rrl/models.py#L179

What's more, the regression performance of RRL has not been verified by experiments, so I am not sure whether RRL will get good results.
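
Editor's note: below is a minimal sketch (not the authors' code) of the regression adaptation described above. It assumes a PyTorch setup where `y_bar` is the raw output of the final linear layer (no softmax) and `y_true` holds continuous targets; it computes the MSE loss together with the analytic gradient $\frac{\partial \mathrm{Loss}}{\partial \bar{Y}}$, analogous to the classification gradient computed in `rrl/models.py`.

```python
# Hypothetical sketch: MSE loss and its gradient w.r.t. the raw RRL output.
# Variable names (y_bar, y_true) and the function itself are illustrative
# assumptions, not part of the RRL codebase.
import torch


def mse_loss_and_grad(y_bar: torch.Tensor, y_true: torch.Tensor):
    """Return the MSE loss and d(Loss)/d(y_bar) for a batch.

    y_bar:  (batch_size, 1) raw output of the final linear (FC) layer.
    y_true: (batch_size, 1) continuous regression targets.
    """
    n = y_bar.shape[0]
    diff = y_bar - y_true
    loss = (diff ** 2).mean()                # MSE = (1/n) * sum((y_bar - y_true)^2)
    grad = (2.0 * diff / n).detach()         # d(MSE)/d(y_bar), computed analytically
    return loss, grad


# Usage sketch: feed the analytic gradient back through the network, mirroring
# how the classification code backpropagates d(Loss)/d(Y_bar) from the softmax.
if __name__ == "__main__":
    y_bar = torch.randn(8, 1, requires_grad=True)
    y_true = torch.randn(8, 1)
    loss, grad = mse_loss_and_grad(y_bar, y_true)
    y_bar.backward(grad)                     # equivalent to loss.backward() here
    print(loss.item(), y_bar.grad.norm().item())
```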

KongMingxi commented 3 years ago

Many thanks for your reply!