ragulpr / wtte-rnn

WTTE-RNN a framework for churn and time to event prediction

Extracting feature importance #35

Open adam-haber opened 6 years ago

adam-haber commented 6 years ago

Is it possible to extract "feature importance" from a WTTE-RNN model?

ragulpr commented 6 years ago

In theory, feature importance for neural networks is a tricky question; as far as I know there's no nice trick or clear winning strategy like there is for trees (but I know very little about the subject).

There are some tricks, ranging from advanced plotting to checking the derivative of the output with respect to the inputs, or maybe training with dropout, shutting off certain inputs, and evaluating on the test set. Maybe this question can get you started.
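For the derivative-with-respect-to-inputs idea, here's a minimal sketch using `tf.GradientTape`, assuming a trained TF2/Keras model `model` that outputs the two Weibull parameters per timestep, and a batch `x_batch` of shape `(samples, timesteps, features)`. The names `model` and `x_batch` are placeholders, not part of the library:

```python
import numpy as np
import tensorflow as tf

# Hypothetical trained model and test batch (not part of wtte-rnn itself).
x = tf.convert_to_tensor(x_batch, dtype=tf.float32)

with tf.GradientTape() as tape:
    tape.watch(x)
    y_pred = model(x)                       # shape (samples, timesteps, 2)
    target = tf.reduce_sum(y_pred[..., 0])  # e.g. sensitivity of alpha

grads = tape.gradient(target, x)            # same shape as x
# Mean absolute gradient per feature as a rough sensitivity score.
sensitivity = np.mean(np.abs(grads.numpy()), axis=(0, 1))
```

And a rough permutation-style take on "shutting off certain inputs and evaluating on the test set": shuffle one feature at a time across samples and measure how much the test loss increases. Again `model`, `x_test`, `y_test` are assumed to exist, and `model.evaluate` is assumed to return only the loss (no extra metrics):

```python
import numpy as np

def permutation_importance(model, x_test, y_test, n_repeats=5, seed=0):
    """Increase in test loss when one feature is shuffled across samples."""
    rng = np.random.default_rng(seed)
    baseline = model.evaluate(x_test, y_test, verbose=0)
    importances = np.zeros(x_test.shape[-1])
    for j in range(x_test.shape[-1]):
        losses = []
        for _ in range(n_repeats):
            x_perm = x_test.copy()
            # Shuffle feature j across samples, keeping each time series intact.
            idx = rng.permutation(x_test.shape[0])
            x_perm[:, :, j] = x_test[idx, :, j]
            losses.append(model.evaluate(x_perm, y_test, verbose=0))
        importances[j] = np.mean(losses) - baseline
    return importances
```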

In practice, in my experience a more informal analysis can be helpful with wtte-rnn: just eyeball plots of sequences of predictions and features lined up, and (depending on your problem/data) you can sometimes see quite clearly that some data makes an impact on the state of the RNN. This won't help with feature selection, of course.
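A minimal sketch of that kind of eyeballing, assuming a trained Keras model `model` and a single test sequence `x_seq` of shape `(timesteps, features)` (both placeholder names): plot the predicted Weibull alpha and beta over time next to a chosen feature and look for features that visibly move the predictions.

```python
import numpy as np
import matplotlib.pyplot as plt

# Predicted (alpha, beta) per timestep for one hypothetical test sequence.
pred = model.predict(x_seq[np.newaxis, ...])[0]   # shape (timesteps, 2)

fig, axes = plt.subplots(3, 1, sharex=True, figsize=(8, 6))
axes[0].plot(pred[:, 0])
axes[0].set_ylabel("predicted alpha")
axes[1].plot(pred[:, 1])
axes[1].set_ylabel("predicted beta")
axes[2].plot(x_seq[:, 0])                         # pick the feature to inspect
axes[2].set_ylabel("feature 0")
axes[2].set_xlabel("timestep")
plt.tight_layout()
plt.show()
```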