feedzai / timeshap

TimeSHAP explains Recurrent Neural Network predictions.

Can we wrap models other than RNNs, such as Transformer networks? #46

Closed ajinkyakulkarni14 closed 1 year ago

ajinkyakulkarni14 commented 1 year ago

Is the TimeSHAP wrapper DNN-architecture agnostic? For example, for the following functions:

```python
model_wrapped = TorchModelWrapper(model)
f_hs = lambda x, y=None: model_wrapped.predict_last_hs(x, y)
```

Can we apply the same approach to a Transformer network?

JoaoPBSousa commented 1 year ago

Hello @ajinkyakulkarni14 ,

TimeSHAP is model-agnostic, which means it doesn't impose any restrictions on the explained model architecture. You can indeed explain a Transformer network with it.
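
As a rough illustration of that model-agnostic entry point, the sketch below wraps a hypothetical PyTorch Transformer scorer in a plain callable, playing the same role as the `f_hs` lambda in the question. The `TinyTransformerScorer` class and the assumed interface (a 3-D numpy array of sequences in, a 2-D numpy array of per-sequence scores out) are illustrative assumptions for this example, not part of the TimeSHAP API.

```python
# Minimal sketch: expose a (hypothetical) Transformer scorer to TimeSHAP
# through a plain callable, analogous to the f_hs lambda above.
import numpy as np
import torch
import torch.nn as nn


class TinyTransformerScorer(nn.Module):
    """Hypothetical sequence scorer: Transformer encoder + linear head."""

    def __init__(self, n_features: int, d_model: int = 32):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features) -> one score per sequence, (batch, 1)
        hidden = self.encoder(self.embed(x))
        return torch.sigmoid(self.head(hidden[:, -1, :]))


model = TinyTransformerScorer(n_features=6)
model.eval()


def f(x: np.ndarray) -> np.ndarray:
    """Assumed entry point: 3-D numpy sequences in, 2-D numpy scores out."""
    with torch.no_grad():
        scores = model(torch.as_tensor(x, dtype=torch.float32))
    return scores.numpy()
```

Since a Transformer has no recurrent hidden state to pass around, a plain callable like `f` above can stand in for the RNN-oriented `f_hs` when handing the model to the explainer.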

If you have more questions, please don't hesitate to ask here or open a new issue.

ajinkyakulkarni14 commented 1 year ago

Thank you for the quick response.