Closed: cwmeijer closed this issue 2 years ago.
Yang mentioned the following two potential candidates during our standup: SAX and fuzzy logic.
The authors of the paper linked in the top post (Rojat et al. 2021) studied many methods that try to provide XAI on time series data. They have a table on page 9 (see below) in which they summarize the methods inspected in their work.
To satisfy our requirements, a method must be model-agnostic, post-hoc, and not backpropagation based. Only a few methods in the paper pass this bar, and most of them are perturbation/feature-occlusion based. I checked these XAI methods, and in my opinion none of them is remarkable enough to deserve a "must have" label as a candidate for our package.
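For context, here is a minimal sketch of what a perturbation/feature-occlusion explanation for a time series classifier could look like. The `predict_fn` callable, the window size, and the zero baseline are illustrative assumptions, not something prescribed by the paper:

```python
import numpy as np

def occlusion_relevance(predict_fn, series, target_class, window=10, baseline=0.0):
    """Sketch of occlusion-based relevance for a 1D time series.

    predict_fn: assumed callable mapping a batch of series (n, length)
                to class probabilities (n, n_classes).
    series:     1D numpy array with the time series to explain.
    """
    base_score = predict_fn(series[None, :])[0, target_class]
    relevance = np.zeros_like(series, dtype=float)
    for start in range(0, len(series), window):
        end = min(start + window, len(series))
        perturbed = series.copy()
        perturbed[start:end] = baseline  # occlude one window of time steps
        score = predict_fn(perturbed[None, :])[0, target_class]
        # relevance = drop in the target-class score when this window is hidden
        relevance[start:end] = base_score - score
    return relevance
```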
Two methods that drew my attention are SAX and fuzzy logic, since they are designed specifically for time series. Unfortunately, both are ante-hoc approaches and were not created to work with deep learning.
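Since SAX keeps coming up, a minimal sketch of its core idea (z-normalization, piecewise aggregate approximation, then mapping segment means to symbols via standard-normal breakpoints) may help judge whether it fits. The segment count and alphabet size below are illustrative choices; the breakpoints are the standard-normal quartiles for an alphabet of size 4:

```python
import numpy as np

def sax(series, n_segments=8, alphabet="abcd"):
    """Minimal SAX sketch: z-normalize, PAA, then map segment means to symbols."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)  # z-normalization
    # Piecewise Aggregate Approximation: mean of each of n_segments chunks
    paa = np.array([chunk.mean() for chunk in np.array_split(x, n_segments)])
    # Standard-normal breakpoints for an alphabet of size 4
    breakpoints = np.array([-0.6745, 0.0, 0.6745])
    symbols = np.digitize(paa, breakpoints)  # indices 0..3 -> 'a'..'d'
    return "".join(alphabet[i] for i in symbols)

# Example: a sine wave becomes a short symbolic word
print(sax(np.sin(np.linspace(0, 2 * np.pi, 128)), n_segments=8))
```

This illustrates why SAX is an ante-hoc, representation-level technique: it produces an interpretable symbolic encoding of the input rather than explaining the predictions of an already-trained deep model.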
To summarize, at least based on this paper, I don't think there is any method we absolutely cannot miss. I plan to check some blog posts and other papers to see if there is anything of interest to us.
Nice analysis and summary @geek-yang! I've also looked at some other sources:
They both provide post-hoc local interpretability, which is often considered unfaithful. The only difference between them is that CAM explanations provide both feature importance and the occurring time stamps, while LEFTIST provides feature importance at prefixed time stamps.
The above made me search for LEFTIST and I will put my findings in the LIME for TS issue #342
Following the chain of issues, this is now continued in #369
Link to the Rojat et al. 2021 paper that Elena sent: https://arxiv.org/pdf/2104.00950.pdf