Open TonyBagnall opened 2 weeks ago
I'll try to cover all shapelet classifiers and transformers in #1703, so it should be solved by the associated PR. Feature importance is attached to the features generated by individual shapelets and is obtained from the fitted classifier: for example, feature importances in forests, or the weights associated with each class in linear models such as ridge.
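To make that concrete, here is a minimal sketch (plain scikit-learn on an already shapelet-transformed feature matrix; the variable names are hypothetical, not the aeon API) of how one importance value per shapelet can be read off a fitted classifier:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import RidgeClassifierCV

# X_t stands in for the shapelet-transformed data: one column per shapelet.
rng = np.random.default_rng(0)
X_t = rng.normal(size=(30, 8))           # 30 series, 8 shapelet features
y = rng.integers(0, 2, size=30)

# Forest: importances come directly from feature_importances_.
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_t, y)
forest_importance = forest.feature_importances_       # one value per shapelet

# Linear model (ridge): use the per-class weights; coef_ has shape
# (1, n_shapelets) for binary problems, so average the absolute weights.
ridge = RidgeClassifierCV().fit(X_t, y)
ridge_importance = np.abs(ridge.coef_).mean(axis=0)   # one value per shapelet
```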
Describe the feature or idea you want to propose
The SAST (Scalable and Accurate Subsequence Transform) classifier has a public function plot_most_important_feature_on_ts which has no example and is not tested. It's not clear to me what the feature_importance parameter should be (one value per shapelet? one per time point?), and this should be generalisable to all shapelet classifiers and transformers.

Describe your proposed solution
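One way to pin down the ambiguity, sketched below in plain numpy (hypothetical names, not the aeon API): treat feature_importance as one value per shapelet, and let the plot map each shapelet's importance onto the time points of its best-matching window in the series.

```python
import numpy as np


def _znorm(x):
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()


def map_shapelet_importance_to_ts(ts, shapelets, shapelet_importance):
    """Spread one importance value per shapelet onto the time points of ts.

    ts : 1D array of length m
    shapelets : list of 1D arrays (candidate subsequences)
    shapelet_importance : 1D array with one value per shapelet
    """
    m = len(ts)
    point_importance = np.zeros(m)
    for shp, imp in zip(shapelets, shapelet_importance):
        length = len(shp)
        # distance of the z-normalised shapelet to every window of ts
        dists = [
            np.linalg.norm(_znorm(ts[i:i + length]) - _znorm(shp))
            for i in range(m - length + 1)
        ]
        best = int(np.argmin(dists))
        # accumulate the shapelet's importance over its best-matching window
        point_importance[best:best + length] += imp
    return point_importance


# Hypothetical usage with toy data
ts = np.sin(np.linspace(0, 6, 100))
shapelets = [ts[10:25].copy(), ts[40:60].copy()]
per_point = map_shapelet_importance_to_ts(ts, shapelets, np.array([0.7, 0.3]))
```

Under this interpretation the array length equals the number of shapelets, and the per-time-point view is derived inside the plotting function rather than supplied directly.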
Any thoughts @frankl?
Describe alternatives you've considered, if relevant
No response
Additional context
No response