albths opened this issue 2 years ago
Hi, @albths . I will investigate the issue, so could you please submit the code snippet to reproduce the problem?
Thanks!
Thanks for the prompt response. I copied your example code for ScoreCam, and it only gives me high attention in one corner. However, I just noticed that GradCam++ seems to work. My main question is how to visualize the features for both classes in a binary network with only one output node. If I understand correctly, you introduced the class "BinaryScore" (instead of "CategoricalScore") for this purpose. Let's say class A is dogs and class B is cats. My model outputs the probability for dogs (sigmoid), so the probability for cats would be 1 - model output. How do I set up the scores to visualize the features for each class separately?
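For what it's worth, here is a minimal sketch of how two per-class scores could look for a single sigmoid output. This uses plain callables rather than BinaryScore itself, and the function and variable names are my own illustration, not from the library:

```python
import numpy as np

# With one sigmoid output p = P(dog), a score for each class can be
# expressed as a plain callable over the model's output batch:
def dog_score(output):
    # evidence for class A (dogs): the sigmoid output itself
    return output[:, 0]

def cat_score(output):
    # evidence for class B (cats): the complement of the output
    return 1.0 - output[:, 0]

preds = np.array([[0.8], [0.3]])  # two example predictions
print(dog_score(preds))  # [0.8 0.3]
print(cat_score(preds))  # [0.2 0.7]
```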
My code snippet for ScoreCam is analogous to:
```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm
from tf_keras_vis.scorecam import Scorecam

scorecam = Scorecam(model)
cam = scorecam(score, X, penultimate_layer=-1)

f, ax = plt.subplots(nrows=1, ncols=3, figsize=(12, 4))
for i, title in enumerate(image_titles):
    heatmap = np.uint8(cm.jet(cam[i])[..., :3] * 255)
    ax[i].set_title(title, fontsize=16)
    ax[i].imshow(images[i])
    ax[i].imshow(heatmap, cmap='jet', alpha=0.5)
    ax[i].axis('off')
plt.tight_layout()
plt.show()
```
Thanks a lot!
Could you also share the code where you create the BinaryScore instance?
> How to set up the scores to visualize features for each class separately?
The web page below may be helpful.
Thanks!
This is exactly what I was looking for. Thanks a lot; somehow I didn't see these instructions.
However, using BinaryScore for the negative class (i.e. score = BinaryScore(0.0)) produces very strange images with a grid-like (non-informative) feature representation. I would be grateful for any help on that.
Hi, first of all, thanks for the great job on this repo.
I am seeing a similar issue with just one output (sigmoid). The problem seems to come from the K.softmax call: when only a single value is given (which is the case here), the returned value is always one. Also, I think the softmax should be taken over the channels instead of over the logits; in the paper, the softmax is applied over the masked results for one specific class. The solution could be:
```python
weights = [[score(p) for p in prediction] for score, prediction in zip(scores, predictions)]
weights = [K.softmax(w, axis=0) for w in weights]
```
Hi,
First of all, thanks a lot for this great visualization toolbox. I am using EfficientNet to categorize images into binary classes (i.e. only one output neuron). Could you please provide an end-to-end example of how to use the BinaryScore class in this context? SmoothGrad works fine in my case, but somehow GradCam and ScoreCam only yield broken results, even though I copied your example code. Thanks a lot in advance!