This PR adds multimodal MIL support, with a new MIL model, MultiModal_Attention_MIL. Multimodal MIL support enables multiple feature extractors to be used simultaneously, at different magnifications.
Enabling multimodal MIL
Multimodal MIL support is automatically enabled by choosing a multimodal model when setting up an MIL configuration with sf.mil.mil_config(). The model 'mm_attention_mil' is auto-detected as an MIL model.
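For example, a minimal sketch of a multimodal configuration (the lr and epochs keyword arguments are illustrative):

    import slideflow as sf

    # 'mm_attention_mil' selects the new multimodal attention MIL model.
    config = sf.mil.mil_config(
        'mm_attention_mil',
        lr=1e-4,
        epochs=20
    )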
Custom models
If you are training a custom module and would like multimodal support, add an is_multimodal attribute to the module and set it to True. For example:
    import torch

    class CustomMIL(torch.nn.Module):
        is_multimodal = True
Bags for multimodal models
Multimodal models require two or more bag sources. Instead of passing a single bag directory to the training or evaluation functions, pass multiple directories.
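For example, a sketch of a multimodal training call (the project P, dataset variables, outcome name, and bag paths are placeholders):

    # P is an sf.Project; train and val are slideflow Datasets.
    # config is the MIL configuration created with sf.mil.mil_config().
    P.train_mil(
        config,
        outcomes='tumor_type',
        train_dataset=train,
        val_dataset=val,
        bags=['/bags/extractor_10x',   # bags from one feature extractor / magnification
              '/bags/extractor_40x']   # bags from a second feature extractor / magnification
    )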
Exporting attention
When exporting attention values, either during training or evaluation, multimodal MIL models will save each mode's attention as a separate array in the same *.npz file.
For example, the attention values for mode 1 and mode 2 for slide "SLIDE_A" can be accessed from the *.npz files generated during training.
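A minimal sketch of reading these arrays (the file path is a placeholder, and the 'arr_0' / 'arr_1' key names assume NumPy's default np.savez keys; the actual names may differ):

    import numpy as np

    # Placeholder path to the attention file exported for slide "SLIDE_A".
    atts = np.load('attention/SLIDE_A_att.npz')

    mode1_attention = atts['arr_0']  # attention values for mode 1 (assumed key)
    mode2_attention = atts['arr_1']  # attention values for mode 2 (assumed key)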
Attention heatmaps
Multimodal MIL models are fully supported in Slideflow Studio. Attention heatmaps for each mode can be interactively viewed and saved.
At present, multimodal attention cannot be exported during evaluation with Project.evaluate_mil(..., attention_heatmaps=True); the only method for viewing multimodal attention is Slideflow Studio. Automatic export of multimodal attention heatmaps during evaluation is planned for a future release.
Other updates
This PR also includes several smaller related updates:
Press ALT while viewing a heatmap in Slideflow Studio to show the heatmap value underneath the mouse cursor.
New experimental UQ support for MIL models through MC Dropout estimation. Enable by passing uq=True to P.train_mil() or P.evaluate_mil() (a combined usage sketch appears after this list).
Expand MIL configuration support to log arbitrary model keyword arguments, for greater reproducibility of custom MIL parameters and architectures.
New 'temperature' parameter for Attention_MIL, which controls the attention softmax temperature.
New attention_gate parameter for Attention_MIL, for experimental attention gating.
New DatasetFeatures.from_bags() class method.
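A rough usage sketch combining several of these new options (the outcome name, dataset variables, and bag path are placeholders, and passing model keyword arguments through mil_config() is assumed based on the expanded configuration logging described above):

    import slideflow as sf

    # New Attention_MIL keyword arguments, logged via the MIL configuration.
    config = sf.mil.mil_config(
        'attention_mil',
        temperature=0.5,       # attention softmax temperature
        attention_gate=True    # experimental attention gating
    )

    # Experimental MC Dropout uncertainty estimation via uq=True.
    P.train_mil(
        config,
        outcomes='tumor_type',
        train_dataset=train,
        val_dataset=val,
        bags='/path/to/bags',
        uq=True
    )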