Open · letmejoin opened this issue 3 weeks ago
Hi @letmejoin,
I had the same problem, and updating to version 1.2.0.dev and retraining in that version solved the issue. I hope it helps!
Cheers
Sorry for the late reply; I was busy with other things. After updating anomalib to 1.2.0.dev, the problem still exists. Is your timm version 1.0.3?
Describe the bug
After upgrading timm from 0.6.13 to 1.0.3, inference with a previously working model raises an error. The return value of the feature-extraction step is missing member variables. The specific code location is below; it may be that the method is called without initialization, so not all member variables are loaded. Member variables of `self.feature_extractor`: https://github.com/openvinotoolkit/anomalib/blob/22caf3badf610641c6b0d4f7ba5d6e1b1e419ce8/src/anomalib/models/components/feature_extractors/timm.py#L124
```
['T_destination', '__annotations__', '__call__', '__class__', '__contains__', '__delattr__', '__delitem__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattr__', '__getattribute__', '__getitem__', '__getstate__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__iter__', '__le__', '__len__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__setitem__', '__setstate__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', '_apply', '_backward_hooks', '_backward_pre_hooks', '_buffers', '_call_impl', '_collect', '_compiled_call_impl', '_forward_hooks', '_forward_hooks_always_called', '_forward_hooks_with_kwargs', '_forward_pre_hooks', '_forward_pre_hooks_with_kwargs', '_get_backward_hooks', '_get_backward_pre_hooks', '_get_name', '_is_full_backward_hook', '_load_from_state_dict', '_load_state_dict_post_hooks', '_load_state_dict_pre_hooks', '_maybe_warn_non_full_backward_hook', '_modules', '_named_members', '_non_persistent_buffers_set', '_parameters', '_register_load_state_dict_pre_hook', '_register_state_dict_hook', '_replicate_for_data_parallel', '_save_to_state_dict', '_slow_forward', '_state_dict_hooks', '_state_dict_pre_hooks', '_version', '_wrapped_call_impl', 'act1', 'add_module', 'apply', 'bfloat16', 'bn1', 'buffers', 'call_super_init', 'children', 'clear', 'compile', 'concat', 'conv1', 'cpu', 'cuda', 'default_cfg', 'double', 'dump_patches', 'eval', 'extra_repr', 'feature_info', 'float', 'forward', 'get_buffer', 'get_extra_state', 'get_parameter', 'get_submodule', 'half', 'ipu', 'items', 'keys', 'layer1', 'layer2', 'layer3', 'load_state_dict', 'maxpool', 'modules', 'named_buffers', 'named_children', 'named_modules', 'named_parameters', 'parameters', 'pop', 'pretrained_cfg', 'register_backward_hook', 'register_buffer', 'register_forward_hook', 'register_forward_pre_hook', 'register_full_backward_hook', 'register_full_backward_pre_hook', 'register_load_state_dict_post_hook', 'register_module', 'register_parameter', 'register_state_dict_pre_hook', 'requires_grad_', 'return_layers', 'set_extra_state', 'set_grad_checkpointing', 'share_memory', 'state_dict', 'to', 'to_empty', 'train', 'training', 'type', 'update', 'values', 'xpu', 'zero_grad']

['__class__', '__delattr__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__get__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__self__', '__self_class__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__thisclass__', '_backward_hooks', '_backward_pre_hooks', '_buffers', '_forward_hooks', '_forward_hooks_always_called', '_forward_hooks_with_kwargs', '_forward_pre_hooks', '_forward_pre_hooks_with_kwargs', '_is_full_backward_hook', '_load_state_dict_post_hooks', '_load_state_dict_pre_hooks', '_modules', '_non_persistent_buffers_set', '_parameters', '_state_dict_hooks', '_state_dict_pre_hooks', 'concat', 'default_cfg', 'feature_info', 'pretrained_cfg', 'return_layers', 'training']
```
But if I add `self.feature_extractor.set_grad_checkpointing(False)` before https://github.com/openvinotoolkit/anomalib/blob/22caf3badf610641c6b0d4f7ba5d6e1b1e419ce8/src/anomalib/models/components/feature_extractors/timm.py#L123, it works fine. I don't know if there is a better way. The relevant timm code:

```python
class FeatureDictNet(nn.ModuleDict):
    """Feature extractor with OrderedDict return"""

class FeatureListNet(FeatureDictNet):
    """Feature extractor with list return"""
```
Dataset
N/A
Model
N/A
Steps to reproduce the behavior
Update timm from 0.6.13 to 1.0.3, then run inference with a previously trained model.
OS information
Expected behavior
Get correct inference results, as with timm 0.6.13.
Screenshots
Pip/GitHub
pip
What version/branch did you use?
No response
Configuration YAML
Logs
Code of Conduct