Closed iProzd closed 1 month ago
source/tests/universal/pt/backend.py
- 34-34: Found useless expression. Either assign it to a variable or remove it. (B018)

deepmd/dpmodel/descriptor/hybrid.py
- 204-204: Loop control variable `ii` not used within loop body (B007). Rename unused `ii` to `_ii`.

deepmd/pt/model/descriptor/hybrid.py
- 174-174: Loop control variable `des` not used within loop body (B007). Rename unused `des` to `_des`.
- 218-218: Loop control variable `ii` not used within loop body (B007). Rename unused `ii` to `_ii`.

deepmd/dpmodel/descriptor/se_r.py
- 105-105, 109-109: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 380-380: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.

deepmd/dpmodel/descriptor/se_t.py
- 93-93, 98-98, 252-252: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 381-381: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.

source/tests/universal/common/cases/descriptor/utils.py
- 49-49: Loop control variable `vv` not used within loop body (B007). Rename unused `vv` to `_vv`.

deepmd/pt/model/descriptor/se_r.py
- 64-64, 69-69, 297-297: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 257-261: Use ternary operator `sampled = merged() if callable(merged) else merged` instead of `if`-`else` block (SIM108).
- 435-435: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.

deepmd/dpmodel/descriptor/se_e2_a.py
- 147-147, 152-152, 325-325: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 455-455: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.

deepmd/pt/model/descriptor/se_a.py
- 78-78, 84-84, 224-224, 376-376, 382-382, 589-589: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 323-323: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.
- 565-569: Use ternary operator `sampled = merged() if callable(merged) else merged` instead of `if`-`else` block (SIM108).
- 639-639: Loop control variable `ii` not used within loop body (B007).

deepmd/pt/model/descriptor/dpa1.py
- 215-215, 227-227: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 517-517: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.
- 585-585: Local variable `nall` is assigned to but never used (F841). Remove assignment to unused variable `nall`.

deepmd/pt/model/descriptor/se_t.py
- 113-113, 118-118, 253-253, 401-401, 406-406, 616-616: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 348-348: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.
- 592-596: Use ternary operator `sampled = merged() if callable(merged) else merged` instead of `if`-`else` block (SIM108).

deepmd/main.py
- 83-83: No explicit `stacklevel` keyword argument found (B028).
- 114-114: Use `key not in dict` instead of `key not in dict.keys()` (SIM118). Remove `.keys()`.

deepmd/pt/model/descriptor/dpa2.py
- 84-84: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 429-429: Loop control variable `ii` not used within loop body (B007). Rename unused `ii` to `_ii`.
- 549-549: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.

deepmd/dpmodel/descriptor/dpa2.py
- 67-67, 325-325: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 804-804: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.

deepmd/dpmodel/descriptor/dpa1.py
- 226-226, 237-237, 607-607, 617-617, 807-807: Do not use mutable data structures for argument defaults (B006). Replace with `None`; initialize within function.
- 465-465: Local variable `nall` is assigned to but never used (F841). Remove assignment to unused variable `nall`.
- 545-545: Local variable `env_mat` is assigned to but never used (F841). Remove assignment to unused variable `env_mat`.
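Most of the findings above are two recurring patterns: B006 (mutable default argument) and SIM108 (collapsible `if`/`else` assignment). A minimal sketch of the fixes Ruff suggests, using illustrative names rather than the actual deepmd signatures:

```python
# B006: a mutable default such as `[]` is created once, at function
# definition time, and shared by every call that omits the argument.
# Ruff's suggested fix: default to `None` and initialize inside the body.
# (`build_exclude_types` is an illustrative name, not the real deepmd API.)
def build_exclude_types(exclude_types=None):
    if exclude_types is None:
        exclude_types = []
    exclude_types.append((0, 0))
    return exclude_types

# Each call now gets its own fresh list instead of mutating a shared one.
assert build_exclude_types() == [(0, 0)]
assert build_exclude_types() is not build_exclude_types()

# SIM108: collapse the four-line if/else into a ternary, as in the
# suggested `sampled = merged() if callable(merged) else merged`.
def get_sampled(merged):
    return merged() if callable(merged) else merged

assert get_sampled(lambda: [1, 2]) == [1, 2]
assert get_sampled([3, 4]) == [3, 4]
```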
deepmd/dpmodel/descriptor/make_base_descriptor.py
- [warning] Added lines #L84, #L128, #L133, and #L138 were not covered by tests.

deepmd/dpmodel/descriptor/hybrid.py
- [warning] Added line #L129 was not covered by tests.

deepmd/dpmodel/descriptor/se_r.py
- [warning] Added lines #L239, #L251, #L263-L264, and #L268 were not covered by tests.

deepmd/dpmodel/descriptor/se_t.py
- [warning] Added lines #L177, #L231, #L243-L244, and #L248 were not covered by tests.

deepmd/dpmodel/descriptor/se_e2_a.py
- [warning] Added lines #L281, #L293, #L305-L306, and #L310 were not covered by tests.

deepmd/dpmodel/descriptor/dpa2.py
- [warning] Added line #L512 was not covered by tests.

deepmd/dpmodel/descriptor/dpa1.py
- [warning] Added line #L335 was not covered by tests.
source/tests/universal/common/backend.py (2)
- `27-28`: LGTM! The abstract method `convert_to_numpy` is well-defined and encourages consistent implementation across subclasses.
- `32-33`: LGTM! The abstract method `convert_from_numpy` is correctly defined to ensure consistent behavior across subclasses.

source/tests/universal/dpmodel/backend.py (2)
- `23-24`: LGTM! The method `convert_to_numpy` correctly implements the abstract method by returning the input numpy array.
- `27-28`: LGTM! The method `convert_from_numpy` provides a correct and straightforward implementation of the abstract method.

source/tests/universal/pt/backend.py (2)
- `37-38`: LGTM! The method `convert_to_numpy` correctly utilizes the utility function `to_numpy_array` to convert PyTorch tensors to numpy arrays.
- `41-42`: LGTM! The method `convert_from_numpy` effectively uses the utility function `to_torch_tensor` for converting numpy arrays to PyTorch tensors.

deepmd/utils/finetune.py (4)
- `11-65`: LGTM! The `FinetuneRuleItem` class is well-structured and provides clear methods for accessing fine-tuning rules and properties. The documentation is clear and the methods are well-defined.
- `76-111`: LGTM! The function `get_index_between_two_maps` correctly calculates the mapping index and handles new types appropriately, including logging a warning when new types are detected.
- `114-136`: LGTM! The function `map_atom_exclude_types` correctly remaps atom exclude types based on the provided index map. The implementation is straightforward and effective.
- `139-164`: LGTM! The function `map_pair_exclude_types` correctly remaps pair exclude types based on the provided index map. The implementation is straightforward and effective.

deepmd/dpmodel/descriptor/se_t.py (1)
- Line range hint `351-368`: The serialization method is updated to include `type_map` and `trainable` in the serialized data, aligning with the changes in the descriptor's properties. This is crucial for ensuring consistency between serialized and deserialized objects.

deepmd/pt/model/descriptor/se_r.py (1)
- `392-399`: Serialization methods look well-implemented. `serialize`, `deserialize`, and the related statistical methods are correctly implemented and align with the class's functionality. Also applies to: 404-422, 432-432.

deepmd/dpmodel/descriptor/se_e2_a.py (1)
- `299-311`: Serialization methods look well-implemented. `serialize`, `deserialize`, and the related statistical methods are correctly implemented and align with the class's functionality. Also applies to: 421-443, 450-450.

Tools — GitHub Check: codecov/patch
- [warning] 305-306: deepmd/dpmodel/descriptor/se_e2_a.py#L305-L306 Added lines #L305 - L306 were not covered by tests.
- [warning] 310-310: deepmd/dpmodel/descriptor/se_e2_a.py#L310 Added line #L310 was not covered by tests.

deepmd/pt/model/descriptor/se_a.py (3)
- `138-141`: The implementation of the `get_type_map` method is straightforward and aligns with the PR's objective of handling type maps correctly.
- `276-276`: The methods `set_stat_mean_and_stddev` and `get_stat_mean_and_stddev` are well-implemented and provide clear functionality for managing the statistics mean and standard deviation. Also applies to: 280-283.
- `308-308`: The serialization of `type_map` within the `serialize` method ensures that the type map is preserved, which is crucial for maintaining consistency across different model states.

deepmd/pt/model/descriptor/dpa1.py (1)
- `430-456`: Review the implementation of `change_type_map`. The method is critical for handling type map changes and appears to correctly update the type map and related statistics. However, ensure that it handles all edge cases, especially when `model_with_new_type_stat` is `None` and new types are present.
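As a rough illustration of the type-map remapping these reviews discuss, here is a hedged sketch inferred from the review summaries (the actual `get_index_between_two_maps` in `deepmd/utils/finetune.py` may differ in signature and in how it flags and handles new types):

```python
# Hypothetical sketch: map each type in the new type_map to its index in
# the old type_map; flag types that did not exist in the old map, which
# is roughly the situation where the real code logs a warning.
def get_index_between_two_maps(old_type_map, new_type_map):
    index_map = []
    has_new_type = False
    for atom_type in new_type_map:
        if atom_type in old_type_map:
            index_map.append(old_type_map.index(atom_type))
        else:
            has_new_type = True
            index_map.append(-1)  # placeholder index for a brand-new type
    return index_map, has_new_type

# "C" is new relative to the old map, so it is flagged.
assert get_index_between_two_maps(["O", "H"], ["H", "O", "C"]) == ([1, 0, -1], True)
assert get_index_between_two_maps(["O", "H"], ["H"]) == ([1], False)
```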
Attention: Patch coverage is 87.50000% with 76 lines in your changes missing coverage. Please review.
Project coverage is 82.70%. Comparing base (a7ab1af) to head (fd64ee5).
(I realize that the following behavior is all expected, and you can ignore this comment....)

There is an obvious problem with this design: when I don't use `--use-pretrain-script`, if the model parameters in the `input.json` I provide differ from the model parameters of the pre-trained model, then loading the parameters during `self.wrapper.load_state_dict(state_dict)` must fail, because the network parameters of `self.wrapper` are initialized from the model parameters in `input.json`, while `state_dict` holds the network parameters of the pre-trained model.

For example, if `sel = 120` in the pre-trained model while `sel = 80` in `input.json`:
Singletask finetuning from a single model:
dp --pt train finetune_single.json --finetune single.pt
RuntimeError: Error(s) in loading state_dict for ModelWrapper: size mismatch for model.Default.atomic_model.descriptor.repinit.mean: copying a param with shape torch.Size([3, 120, 4]) from checkpoint, the shape in current model is torch.Size([3, 80, 4]).
Singletask finetuning from a multitask model:
dp --pt train finetune_single.json --finetune multi.pt
RuntimeError: Error(s) in loading state_dict for ModelWrapper: size mismatch for model.Default.atomic_model.descriptor.repinit.mean: copying a param with shape torch.Size([3, 120, 4]) from checkpoint, the shape in current model is torch.Size([3, 80, 4]).
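The failure mode above can be sketched in plain Python, comparing shapes only (`check_state_dict` is an illustrative stand-in for the strict per-key shape comparison that `load_state_dict` performs, not deepmd or PyTorch API):

```python
# Illustrative sketch: strict checkpoint loading compares parameter shapes
# key by key, so a sel mismatch (120 vs 80) surfaces as a size-mismatch error.
def check_state_dict(ckpt_shapes, model_shapes):
    """Return keys whose checkpoint shape disagrees with the current model."""
    return [key for key, shape in ckpt_shapes.items()
            if key in model_shapes and shape != model_shapes[key]]

# Shapes taken from the error messages above: sel = 120 in the checkpoint,
# sel = 80 in the freshly built model from input.json.
ckpt = {"model.Default.atomic_model.descriptor.repinit.mean": (3, 120, 4)}
model = {"model.Default.atomic_model.descriptor.repinit.mean": (3, 80, 4)}
assert check_state_dict(ckpt, model) == [
    "model.Default.atomic_model.descriptor.repinit.mean"
]
```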
I will review the PR after we discuss it online @iProzd @njzjz
Fix #3747. Fix #3455.
Consistent fine-tuning with init-model: in pt, fine-tuning now includes three steps.
By default, fine-tuning uses the user's input instead of being overwritten by that in the pre-trained model. When adding `--use-pretrain-script`, the user can use the model parameters from the pre-trained model.
Now `type_map` will use the one in the user input instead of being overwritten by the one in the pre-trained model.
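The selection rule described above can be condensed into a small sketch (a hypothetical helper, not the real deepmd training entry point, which applies this logic internally):

```python
# Hypothetical sketch: by default the user's input.json model section wins;
# passing --use-pretrain-script takes the model section from the pre-trained
# model's script instead.
def select_model_params(user_input, pretrain_input, use_pretrain_script=False):
    return (pretrain_input if use_pretrain_script else user_input)["model"]

user = {"model": {"descriptor": {"sel": 80}}}       # from input.json
pretrain = {"model": {"descriptor": {"sel": 120}}}  # from the checkpoint

assert select_model_params(user, pretrain)["descriptor"]["sel"] == 80
assert select_model_params(user, pretrain, use_pretrain_script=True)["descriptor"]["sel"] == 120
```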
Summary by CodeRabbit

New Features

Documentation
- `--use-pretrain-script` option for fine-tuning.

Refactor

Tests