Summary:
I was receiving an error because no a_max was defined for the np.clip() call.
```
Traceback (most recent call last):
  File "main.py", line 226, in <module>
    run(model, baal_trainer, baal_data_module, config, experiment_name)
  File "main.py", line 212, in run
    should_continue = baal_trainer.step(model, datamodule=baal_data_module)
  File "/workspace/environments/active-learning/lib/python3.8/site-packages/baal/utils/pytorch_lightning.py", line 202, in step
    to_label = self.heuristic(probs)
  File "/workspace/environments/active-learning/lib/python3.8/site-packages/baal/active/heuristics/heuristics.py", line 258, in __call__
    return self.get_ranks(predictions)[0]
  File "/workspace/environments/active-learning/lib/python3.8/site-packages/baal/active/heuristics/stochastics.py", line 45, in get_ranks
    distributions = np.clip(distributions, 0)
  File "<__array_function__ internals>", line 4, in clip
TypeError: _clip_dispatcher() missing 1 required positional argument: 'a_max'
```
Features:
a_max defined as None for np.clip, so only the lower bound (0) is clipped
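For context, here is a minimal sketch of the one-line fix (the array values are illustrative, not from the BaaL codebase): passing None as a_max tells np.clip to leave the upper bound unconstrained while still clipping negatives to 0.

```python
import numpy as np

# Example distributions with a spurious negative entry.
distributions = np.array([-0.2, 0.1, 0.7, 1.3])

# Before: np.clip(distributions, 0) raised TypeError on older NumPy,
# since a_max was a required positional argument.
# After: a_max=None clips only the lower bound at 0.
clipped = np.clip(distributions, 0, None)
print(clipped)  # [0.  0.1 0.7 1.3]
```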
Checklist:
[ ] Your code is documented (To validate this, add your module to tests/documentation_test.py).
[ ] Your code is tested with unit tests.
[ ] You moved your Issue to the PR state.
NB: sorry this is a comically small PR, but I just wanted to go through the motions with one. I'll try to address the string mapping for PowerSampling etc. on Sunday, but no promises.
Added None to act as a_max for np.clip
I haven't run the tests yet, but I have used the fix.