scikit-learn-contrib / metric-learn

Metric learning algorithms in Python
http://contrib.scikit-learn.org/metric-learn/
MIT License

[Tests][Warnings] Cut all warnings from SCML using a minimal solution #341

Closed mvargas33 closed 2 years ago

mvargas33 commented 2 years ago

As suggested by @bellet in #339, it's better to cut SCML warnings at the root by specifying the n_basis value whenever possible.

As all tests run SCML on the iris dataset through build_triplets in test_utils.py, we just need to specify n_basis for SCML and SCML_Supervised in the estimator lists.

Going through the code of SCML, if n_basis is None, then the following default is assigned:

Default for SCML_Supervised: (lines 579-583)

n_basis = min(20*n_features, n_samples*2*min(n_classes-1, n_features) - 1)

Default for SCML: (line 234)

n_basis = n_features*80

The iris dataset has 150 samples, 3 classes, and 4 features, so the value is 80 for SCML_Supervised and 320 for SCML.
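Plugging the iris numbers into the two default formulas quoted above confirms these values:

```python
# Reproduce the default n_basis values for iris
# (150 samples, 3 classes, 4 features).
n_samples, n_classes, n_features = 150, 3, 4

# Default for SCML_Supervised (lines 579-583 quoted above):
# min(20*4, 150*2*min(2, 4) - 1) = min(80, 599) = 80
n_basis_supervised = min(20 * n_features,
                         n_samples * 2 * min(n_classes - 1, n_features) - 1)

# Default for SCML (line 234 quoted above): 4 * 80 = 320
n_basis_scml = n_features * 80

print(n_basis_supervised)  # 80
print(n_basis_scml)        # 320
```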

I changed it, but got the following tests failing.

Besides these observations, all warnings from SCML were removed. Only around 300 warnings are shown now across all tests.

Also, this PR removes the isinstance() overkill previously introduced in test_triplets_classifiers.py.

Best! 🎃

bellet commented 2 years ago

Thanks a lot, @mvargas33!