python-visualization / branca

This library is a spin-off from folium, created to host the non-map-specific features.
https://python-visualization.github.io/branca/
MIT License

test_color_brewer_extendability is very very slow #174

Open · kloczek opened 5 months ago

kloczek commented 5 months ago

I'm packaging your module as an rpm package, so I'm using the typical PEP 517 based build, install, and test cycle used when building packages from a non-root account.

Here is pytest output:

```console
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-branca-0.8.0-2.fc37.x86_64/usr/lib64/python3.10/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-branca-0.8.0-2.fc37.x86_64/usr/lib/python3.10/site-packages
+ /usr/bin/pytest -ra -m 'not network' -v
============================= test session starts ==============================
platform linux -- Python 3.10.14, pytest-8.2.2, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /home/tkloczko/rpmbuild/BUILD/branca-0.8.0
configfile: setup.cfg
collecting ... collected 37 items

tests/test_colormap.py::test_simple_step PASSED [ 2%]
tests/test_colormap.py::test_simple_linear PASSED [ 5%]
tests/test_colormap.py::test_step_color_indexing PASSED [ 8%]
tests/test_colormap.py::test_step_color_indexing_larger_index PASSED [ 10%]
tests/test_colormap.py::test_linear_color_indexing PASSED [ 13%]
tests/test_colormap.py::test_linear_to_step PASSED [ 16%]
tests/test_colormap.py::test_step_to_linear PASSED [ 18%]
tests/test_colormap.py::test_linear_object PASSED [ 21%]
tests/test_colormap.py::test_step_object PASSED [ 24%]
tests/test_colormap.py::test_max_labels_linear[10-expected0] PASSED [ 27%]
tests/test_colormap.py::test_max_labels_linear[5-expected1] PASSED [ 29%]
tests/test_colormap.py::test_max_labels_linear[3-expected2] PASSED [ 32%]
tests/test_colormap.py::test_max_labels_step[10-expected0] PASSED [ 35%]
tests/test_colormap.py::test_max_labels_step[5-expected1] PASSED [ 37%]
tests/test_colormap.py::test_max_labels_step[3-expected2] PASSED [ 40%]
tests/test_iframe.py::test_create_empty_iframe PASSED [ 43%]
tests/test_iframe.py::test_create_iframe PASSED [ 45%]
tests/test_iframe.py::test_rendering_utf8_iframe FAILED [ 48%]
tests/test_iframe.py::test_rendering_figure_notebook FAILED [ 51%]
tests/test_utilities.py::test_color_brewer_base PASSED [ 54%]
tests/test_utilities.py::test_color_brewer_reverse PASSED [ 56%]
tests/test_utilities.py::test_color_brewer_extendability
```
List of installed modules in build env:

```console
Package                       Version
----------------------------- -----------
alabaster                     0.7.16
attrs                         23.2.0
Babel                         2.15.0
build                         1.2.1
certifi                       2023.7.22
charset-normalizer            3.3.2
defusedxml                    0.7.1
distro                        1.9.0
docutils                      0.20.1
exceptiongroup                1.1.3
h11                           0.14.0
imagesize                     1.4.1
importlib_metadata            7.1.0
iniconfig                     2.0.0
installer                     0.7.0
Jinja2                        3.1.4
MarkupSafe                    2.1.5
numpy                         1.26.4
outcome                       1.2.0
packaging                     24.0
pluggy                        1.5.0
Pygments                      2.18.0
pyproject_hooks               1.0.0
pytest                        8.2.2
python-dateutil               2.9.0.post0
requests                      2.32.3
selenium                      4.21.0
setuptools                    69.4.0
setuptools-scm                8.1.0
sniffio                       1.3.0
snowballstemmer               2.2.0
sortedcontainers              2.4.0
Sphinx                        7.3.7
sphinxcontrib-applehelp       1.0.8
sphinxcontrib-devhelp         1.0.6
sphinxcontrib-htmlhelp        2.0.5
sphinxcontrib-jsmath          1.0.1
sphinxcontrib-qthelp          1.0.7
sphinxcontrib-serializinghtml 1.1.10
tokenize_rt                   5.2.0
tomli                         2.0.1
trio                          0.25.1
trio-websocket                0.11.1
typing_extensions             4.12.2
urllib3                       2.2.1
wheel                         0.43.0
wsproto                       1.2.0
zipp                          3.19.2
```

Please let me know if you need more details or want me to perform some diagnostics.

kloczek commented 5 months ago

pytest output with warnings, after deselecting (via --deselect) the units which require running Firefox and the unit which freezes:

===================================================================================== warnings summary ======================================================================================
../../../../../usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1447
  /usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1447: PytestConfigWarning: Unknown config option: flake8-ignore

    self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")

../../../../../usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1447
  /usr/lib/python3.10/site-packages/_pytest/config/__init__.py:1447: PytestConfigWarning: Unknown config option: flake8-max-line-length

    self._warn_or_fail_if_strict(f"Unknown config option: {key}\n")

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
======================================================================= 34 passed, 3 deselected, 2 warnings in 1.95s ========================================================================
Conengmo commented 5 months ago

I think I see the same issue locally without any extra Pytest arguments... not sure what is going on here, and why the CI/CD didn't catch it... Help figuring this out is welcome!

Conengmo commented 5 months ago

Looks like the test runs but is just very slow: a nested for loop with 39 items in the outer loop and ~253 items in the inner loop...
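For illustration, here is a minimal sketch of a loop with the shape just described, assuming the test exercises branca.utilities.color_brewer for every scheme name and every requested palette size. The scheme names and size range below are placeholders, not the test's actual inputs.

```python
# Hypothetical sketch only -- not the actual test code. Assumes
# branca.utilities.color_brewer(scheme, n) is the function under test;
# the scheme list and size range here are illustrative placeholders.
from branca.utilities import color_brewer

SCHEMES = ["viridis", "Blues", "YlGn"]  # the real test covers ~39 schemes


def exercise_color_brewer(schemes=SCHEMES, max_n=255):
    for scheme in schemes:              # outer loop: one pass per scheme
        for n in range(3, max_n + 1):   # inner loop: ~253 palette sizes
            palette = color_brewer(scheme, n=n)
            assert palette              # only checks that a palette came back
```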

Conengmo commented 5 months ago

That's probably it: it's a very slow test! Looking at the latest GitHub Actions runs, tests should take between 4 and 10 minutes in total. Can you check if your tests finish within that time?

Conengmo commented 5 months ago

That particular test, test_color_brewer_extendability, passes after ~4 minutes for me. We should look into whether it can be done more efficiently; I'm sure it can. But for now I assume nothing is really broken. Can you let me know if this solves the issue for you?

kloczek commented 5 months ago

BTW, about those two units which require Firefox: do you have any recommendation on how to pass those units in batch mode? 🤔

Indeed tests/test_utilities.py::test_color_brewer_extendability is OK. On my build system it took about 20 minutes to finish that single unit.

Conengmo commented 5 months ago

That’s not acceptable, we should improve that.

I'm not sure at the moment how to run those browser tests in batch mode…
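One common way to run Selenium-backed tests without a display is headless Firefox. Whether branca's iframe tests can be pointed at a driver configured this way depends on how they are written, so the snippet below is only a standalone sketch.

```python
# Standalone sketch: start Firefox headless via Selenium so browser-backed
# tests can run on a machine without a display. Requires Firefox and
# geckodriver; "about:blank" is just a placeholder page, not a branca test.
from selenium import webdriver

options = webdriver.FirefoxOptions()
options.add_argument("-headless")  # run without opening a window

driver = webdriver.Firefox(options=options)
try:
    driver.get("about:blank")
    print(driver.title)
finally:
    driver.quit()
```

Running the whole suite under xvfb-run is another way to get the same effect without touching the tests.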

kloczek commented 5 months ago

Usually pytest works in batch mode. I think it would be better to mark those units with a pytest marker that is disabled by default.
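A hedged sketch of that idea, using hypothetical names (a "browser" marker and a "--run-browser" flag); the suite already deselects a "network" marker in the output above, so the project may prefer to reuse that mechanism instead.

```python
# conftest.py sketch (hypothetical names): browser tests are skipped by
# default and only run when --run-browser is passed explicitly.
import pytest


def pytest_addoption(parser):
    parser.addoption(
        "--run-browser",
        action="store_true",
        default=False,
        help="run tests that need a real Firefox/Selenium setup",
    )


def pytest_configure(config):
    config.addinivalue_line("markers", "browser: test needs a real browser")


def pytest_collection_modifyitems(config, items):
    if config.getoption("--run-browser"):
        return  # opt-in given: leave everything selected
    skip_browser = pytest.mark.skip(reason="needs --run-browser")
    for item in items:
        if "browser" in item.keywords:
            item.add_marker(skip_browser)
```

The browser tests would then carry @pytest.mark.browser, and a packaging build that runs plain pytest would skip them automatically.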

Conengmo commented 5 months ago

I profiled the test case and what's making it particularly slow is the _scale nested function inside the linear_gradient function. But rather than rewriting that in a more efficient way, let's first see if we can't reduce the number of test cases this test goes through.
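For reference, a hedged sketch of one way to reproduce that measurement with the standard library profiler; the scheme name and size range are illustrative, not the test's actual inputs.

```python
# Sketch: profile repeated color_brewer calls with cProfile to see where the
# time goes (per the comment above, in linear_gradient's nested _scale).
# The "viridis" scheme and the 3..253 size range are illustrative only.
import cProfile
import pstats

from branca.utilities import color_brewer


def exercise():
    for n in range(3, 254):
        color_brewer("viridis", n=n)


if __name__ == "__main__":
    cProfile.run("exercise()", "color_brewer.prof")
    pstats.Stats("color_brewer.prof").sort_stats("cumulative").print_stats(15)
```

pytest's own --durations=10 flag is a lighter-weight way to confirm which tests dominate the runtime.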