Hi @beew, thanks for the report! Could you please retry this on a completely fresh environment? It looks to me like you are missing the compiled components.
To help me diagnose, if you are using `conda`, please create and activate a fresh environment via `conda create --name gtda_tests python=3.7.8 -y && conda activate gtda_tests`, then clone the repo again in a new directory, `cd` to that directory, and again run `python -m pip install -e ".[dev]"`.
But I am not using conda; I have Python installed in a local folder and use a script to invoke it, which works fine for me. conda just pulls in too much junk and I would have to reinstall a lot of duplicate packages. I think a `git pull` followed by `git submodule sync` and `git submodule update --init --recursive` should be able to bring in all the components, no?
Thanks for the reply @beew. You could also use `virtualenv` to create a virtual environment if you prefer to avoid `conda`. If you prefer not to use environments at all, and to avoid worrying too much about submodules, I suggest that you simply re-run the editable install of `giotto-tda` (`python -m pip install -e ".[dev]"`) as described above.
Hi, just tried again; it turned out I was building with the wrong commands (`python setup.py build` and `python setup.py install`), so this time I did `python -m pip install -e ".[dev]"`.
I still get errors, but different ones:

pytest gtda
============================= test session starts ==============================
platform linux -- Python 3.7.8, pytest-6.0.0, py-1.9.0, pluggy-0.13.1
benchmark: 3.2.3 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/bernard/opt/python37/giotto-tda, configfile: setup.cfg
plugins: azurepipelines-0.8.0, benchmark-3.2.3, hypothesis-5.23.3, cov-2.10.1
collected 652 items / 5 errors / 647 selected
##vso[task.logissue type=warning;]Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
##vso[task.logissue type=warning;]Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
##vso[task.logissue type=warning;]Passing a class is deprecated since version 0.23 and won't be supported in 0.24.Please pass an instance instead.
##vso[task.logissue type=warning;]Passing a class is deprecated since version 0.23 and won't be supported in 0.24.Please pass an instance instead.
##vso[results.publish type=JUnit;runTitle='Pytest results';]/home/bernard/opt/python37/giotto-tda/test-output.xml
##vso[task.logissue type=error;]5 test(s) failed, 652 test(s) collected.
##vso[task.logissue type=warning;]Coverage XML was not created, skipping upload.
==================================== ERRORS ====================================
______________ ERROR collecting gtda/mapper/tests/test_cluster.py ______________
../lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:967: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:677: in _load_unlocked
???
<frozen importlib._bootstrap_external>:728: in exec_module
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
gtda/mapper/__init__.py:7: in <module>
from .nerve import Nerve
gtda/mapper/nerve.py:8: in <module>
import igraph as ig
../lib/python3.7/site-packages/igraph/__init__.py:8: in <module>
raise DeprecationWarning("To avoid name collision with the igraph project, "
E DeprecationWarning: To avoid name collision with the igraph project, this visualization library has been renamed to 'jgraph'. Please upgrade when convenient.
_______________ ERROR collecting gtda/mapper/tests/test_cover.py _______________
../lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:967: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:677: in _load_unlocked
???
<frozen importlib._bootstrap_external>:728: in exec_module
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
gtda/mapper/__init__.py:7: in <module>
from .nerve import Nerve
gtda/mapper/nerve.py:8: in <module>
import igraph as ig
../lib/python3.7/site-packages/igraph/__init__.py:8: in <module>
raise DeprecationWarning("To avoid name collision with the igraph project, "
E DeprecationWarning: To avoid name collision with the igraph project, this visualization library has been renamed to 'jgraph'. Please upgrade when convenient.
______________ ERROR collecting gtda/mapper/tests/test_filter.py _______________
../lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:967: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:677: in _load_unlocked
???
<frozen importlib._bootstrap_external>:728: in exec_module
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
gtda/mapper/__init__.py:7: in <module>
from .nerve import Nerve
gtda/mapper/nerve.py:8: in <module>
import igraph as ig
../lib/python3.7/site-packages/igraph/__init__.py:8: in <module>
raise DeprecationWarning("To avoid name collision with the igraph project, "
E DeprecationWarning: To avoid name collision with the igraph project, this visualization library has been renamed to 'jgraph'. Please upgrade when convenient.
_______________ ERROR collecting gtda/mapper/tests/test_nerve.py _______________
../lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:967: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:677: in _load_unlocked
???
<frozen importlib._bootstrap_external>:728: in exec_module
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
gtda/mapper/__init__.py:7: in <module>
from .nerve import Nerve
gtda/mapper/nerve.py:8: in <module>
import igraph as ig
../lib/python3.7/site-packages/igraph/__init__.py:8: in <module>
raise DeprecationWarning("To avoid name collision with the igraph project, "
E DeprecationWarning: To avoid name collision with the igraph project, this visualization library has been renamed to 'jgraph'. Please upgrade when convenient.
___________ ERROR collecting gtda/mapper/tests/test_visualization.py ___________
../lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:953: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
<frozen importlib._bootstrap>:1006: in _gcd_import
???
<frozen importlib._bootstrap>:983: in _find_and_load
???
<frozen importlib._bootstrap>:967: in _find_and_load_unlocked
???
<frozen importlib._bootstrap>:677: in _load_unlocked
???
<frozen importlib._bootstrap_external>:728: in exec_module
???
<frozen importlib._bootstrap>:219: in _call_with_frames_removed
???
gtda/mapper/__init__.py:7: in <module>
from .nerve import Nerve
gtda/mapper/nerve.py:8: in <module>
import igraph as ig
../lib/python3.7/site-packages/igraph/__init__.py:8: in <module>
raise DeprecationWarning("To avoid name collision with the igraph project, "
E DeprecationWarning: To avoid name collision with the igraph project, this visualization library has been renamed to 'jgraph'. Please upgrade when convenient.
=============================== warnings summary ===============================
/home/bernard/opt/python37/lib/python3.7/site-packages/jsonschema/compat.py:6
/home/bernard/opt/python37/lib/python3.7/site-packages/jsonschema/compat.py:6
/home/bernard/opt/python37/lib/python3.7/site-packages/jsonschema/compat.py:6: DeprecationWarning:
Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:420
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:420: FutureWarning:
Passing a class is deprecated since version 0.23 and won't be supported in 0.24.Please pass an instance instead.
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:488
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:488: FutureWarning:
Passing a class is deprecated since version 0.23 and won't be supported in 0.24.Please pass an instance instead.
-- Docs: https://docs.pytest.org/en/stable/warnings.html
-- generated xml file: /home/bernard/opt/python37/giotto-tda/test-output.xml ---
=========================== short test summary info ============================
ERROR gtda/mapper/tests/test_cluster.py - DeprecationWarning: To avoid name c...
ERROR gtda/mapper/tests/test_cover.py - DeprecationWarning: To avoid name col...
ERROR gtda/mapper/tests/test_filter.py - DeprecationWarning: To avoid name co...
ERROR gtda/mapper/tests/test_nerve.py - DeprecationWarning: To avoid name col...
ERROR gtda/mapper/tests/test_visualization.py - DeprecationWarning: To avoid ...
!!!!!!!!!!!!!!!!!!! Interrupted: 5 errors during collection !!!!!!!!!!!!!!!!!!!!
======================== 4 warnings, 5 errors in 1.84s =========================
You seem to have a visualization library called `igraph` installed, instead of or as well as `python-igraph`. So all your `igraph` imports in the `gtda` code are failing:

E   DeprecationWarning: To avoid name collision with the igraph project, this visualization library has been renamed to 'jgraph'. Please upgrade when convenient.

This suggests you should run `python -m pip install --upgrade igraph`, and then check that `python-igraph` is installed: `python -m pip install python-igraph`. Unless, of course, you don't care about the visualization library formerly called `igraph` and now called `jgraph` (which is irrelevant to `giotto-tda`), in which case you could just uninstall it.
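If it helps, here is a minimal sketch (nothing gtda-specific, and not the only way to do this) for checking which of these similarly named distributions is actually installed, without importing the broken one; it assumes Python 3.8+ or the `importlib_metadata` backport on 3.7:

```python
# List which of the similarly named graph packages are installed, without importing
# them (importing the old "igraph" visualization package raises the DeprecationWarning above).
from importlib import metadata  # on Python 3.7, use: import importlib_metadata as metadata

for dist in ("igraph", "python-igraph", "jgraph"):
    try:
        print(f"{dist}: {metadata.version(dist)}")
    except metadata.PackageNotFoundError:
        print(f"{dist}: not installed")
```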
Hi, it works (mostly) now. There is still one failed test:
=================================== FAILURES ===================================
____________________ test_hk_pi_big_sigma[PersistenceImage] ____________________
transformer_cls = <class 'gtda.diagrams.representations.PersistenceImage'>
@pytest.mark.parametrize('transformer_cls', [HeatKernel, PersistenceImage])
> @given(pts_gen, dims_gen)
def test_hk_pi_big_sigma(transformer_cls, pts, dims):
"""We expect that with a huge sigma, the diagrams are so diluted that they
E hypothesis.errors.MultipleFailures: Hypothesis found 2 distinct failures.
gtda/diagrams/tests/test_features_representations.py:253: MultipleFailures
---------------------------------- Hypothesis ----------------------------------
Falsifying example: test_hk_pi_big_sigma(
transformer_cls=gtda.diagrams.representations.PersistenceImage,
pts=array([[[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.]]]),
dims=array([[[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0]]]),
)
Traceback (most recent call last):
File "/home/bernard/opt/python37/giotto-tda/gtda/diagrams/tests/test_features_representations.py", line 267, in test_hk_pi_big_sigma
assert max_hk_abs_value <= 1e-4
AssertionError: assert 4.982160818421944e+22 <= 0.0001
Falsifying example: test_hk_pi_big_sigma(
transformer_cls=gtda.diagrams.representations.PersistenceImage,
pts=array([[[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.],
[1., 1.]]]),
dims=array([[[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0]]]),
)
Traceback (most recent call last):
File "/home/bernard/opt/python37/giotto-tda/gtda/diagrams/tests/test_features_representations.py", line 264, in test_hk_pi_big_sigma
X_t = hk.fit_transform(X)
File "/home/bernard/opt/python37/giotto-tda/gtda/utils/_docs.py", line 106, in fit_transform_wrapper
return original_fit_transform(*args, **kwargs)
File "/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/base.py", line 690, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/home/bernard/opt/python37/giotto-tda/gtda/diagrams/representations.py", line 915, in transform
for dim in self.homology_dimensions_
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/parallel.py", line 1029, in __call__
if self.dispatch_one_batch(iterator):
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/parallel.py", line 847, in dispatch_one_batch
self._dispatch(tasks)
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/parallel.py", line 765, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/_parallel_backends.py", line 572, in __init__
self.results = batch()
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/parallel.py", line 253, in __call__
for func, args, kwargs in self.items]
File "/home/bernard/opt/python37/lib/python3.7/site-packages/joblib/parallel.py", line 253, in <listcomp>
for func, args, kwargs in self.items]
File "/home/bernard/opt/python37/giotto-tda/gtda/diagrams/_metrics.py", line 148, in persistence_images
_sample_image(image, diagram_nontrivial_pixel_coords)
File "/home/bernard/opt/python37/giotto-tda/gtda/diagrams/_utils.py", line 60, in _sample_image
image[unique] = counts
IndexError: index -9223372036854775808 is out of bounds for axis 1 with size 10
=============================== warnings summary ===============================
/home/bernard/opt/python37/lib/python3.7/site-packages/jsonschema/compat.py:6
/home/bernard/opt/python37/lib/python3.7/site-packages/jsonschema/compat.py:6
/home/bernard/opt/python37/lib/python3.7/site-packages/jsonschema/compat.py:6: DeprecationWarning:
Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3,and in 3.9 it will stop working
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:420
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:420: FutureWarning:
Passing a class is deprecated since version 0.23 and won't be supported in 0.24.Please pass an instance instead.
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:488
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:488: FutureWarning:
Passing a class is deprecated since version 0.23 and won't be supported in 0.24.Please pass an instance instead.
gtda/diagrams/tests/test_features_representations.py::test_hk_pi_big_sigma[PersistenceImage]
/home/bernard/opt/python37/giotto-tda/gtda/diagrams/_metrics.py:139: RuntimeWarning:
divide by zero encountered in double_scalars
gtda/diagrams/tests/test_features_representations.py::test_hk_pi_big_sigma[PersistenceImage]
/home/bernard/opt/python37/giotto-tda/gtda/diagrams/_metrics.py:145: RuntimeWarning:
invalid value encountered in true_divide
gtda/tests/test_common.py::test_sklearn_api[Binarizer()-check_n_features_in]
gtda/tests/test_common.py::test_sklearn_api[Inverter()-check_n_features_in]
/home/bernard/opt/python37/lib/python3.7/site-packages/sklearn/utils/estimator_checks.py:3014: FutureWarning:
As of scikit-learn 0.23, estimators should expose a n_features_in_ attribute, unless the 'no_validation' tag is True. This attribute should be equal to the number of features passed to the fit method. An error will be raised from version 0.25 when calling check_estimator(). See SLEP010: https://scikit-learn-enhancement-proposals.readthedocs.io/en/latest/slep010/proposal.html
-- Docs: https://docs.pytest.org/en/stable/warnings.html
-- generated xml file: /home/bernard/opt/python37/giotto-tda/test-output.xml ---
=========================== short test summary info ============================
XFAIL gtda/tests/test_common.py::test_sklearn_api[Binarizer()-check_transformer_general]
known failure
XFAIL gtda/tests/test_common.py::test_sklearn_api[Binarizer()-check_transformer_general(readonly_memmap=True)]
known failure
XFAIL gtda/tests/test_common.py::test_sklearn_api[Inverter()-check_transformer_general]
known failure
XFAIL gtda/tests/test_common.py::test_sklearn_api[Inverter()-check_transformer_general(readonly_memmap=True)]
known failure
XPASS gtda/tests/test_common.py::test_sklearn_api[Binarizer()-check_transformer_data_not_an_array] known failure
XPASS gtda/tests/test_common.py::test_sklearn_api[Inverter()-check_transformer_data_not_an_array] known failure
FAILED gtda/diagrams/tests/test_features_representations.py::test_hk_pi_big_sigma[PersistenceImage]
= 1 failed, 691 passed, 4 xfailed, 2 xpassed, 8 warnings in
I am wondering if numpy >= 1.19.1 is necessary because of this: https://github.com/numpy/numpy/issues/16769. The test was done with a numpy binary installed with pip (not compiled against MKL).
Thanks for the help.
@beew, I'm glad things are basically OK now. The problem you are observing is numerical: it's due to the fact that `hypothesis` generates a huge number of examples for us, many of them quite pathological (this helps us find corner-case failures and fix them!). I think in your case it is producing particularly large values that `numpy` is not happy to handle. It might not even happen on a second run (`hypothesis` produces examples stochastically). We don't observe such issues in our CI on `manylinux2010` docker images with Python 3.7 and numpy 1.19.1 installed, although I was observing them for a while while working on #454 and had to tune the test parameters a bit to avoid them in our CI and on my machines. What version of `numpy` do you have installed? I am not familiar with the issue you linked; thanks a lot for flagging it.
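For illustration only, this is a hedged sketch of what I mean by tuning the example generation; the real `pts_gen` strategy in the `gtda` test suite is defined differently, so treat the names and bounds below as placeholders:

```python
# Hypothetical example of constraining a hypothesis strategy so that numerically
# extreme or degenerate inputs (like the all-ones diagram above) appear less often.
import numpy as np
from hypothesis import given, settings
from hypothesis.extra.numpy import arrays
from hypothesis.strategies import floats

# Bounded, NaN/inf-free floats keep the generated arrays numerically tame.
pts_gen = arrays(
    dtype=np.float64,
    shape=(1, 20, 2),
    elements=floats(min_value=0.0, max_value=10.0,
                    allow_nan=False, allow_infinity=False),
)

@settings(max_examples=50)  # fewer examples also lowers the odds of a pathological draw
@given(pts=pts_gen)
def test_points_are_bounded(pts):
    assert np.all((pts >= 0.0) & (pts <= 10.0))
```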
I just tested with numpy 1.18.5 (by editing requirements.txt): 57 tests failed, all because of "ValueError: cannot reshape array of size 0 into shape (0,newaxis)".
The issue with numpy 1.19.1 is not critical: it failed a test when compiled against MKL (and it failed three tests instead of one for gtda with the stock numpy installed with pip).
giotto-tda installed with pip only needs numpy >= 1.17.0. Is there any way to test a pip install?
@beew I'm not sure I fully follow. `numpy` is now required to be at least 1.19.1 on `master` (https://github.com/giotto-ai/giotto-tda/blob/master/requirements.txt), and indeed there are no guarantees with lower versions such as 1.18.5. What version do you have installed when you test the dev install of `gtda`? Does the error appear on subsequent runs, or is it sporadic?

> giotto-tda installed with pip only needs numpy >= 1.17.0. Is there any way to test a pip install?

I don't understand. Do you mean `giotto-tda` installed from PyPI (with `pip`)? Version 0.2.2 is now quite far behind `master`; we are preparing a large 0.3.0 release in the next couple of weeks.
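In the meantime, a quick way to confirm which `numpy` your dev install actually picks up (just a generic check, nothing specific to `gtda`):

```python
# Print the version and location of the numpy that the test environment imports;
# master currently requires numpy >= 1.19.1.
import numpy
print(numpy.__version__, numpy.__file__)
```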
@ulupo All tests above were done with numpy 1.19.1 on master.
BTW I reran the tests and this time I ran out of memory:
________________________________________________________________________________________ test_large_hk_shape_multithreaded _________________________________________________________________________________________
def test_large_hk_shape_multithreaded():
"""Test that HeatKernel returns something of the right shape when the input
array is at least 1MB and more than 1 process is used, triggering joblib's
use of memmaps"""
X = np.linspace(0, 100, 300000)
n_bins = 10
diagrams = np.expand_dims(
np.stack([X, X, np.zeros(len(X))]).transpose(), axis=0
)
hk = HeatKernel(sigma=1, n_bins=n_bins, n_jobs=2)
num_dimensions = 1
> x_t = hk.fit_transform(diagrams)
gtda/diagrams/tests/test_features_representations.py:301:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
gtda/utils/_docs.py:106: in fit_transform_wrapper
return original_fit_transform(*args, **kwargs)
../lib/python3.7/site-packages/sklearn/base.py:690: in fit_transform
return self.fit(X, **fit_params).transform(X)
gtda/diagrams/representations.py:664: in transform
for dim in self.homology_dimensions_
../lib/python3.7/site-packages/joblib/parallel.py:954: in __call__
n_jobs = self._initialize_backend()
../lib/python3.7/site-packages/joblib/parallel.py:722: in _initialize_backend
**self._backend_args)
../lib/python3.7/site-packages/joblib/_parallel_backends.py:497: in configure
context_id=parallel._id, **memmappingexecutor_args)
../lib/python3.7/site-packages/joblib/executor.py:20: in get_memmapping_executor
return MemmappingExecutor.get_memmapping_executor(n_jobs, **kwargs)
../lib/python3.7/site-packages/joblib/executor.py:42: in get_memmapping_executor
manager = TemporaryResourcesManager(temp_folder)
../lib/python3.7/site-packages/joblib/_memmapping_reducer.py:531: in __init__
self.set_current_context(context_id)
../lib/python3.7/site-packages/joblib/_memmapping_reducer.py:535: in set_current_context
self.register_new_context(context_id)
../lib/python3.7/site-packages/joblib/_memmapping_reducer.py:560: in register_new_context
self.register_folder_finalizer(new_folder_path, context_id)
../lib/python3.7/site-packages/joblib/_memmapping_reducer.py:590: in register_folder_finalizer
resource_tracker.register(pool_subfolder, "folder")
../lib/python3.7/site-packages/joblib/externals/loky/backend/resource_tracker.py:190: in register
self.ensure_running()
../lib/python3.7/site-packages/joblib/externals/loky/backend/resource_tracker.py:162: in ensure_running
pid = spawnv_passfds(exe, args, fds_to_pass)
../lib/python3.7/site-packages/joblib/externals/loky/backend/resource_tracker.py:368: in spawnv_passfds
return fork_exec(args, _pass)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cmd = ['/home/bernard/opt/python37/bin/python3.7', '-c', 'from joblib.externals.loky.backend.resource_tracker import main; main(12, False)'], keep_fds = [8, 12], env = {}
def fork_exec(cmd, keep_fds, env=None):
# copy the environment variables to set in the child process
env = {} if env is None else env
child_env = os.environ.copy()
child_env.update(env)
> pid = os.fork()
E OSError: [Errno 12] Cannot allocate memory
../lib/python3.7/site-packages/joblib/externals/loky/backend/fork_exec.py:43: OSError
I have 32 GB of RAM.
@beew thanks. Perhaps our tests can be quite extreme on some systems, such as yours. I experience no problems on a 2017 Windows (!) laptop with only 16 GB RAM, or on a more modern MacBook with 16 GB RAM, but the OS could be playing a part too. You can tweak the parameters to avoid ending up testing such extreme scenarios. In particular, in both tests you can try lowering the values of `sigma` somewhat (of course, not too much in the case of `test_hk_pi_big_sigma` without also loosening the tolerance in the assertion, e.g. from `1e-4` to `1e-3`).
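Purely as an illustration (a toy diagram, not the actual `test_hk_pi_big_sigma` code, which uses hypothesis-generated inputs), the kind of local tweak I have in mind looks like this:

```python
# Hypothetical sketch of the suggested tweak: a somewhat smaller sigma, and a looser
# tolerance when checking that a large sigma dilutes the heat kernel values.
import numpy as np
from gtda.diagrams import HeatKernel

# One toy persistence diagram: each row is (birth, death, homology dimension).
diagram = np.array([[[0.0, 1.0, 0.0],
                     [0.0, 2.0, 0.0]]])

hk = HeatKernel(sigma=100, n_bins=10, n_jobs=1)  # n_jobs=1 also sidesteps joblib memmapping
X_t = hk.fit_transform(diagram)

max_abs = np.max(np.abs(X_t))
print(max_abs)  # one would then assert, e.g., max_abs <= 1e-3 instead of 1e-4
```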
@beew are you happy with the suggestions above?
@ulupo
Hi, I tried building again from a fresh git pull yesterday and everything worked. It looks like you have removed the problematic test? Anyway, I won't have access to that laptop for a month, and the temporary replacement I have now is quite weak for testing anything.
Hi @beew. I'm glad things are working out perfectly for you. We did not modify any test whatsoever; as I was saying above, the issue you were experiencing almost surely has to do with pathological example generation by `hypothesis` (which is a good thing, despite the occasional nuisance). I hope you enjoy using and developing on Giotto-tda. I'll close the issue.
I see. I tried it on the replacement machine with only 4 GB of RAM and it works too, though a bit slowly. Thanks for the help!
I built and installed giotto-tda successfully, but running `pytest gtda` resulted in a lot of errors because 'gtda.externals.modules' is missing from the test suite.
Python 3.7.8 on Ubuntu; gtda compiled from master.