pySCENIC is a lightning-fast Python implementation of the SCENIC pipeline (Single-Cell rEgulatory Network Inference and Clustering), which enables biologists to infer transcription factors, gene regulatory networks, and cell types from single-cell RNA-seq data.
Hi All,
I have followed the instructions and downloaded all the listed data from https://pyscenic.readthedocs.io/en/latest/tutorial.html. However, at the step "adjacencies = grnboost2(ex_matrix, tf_names=tf_names, verbose=True)" I got the following error:
distributed.worker - WARNING - Memory use is high but worker has no data to store to disk. Perhaps some other process is leaking memory? Process memory: 692.94 MB -- Worker memory limit: 858.99 MB
distributed.worker - ERROR - Tried to delete %s but no file found
Traceback (most recent call last):
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/worker.py", line 2308, in release_key
del self.data[key]
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/zict/buffer.py", line 97, in __delitem__
del self.slow[key]
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/zict/func.py", line 47, in __delitem__
del self.d[key]
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/zict/file.py", line 98, in __delitem__
os.remove(os.path.join(self.directory, _safe_key(key)))
FileNotFoundError: [Errno 2] No such file or directory: '/scratch/users/ydzhao/SCENIC/Resources/dask-worker-space/worker-9070u87c/storage/list-01ef2a72e3d1a12779e18fe5915fc3f4'
distributed.worker - ERROR - Tried to delete %s but no file found
Traceback (most recent call last):
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/worker.py", line 2308, in release_key
del self.data[key]
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/zict/buffer.py", line 97, in __delitem__
del self.slow[key]
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/zict/func.py", line 47, in __delitem__
del self.d[key]
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/zict/file.py", line 98, in __delitem__
os.remove(os.path.join(self.directory, _safe_key(key)))
FileNotFoundError: [Errno 2] No such file or directory: '/scratch/users/ydzhao/SCENIC/Resources/dask-worker-space/worker-9070u87c/storage/ndarray-39839034a5229025de3b4ac1993cfb1e'
distributed.worker - WARNING - Memory use is high but worker has no data to store to disk. Perhaps some other process is leaking memory? Process memory: 692.98 MB -- Worker memory limit: 858.99 MB
finished
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/arboreto/algo.py", line 41, in grnboost2
early_stop_window_length=early_stop_window_length, limit=limit, seed=seed, verbose=verbose)
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/arboreto/algo.py", line 135, in diy
.compute(graph, sync=True) \
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/client.py", line 2859, in compute
result = self.gather(futures)
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/client.py", line 1969, in gather
asynchronous=asynchronous,
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/client.py", line 838, in sync
self.loop, func, *args, callback_timeout=callback_timeout, **kwargs
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/utils.py", line 351, in sync
raise exc.with_traceback(tb)
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/utils.py", line 334, in f
result[0] = yield future
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/tornado/gen.py", line 762, in run
value = future.result()
File "/oak/stanford/groups/howchang/users/ydzhao/Conda_Library/scenic_protocol/lib/python3.6/site-packages/distributed/client.py", line 1828, in _gather
raise exception.with_traceback(traceback)
distributed.scheduler.KilledWorker: ('str-80816c51b7ea3ba2b28497870ad2f1b2', <Worker 'tcp://127.0.0.1:35552', name: 0, memory: 0, processing: 35141>)
It seems the memory is running out. I am running this on an HPC with 10 cores and 32 GB of RAM per core. Is there any way to solve this? I have also attached my session information below.
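A quick sanity check on the numbers in the log, under the assumption that grnboost2's default local Dask client started one worker per core and that `LocalCluster` split the memory it detected evenly across workers (the default behaviour). The reported per-worker limit of 858.99 MB would then imply Dask saw only about 8.6 GB in total, far short of 10 x 32 GB, which would point at Dask auto-detecting the wrong memory budget (e.g. a cgroup or login-node limit) rather than an actual shortage:

```python
# Assumption: dask's default LocalCluster gives each worker an equal share
# of the total memory it detects, i.e. per_worker_limit = detected / n_workers.
def implied_total_memory(per_worker_limit_bytes: float, n_workers: int) -> float:
    """Invert dask's default per-worker memory split."""
    return per_worker_limit_bytes * n_workers

# The log reports a worker memory limit of 858.99 MB; with 10 workers
# (one per core) that implies dask detected roughly 8.6 GB in total.
detected = implied_total_memory(858.99e6, 10)
print(f"{detected / 1e9:.2f} GB")  # ~8.59 GB
```

If that arithmetic holds, raising the per-worker `memory_limit` explicitly (rather than relying on auto-detection) would be the thing to try first.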
I am also getting this error. I have tried setting up pySCENIC through a conda environment, Docker, and Singularity, and I get the same errors as you. I changed the number of workers and it still does not work.
session_info.show()
arboreto        NA
ctxcore         0.1.1
dask            2021.03.0
natsort         7.1.1
numpy           1.19.5
pandas          1.1.5
pyscenic        0.11.2
seaborn         0.11.1
session_info    1.0.0

Python 3.6.13 | packaged by conda-forge | (default, Feb 19 2021, 05:36:01) [GCC 9.3.0]
Linux-3.10.0-1160.36.2.el7.x86_64-x86_64-with-centos-7.9.2009-Core
Session information updated at 2021-08-04 22:05
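In case it helps others hitting this: one workaround worth trying is to create the Dask cluster yourself with an explicit per-worker `memory_limit` that matches your job's real allocation, and hand the resulting client to `grnboost2` through its `client_or_address` parameter, instead of letting it spin up a default `LocalCluster` whose auto-detected memory may be wrong. This is a sketch, not a verified fix; the worker count, memory limit, and variable names below are illustrative and should be adapted to your allocation:

```python
# Sketch: run GRNBoost2 against a manually configured Dask cluster so the
# per-worker memory limit reflects the job's actual allocation rather than
# whatever dask auto-detects. Values below are placeholders.
def run_grnboost2(ex_matrix, tf_names, n_workers=5, memory_limit="6GB"):
    # Imports live inside the function so the sketch can be defined even
    # where dask/arboreto are not installed.
    from distributed import Client, LocalCluster
    from arboreto.algo import grnboost2

    cluster = LocalCluster(
        n_workers=n_workers,        # fewer workers => a bigger slice each
        threads_per_worker=1,
        memory_limit=memory_limit,  # explicit cap instead of auto-detect
    )
    client = Client(cluster)
    try:
        # client_or_address makes grnboost2 use this cluster instead of
        # creating its own default LocalCluster.
        return grnboost2(ex_matrix, tf_names=tf_names,
                         client_or_address=client, verbose=True)
    finally:
        client.close()
        cluster.close()
```

Reducing `n_workers` (so each worker gets a larger memory slice) and pointing the Dask worker space at a writable scratch directory are also worth trying if the limit alone does not help.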