Open vdet opened 1 year ago
@vdet,
Can you provide a few lines of Python demonstrating how exactly you are creating and configuring the Dask cluster? We have indeed been troubleshooting a bug where UCX (or NCCL) appears to be deadlocking and it'll help if we can determine if this is related.
sure. I use this config:
os.environ["DASK_RMM__POOL_SIZE"] = "1GB"
os.environ["DASK_UCX__CUDA_COPY"] = "True"
os.environ["DASK_UCX__TCP"] = "True"
os.environ["DASK_UCX__NVLINK"] = "True"
os.environ["DASK_UCX__INFINIBAND"] = "True"
os.environ["DASK_UCX__NET_DEVICES"] = "ib0"
Note that when I load the libraries I get some warnings:
import cuml
import cupy
from dask.distributed import Client
import dask.array as da
from dask_cuda import LocalCUDACluster
from cuml.dask.neighbors import NearestNeighbors
/opt/conda/envs/rapids/lib/python3.9/site-packages/dask/config.py:660: UserWarning: Configuration key "ucx.cuda_copy" has been deprecated. Please use "distributed.ucx.cuda_copy" instead
warnings.warn(
/opt/conda/envs/rapids/lib/python3.9/site-packages/dask/config.py:660: UserWarning: Configuration key "ucx.tcp" has been deprecated. Please use "distributed.ucx.tcp" instead
warnings.warn(
/opt/conda/envs/rapids/lib/python3.9/site-packages/dask/config.py:660: UserWarning: Configuration key "ucx.nvlink" has been deprecated. Please use "distributed.ucx.nvlink" instead
warnings.warn(
/opt/conda/envs/rapids/lib/python3.9/site-packages/dask/config.py:660: UserWarning: Configuration key "ucx.infiniband" has been deprecated. Please use "distributed.ucx.infiniband" instead
warnings.warn(
The client is created with
client = Client(LocalCUDACluster())
It does detect all 8 GPUs, and all are indeed used.
These commands
distances, indices = model.kneighbors(da_df)
distances, indices = client.compute([distances, indices])
produce warnings that could be relevant:
2022-12-05 17:06:31,691 - distributed.scheduler - WARNING - Worker failed to heartbeat within 300 seconds. Closing: <WorkerState 'tcp://127.0.0.1:34329', name: 0, status: running, memory: 2, processing: 1>
2022-12-05 17:06:31,695 - distributed.scheduler - WARNING - Worker failed to heartbeat within 300 seconds. Closing: <WorkerState 'tcp://127.0.0.1:35865', name: 5, status: running, memory: 2, processing: 1>
2022-12-05 17:06:31,697 - distributed.scheduler - WARNING - Worker failed to heartbeat within 300 seconds. Closing: <WorkerState 'tcp://127.0.0.1:39259', name: 3, status: running, memory: 2, processing: 1>
2022-12-05 17:06:31,700 - distributed.scheduler - WARNING - Worker failed to heartbeat within 300 seconds. Closing: <WorkerState 'tcp://127.0.0.1:41763', name: 2, status: running, memory: 2, processing: 1>
2022-12-05 17:06:50,610 - distributed.nanny - WARNING - Worker process still alive after 3.1999963378906253 seconds, killing
2022-12-05 17:06:50,611 - distributed.nanny - WARNING - Worker process still alive after 3.1999980163574224 seconds, killing
2022-12-05 17:06:50,613 - distributed.nanny - WARNING - Worker process still alive after 3.1999960327148442 seconds, killing
2022-12-05 17:06:50,614 - distributed.nanny - WARNING - Worker process still alive after 3.1999981689453127 seconds, killing
2022-12-05 17:06:52,257 - distributed.nanny - ERROR - Error in Nanny killing Worker subprocess
Traceback (most recent call last):
File "/opt/conda/envs/rapids/lib/python3.9/asyncio/tasks.py", line 490, in wait_for
return fut.result()
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/nanny.py", line 603, in close
await self.kill(timeout=timeout, reason=reason)
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/nanny.py", line 388, in kill
await self.process.kill(reason=reason, timeout=0.8 * (deadline - time()))
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/nanny.py", line 835, in kill
await process.join(max(0, deadline - time()))
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/process.py", line 316, in join
await asyncio.wait_for(asyncio.shield(self._exit_future), timeout)
File "/opt/conda/envs/rapids/lib/python3.9/asyncio/tasks.py", line 492, in wait_for
raise exceptions.TimeoutError() from exc
asyncio.exceptions.TimeoutError
2022-12-05 17:06:52,260 - distributed.nanny - ERROR - Error in Nanny killing Worker subprocess
Traceback (most recent call last):
File "/opt/conda/envs/rapids/lib/python3.9/asyncio/tasks.py", line 490, in wait_for
return fut.result()
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/nanny.py", line 603, in close
await self.kill(timeout=timeout, reason=reason)
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/nanny.py", line 388, in kill
await self.process.kill(reason=reason, timeout=0.8 * (deadline - time()))
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/nanny.py", line 835, in kill
await process.join(max(0, deadline - time()))
File "/opt/conda/envs/rapids/lib/python3.9/site-packages/distributed/process.py", line 316, in join
await asyncio.wait_for(asyncio.shield(self._exit_future), timeout)
File "/opt/conda/envs/rapids/lib/python3.9/asyncio/tasks.py", line 492, in wait_for
raise exceptions.TimeoutError() from exc
asyncio.exceptions.TimeoutError
2022-12-05 17:06:52,260 - distributed.nanny - ERROR - Error in Nanny killing Worker subprocess
Traceback (most recent call last):
File "/opt/conda/envs/rapids/lib/python3.9/asyncio/tasks.py", line 490, in wait_for
return fut.result()
asyncio.exceptions.CancelledError
[...]
Still, the GPUs are launched; the computation seems to hang after the GPUs have done their jobs and their memory is emptied (nvidia-smi reporting 2M per GPU, down from ~18000M).
@pentschev this looks exactly like what we were seeing in the pytests last week.
@vdet out of curiosity, what versions of NCCL and UCX do you have installed? (conda env list if using conda.)
Assuming you're only creating a cluster as in https://github.com/rapidsai/cuml/issues/5056#issuecomment-1338080046 , i.e., LocalCUDACluster(), then all the environment variables you set above are irrelevant because you didn't specify protocol="ucx". That would effectively mean the hang can't be due to UCX.
@cjnolet I think we discussed that last week and you said you only use UCX if Dask also uses UCX, but could you confirm that again?
@pentschev, we do use UCX in cuML even if Dask has not been configured with it; however, we also rely on Dask to configure UCX unless the corresponding configuration values are specified manually by the user (by using UCXPY_..._ for example).
@vdet, as a side note, the warnings come from changes in the dask-cuda API. Parameters can be set in the following ways:
os.environ["DASK_DISTRIBUTED__COMM__UCX__TCP"] = "True"
or distributed.comm.ucx.tcp = True in the Dask configuration,
or with specific arguments to LocalCUDACluster.
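For the third option, here is a minimal sketch of passing the transport flags directly to the cluster constructor. The keyword names are assumptions based on the dask-cuda documentation around this release; check them against your installed version:

```python
# Hypothetical sketch: build the UCX-related kwargs for
# dask_cuda.LocalCUDACluster instead of setting environment variables.
# The parameter names below are assumptions to verify against dask-cuda.
def ucx_cluster_kwargs(nvlink=True, infiniband=True):
    return {
        "protocol": "ucx",            # without this, UCX is not used at all
        "enable_tcp_over_ucx": True,
        "enable_nvlink": nvlink,
        "enable_infiniband": infiniband,
    }

# Usage (on a machine with dask_cuda installed):
#   from dask_cuda import LocalCUDACluster
#   cluster = LocalCUDACluster(**ucx_cluster_kwargs())
```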
If the computations themselves completed fine, it might also simply be a matter of properly shutting the client and cluster down in some instances:
client.shutdown()
cluster.close()
Also, looking at the errors, there might be issues in the NVLink and InfiniBand settings on your machine. Could you try disabling all options?
@viclafargue can you verify that you are able to consistently run the nearest neighbors (and DBSCAN) pytests successfully on multiple GPUs? @pentschev and I were reproducing the same behavior reported in this issue by running the pytests alone on CUDA 11.xx. I think this is a legitimate problem that popped up recently in either Dask, UCX, or NCCL. AFAIK, the spark-cuml folks are not experiencing this issue.
Ok, my bad in this case, but I wasn't able to reproduce the issue on my side (2-GPU machine, CUDA 11.4), neither with the pytests nor with a small script of my own. The only thing I could notice was the CUDA-context issue that was discussed internally:
distributed.comm.ucx - WARNING - A CUDA context for device 0 (b'GPU-c011dfcc-17f7-05b7-0826-983e4122e07e') already exists on process ID 38936. This is often the result of a CUDA-enabled library calling a CUDA runtime function before Dask-CUDA can spawn worker processes. Please make sure any such function calls don't happen at import time or in the global scope of a program.
Thank you all for your help!
kneighbors actually completes if the size of the query is smaller than the batch_size in
model = cuDistNearestNeighbors(client=client, n_neighbors=15, batch_size=1000000)
So the problem arises when collating the results from different batches. I simply broke my query into smaller pieces, then concatenated the results, so my problem is solved for now. I got the much-needed graph in 23 min (vs. no completion after several days with sklearn; the dataset is too big for single-GPU cuml). Thanks for this amazing code.
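The workaround described here (breaking the query into pieces smaller than batch_size and concatenating the results) can be sketched generically. The splitting helper below is plain Python; the callable it wraps stands in for the thread's kneighbors call:

```python
# Hypothetical sketch of the workaround: run a kneighbors-style callable
# on query slices, then collate the per-slice results.
def query_in_pieces(kneighbors, query, piece_size):
    """kneighbors: callable returning (distances, indices) for a slice;
    query: sliceable array-like; piece_size: rows per piece."""
    distances, indices = [], []
    for start in range(0, len(query), piece_size):
        d, i = kneighbors(query[start:start + piece_size])
        distances.extend(d)
        indices.extend(i)
    return distances, indices
```

With cupy or dask arrays one would collect the per-piece results and join them with cupy.concatenate rather than extending Python lists.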
@viclafargue, the warnings are gone thanks to your suggestion. I too get
distributed.comm.ucx - WARNING - A CUDA context for device 0 (b'GPU-c011dfcc-17f7-05b7-0826-983e4122e07e') already exists on process ID 38936. This is often the result of a CUDA-enabled library calling a CUDA runtime function before Dask-CUDA can spawn worker processes. Please make sure any such function calls don't happen at import time or in the global scope of a program.
even after killing/restarting my notebook.
@cjnolet, I am working from a recently released nightly container, installed last week with:
sudo singularity build rapidsai.sif docker://nvcr.io/nvidia/rapidsai/rapidsai-core-dev:22.10-cuda11.5-devel-ubuntu20.04-py3.9
I am no conda expert, to say the least; the command you suggest does not seem to provide useful information:
(rapids) vdet@gpu03:~$ conda env list
# conda environments:
#
base /opt/conda
rapids * /opt/conda/envs/rapids
Any way to get the info from within python?
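One stdlib-only option for version lookups from within Python (a sketch; conda list, suggested just below, remains the authoritative answer, since importlib.metadata only sees installed Python distributions, not C libraries like NCCL or UCX themselves):

```python
# Sketch: look up installed Python package versions from within Python.
from importlib.metadata import version, PackageNotFoundError

def pkg_version(name):
    """Return the installed version string, or None if not installed."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None

# e.g. pkg_version("distributed"), pkg_version("ucx-py").
# For NCCL itself, something like cupy.cuda.nccl.get_version() may work,
# but conda list covers the non-Python libraries as well.
```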
kneighbors actually completes if the size of the query is smaller than the batch_size in
Great! I'm glad you are able to run this now! We still have some work to do to isolate the root cause here and I still think it might be related to some other hangs we're seeing in the pytests.
Any way to get the info from within python?
My apologies, @vdet. I meant to use conda list
When using Dask I would suggest moving away from manually setting environment variables. Dask now has the ability to use UCX defaults, which is less error-prone and simpler to set up; we have docs explaining that for LocalCUDACluster and dask-cuda-worker.
So the problem arises when collating the results from different batches. I simply broke my query in smaller pieces, then concatenated the results.
One thing I notice from running some of my own tests is that the heartbeats from the workers appear to stall while they are computing each task. This makes me wonder if the tasks are causing the worker timeouts once the processing time becomes large enough (300 seconds by default, I believe).
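If long tasks are indeed starving the heartbeat, one knob to test is raising that timeout before the cluster starts. The config key (distributed.scheduler.worker-ttl, 300 s by default) and its environment-variable spelling are assumptions to verify against the distributed documentation:

```python
# Sketch: raise the scheduler's worker heartbeat timeout via Dask's
# environment-variable config mechanism, before the cluster is created.
# The variable name is an assumption to check against the distributed docs:
# DASK_DISTRIBUTED__SCHEDULER__WORKER_TTL should map to the config key
# distributed.scheduler.worker-ttl at startup.
import os

def raise_worker_ttl(seconds=3600):
    value = f"{seconds}s"
    os.environ["DASK_DISTRIBUTED__SCHEDULER__WORKER_TTL"] = value
    return value
```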
@cjnolet here is the result of the command:
(rapids) vdet@gpu02:~$ conda list
# packages in environment at /opt/conda/envs/rapids:
#
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
aiobotocore 2.4.0 pyhd8ed1ab_0 conda-forge
aiohttp 3.8.3 py39hb9d737c_1 conda-forge
aioitertools 0.11.0 pyhd8ed1ab_0 conda-forge
aiosignal 1.3.1 pyhd8ed1ab_0 conda-forge
anyio 3.6.2 pyhd8ed1ab_0 conda-forge
aom 3.5.0 h27087fc_0 conda-forge
appdirs 1.4.4 pyh9f0ad1d_0 conda-forge
argon2-cffi 21.3.0 pyhd8ed1ab_0 conda-forge
argon2-cffi-bindings 21.2.0 py39hb9d737c_3 conda-forge
arrow-cpp 9.0.0 py39hd3ccb9b_2_cpu conda-forge
async-timeout 4.0.2 pyhd8ed1ab_0 conda-forge
attrs 22.1.0 pyh71513ae_1 conda-forge
aws-c-cal 0.5.11 h95a6274_0 conda-forge
aws-c-common 0.6.2 h7f98852_0 conda-forge
aws-c-event-stream 0.2.7 h3541f99_13 conda-forge
aws-c-io 0.10.5 hfb6a706_0 conda-forge
aws-checksums 0.1.11 ha31a3da_7 conda-forge
aws-sdk-cpp 1.8.186 hecaee15_4 conda-forge
babel 2.11.0 pyhd8ed1ab_0 conda-forge
backcall 0.2.0 pyh9f0ad1d_0 conda-forge
backports 1.0 pyhd8ed1ab_3 conda-forge
backports.functools_lru_cache 1.6.4 pyhd8ed1ab_0 conda-forge
beautifulsoup4 4.11.1 pyha770c72_0 conda-forge
blas 1.1 openblas conda-forge
bleach 5.0.1 pyhd8ed1ab_0 conda-forge
blosc 1.21.1 h83bc5f7_3 conda-forge
bokeh 2.4.3 pyhd8ed1ab_3 conda-forge
boost 1.78.0 py39h7c9e3ff_4 conda-forge
boost-cpp 1.78.0 h75c5d50_1 conda-forge
botocore 1.27.59 pyhd8ed1ab_0 conda-forge
branca 0.6.0 pyhd8ed1ab_0 conda-forge
brotli 1.0.9 h166bdaf_8 conda-forge
brotli-bin 1.0.9 h166bdaf_8 conda-forge
brotlipy 0.7.0 py39hb9d737c_1005 conda-forge
brunsli 0.1 h9c3ff4c_0 conda-forge
bzip2 1.0.8 h7f98852_4 conda-forge
c-ares 1.18.1 h7f98852_0 conda-forge
c-blosc2 2.5.0 h7a311fb_0 conda-forge
ca-certificates 2022.9.24 ha878542_0 conda-forge
cachetools 5.2.0 pyhd8ed1ab_0 conda-forge
cairo 1.16.0 ha61ee94_1014 conda-forge
certifi 2022.9.24 pyhd8ed1ab_0 conda-forge
cffi 1.15.1 py39he91dace_2 conda-forge
cfitsio 4.2.0 hd9d235c_0 conda-forge
charls 2.3.4 h9c3ff4c_0 conda-forge
charset-normalizer 2.1.1 pyhd8ed1ab_0 conda-forge
click 8.1.3 unix_pyhd8ed1ab_2 conda-forge
click-plugins 1.1.1 py_0 conda-forge
cligj 0.7.2 pyhd8ed1ab_1 conda-forge
cloudpickle 2.2.0 pyhd8ed1ab_0 conda-forge
colorama 0.4.6 pyhd8ed1ab_0 conda-forge
colorcet 3.0.1 pyhd8ed1ab_0 conda-forge
comm 0.1.1 pyhd8ed1ab_0 conda-forge
contourpy 1.0.6 py39hf939315_0 conda-forge
cryptography 38.0.4 py39hd97740a_0 conda-forge
cubinlinker 0.2.0 py39h11215e4_1 rapidsai
cucim 22.12.00a221130 cuda_11_py39_g8b3df72_27 rapidsai-nightly
cuda-python 11.7.1 py39h1eff087_1 conda-forge
cudatoolkit 11.5.1 h59c8dcf_10 conda-forge
cudf 22.12.00a221129 cuda_11_py39_geb271044c2_307 rapidsai-nightly
cudf_kafka 22.12.00a221129 py39_geb271044c2_307 rapidsai-nightly
cugraph 22.12.00a221130 cuda11_py39_g60f5e7bc_145 rapidsai-nightly
cuml 22.12.00a221129 cuda11_py39_g07f0bc4ba_49 rapidsai-nightly
cupy 11.3.0 py39hc3c280e_1 conda-forge
curl 7.86.0 h7bff187_1 conda-forge
cusignal 22.12.00a221130 py38_gdd26cc1_14 rapidsai-nightly
cuspatial 22.12.00a221130 py39_g924b570_76 rapidsai-nightly
custreamz 22.12.00a221129 py39_geb271044c2_307 rapidsai-nightly
cuxfilter 22.12.00a221129 py39_g8726496_8 rapidsai-nightly
cycler 0.11.0 pyhd8ed1ab_0 conda-forge
cyrus-sasl 2.1.27 h230043b_5 conda-forge
cython 0.29.32 py39h5a03fae_1 conda-forge
cytoolz 0.12.0 py39hb9d737c_1 conda-forge
dask 2022.11.2a221129 py_gc23ee621_16 dask/label/dev
dask-core 2022.11.2a221125 py_g3ac3b8d6e_9 dask/label/dev
dask-cuda 22.12.00a221130 py39_g55375b8_33 rapidsai-nightly
dask-cudf 22.12.00a221129 cuda_11_py39_geb271044c2_307 rapidsai-nightly
dask-glm 0.2.0 py_1 conda-forge
dask-labextension 6.0.0 pyhd8ed1ab_0 conda-forge
dask-ml 1.9.0 pyhd8ed1ab_0 conda-forge
datashader 0.13.1a py_0 rapidsai
datashape 0.5.4 py_1 conda-forge
dav1d 1.0.0 h166bdaf_1 conda-forge
debugpy 1.6.4 py39h5a03fae_0 conda-forge
decorator 5.1.1 pyhd8ed1ab_0 conda-forge
defusedxml 0.7.1 pyhd8ed1ab_0 conda-forge
distributed 2022.11.2a221129 py_gc23ee621_16 dask/label/dev
dlpack 0.5 h9c3ff4c_0 conda-forge
entrypoints 0.4 pyhd8ed1ab_0 conda-forge
exceptiongroup 1.0.4 pyhd8ed1ab_0 conda-forge
expat 2.5.0 h27087fc_0 conda-forge
faiss-proc 1.0.0 cuda rapidsai
fastavro 1.7.0 py39hb9d737c_0 conda-forge
fastrlock 0.8 py39h5a03fae_3 conda-forge
feather-format 0.4.1 pypi_0 pypi
filterpy 1.4.5 py_1 conda-forge
fiona 1.8.22 py39h80939cc_2 conda-forge
flit-core 3.8.0 pyhd8ed1ab_0 conda-forge
folium 0.13.0 pyhd8ed1ab_0 conda-forge
font-ttf-dejavu-sans-mono 2.37 hab24e00_0 conda-forge
font-ttf-inconsolata 3.000 h77eed37_0 conda-forge
font-ttf-source-code-pro 2.038 h77eed37_0 conda-forge
font-ttf-ubuntu 0.83 hab24e00_0 conda-forge
fontconfig 2.14.1 hc2a2eb6_0 conda-forge
fonts-conda-ecosystem 1 0 conda-forge
fonts-conda-forge 1 0 conda-forge
fonttools 4.38.0 py39hb9d737c_1 conda-forge
freetype 2.12.1 hca18f0e_1 conda-forge
freexl 1.0.6 h166bdaf_1 conda-forge
frozenlist 1.3.3 py39hb9d737c_0 conda-forge
fsspec 2022.11.0 pyhd8ed1ab_0 conda-forge
gdal 3.5.3 py39h92c1d47_5 conda-forge
gdk-pixbuf 2.42.8 hff1cb4f_1 conda-forge
geopandas 0.12.1 pyhd8ed1ab_1 conda-forge
geopandas-base 0.12.1 pyha770c72_1 conda-forge
geos 3.11.1 h27087fc_0 conda-forge
geotiff 1.7.1 ha76d385_4 conda-forge
gettext 0.21.1 h27087fc_0 conda-forge
gflags 2.2.2 he1b5a44_1004 conda-forge
giflib 5.2.1 h36c2ea0_2 conda-forge
git 2.38.1 pl5321h5e804b7_1 conda-forge
glib 2.74.1 h6239696_1 conda-forge
glib-tools 2.74.1 h6239696_1 conda-forge
glog 0.6.0 h6f12383_0 conda-forge
grpc-cpp 1.47.1 hbad87ad_6 conda-forge
hdf4 4.2.15 h9772cbc_5 conda-forge
hdf5 1.12.2 nompi_h2386368_100 conda-forge
heapdict 1.0.1 py_0 conda-forge
holoviews 1.14.6 pyhd8ed1ab_0 conda-forge
icu 70.1 h27087fc_0 conda-forge
idna 3.4 pyhd8ed1ab_0 conda-forge
imagecodecs 2022.9.26 py39hf32c164_4 conda-forge
imageio 2.22.4 pyhfa7a67d_1 conda-forge
importlib-metadata 5.1.0 pyha770c72_0 conda-forge
importlib_resources 5.10.0 pyhd8ed1ab_0 conda-forge
iniconfig 1.1.1 pyh9f0ad1d_0 conda-forge
ipykernel 6.18.2 pyh210e3f2_0 conda-forge
ipython 7.31.1 py39hf3d152e_0 conda-forge
ipython_genutils 0.2.0 py_1 conda-forge
ipywidgets 8.0.2 pyhd8ed1ab_1 conda-forge
jbig 2.1 h7f98852_2003 conda-forge
jedi 0.18.2 pyhd8ed1ab_0 conda-forge
jinja2 3.1.2 pyhd8ed1ab_1 conda-forge
jmespath 1.0.1 pyhd8ed1ab_0 conda-forge
joblib 1.2.0 pyhd8ed1ab_0 conda-forge
jpeg 9e h166bdaf_2 conda-forge
json-c 0.16 hc379101_0 conda-forge
json5 0.9.5 pyh9f0ad1d_0 conda-forge
jsonschema 4.17.1 pyhd8ed1ab_0 conda-forge
jupyter-packaging 0.7.12 pyhd8ed1ab_0 conda-forge
jupyter-server-proxy 3.2.2 pyhd8ed1ab_0 conda-forge
jupyter_client 7.3.4 pyhd8ed1ab_0 conda-forge
jupyter_core 5.0.0 py39hf3d152e_0 conda-forge
jupyter_server 1.23.3 pyhd8ed1ab_0 conda-forge
jupyterlab 3.5.0 pyhd8ed1ab_0 conda-forge
jupyterlab-favorites 3.1.0 pyhd8ed1ab_0 conda-forge
jupyterlab-nvdashboard 0.7.0 py_0 rapidsai
jupyterlab_pygments 0.2.2 pyhd8ed1ab_0 conda-forge
jupyterlab_server 2.16.3 pyhd8ed1ab_0 conda-forge
jupyterlab_widgets 3.0.3 pyhd8ed1ab_0 conda-forge
jxrlib 1.1 h7f98852_2 conda-forge
kealib 1.5.0 ha7026e8_0 conda-forge
keyutils 1.6.1 h166bdaf_0 conda-forge
kiwisolver 1.4.4 py39hf939315_1 conda-forge
krb5 1.19.3 h3790be6_0 conda-forge
lcms2 2.14 h6ed2654_0 conda-forge
ld_impl_linux-64 2.39 hcc3a1bd_1 conda-forge
lerc 4.0.0 h27087fc_0 conda-forge
libabseil 20220623.0 cxx17_h48a1fff_5 conda-forge
libaec 1.0.6 h9c3ff4c_0 conda-forge
libavif 0.11.1 h5cdd6b5_0 conda-forge
libblas 3.9.0 16_linux64_openblas conda-forge
libbrotlicommon 1.0.9 h166bdaf_8 conda-forge
libbrotlidec 1.0.9 h166bdaf_8 conda-forge
libbrotlienc 1.0.9 h166bdaf_8 conda-forge
libcblas 3.9.0 16_linux64_openblas conda-forge
libcrc32c 1.1.2 h9c3ff4c_0 conda-forge
libcucim 22.12.00a221130 cuda11_g8b3df72_27 rapidsai-nightly
libcudf 22.12.00a221129 cuda11_geb271044c2_307 rapidsai-nightly
libcudf_kafka 22.12.00a221129 geb271044c2_307 rapidsai-nightly
libcugraph 22.12.00a221130 cuda11_g60f5e7bc_145 rapidsai-nightly
libcugraph_etl 22.12.00a221130 cuda11_g60f5e7bc_145 rapidsai-nightly
libcugraphops 22.12.00a221129 cuda11_gdc2e2cf_29 rapidsai-nightly
libcuml 22.12.00a221129 cuda11_g6468f0ac2_50 rapidsai-nightly
libcumlprims 22.12.00a221010 cuda11_geaadb5e_2 rapidsai-nightly
libcurl 7.86.0 h7bff187_1 conda-forge
libcusolver 11.4.1.48 0 nvidia
libcusparse 11.7.5.86 0 nvidia
libcuspatial 22.12.00a221130 cuda11_g924b570_76 rapidsai-nightly
libdap4 3.20.6 hd7c4107_2 conda-forge
libdeflate 1.14 h166bdaf_0 conda-forge
libedit 3.1.20191231 he28a2e2_2 conda-forge
libev 4.33 h516909a_1 conda-forge
libevent 2.1.10 h9b69904_4 conda-forge
libfaiss 1.7.0 cuda112h5bea7ad_8_cuda conda-forge
libffi 3.4.2 h7f98852_5 conda-forge
libgcc-ng 12.2.0 h65d4601_19 conda-forge
libgcrypt 1.10.1 h166bdaf_0 conda-forge
libgdal 3.5.3 h669b3df_5 conda-forge
libgfortran-ng 12.2.0 h69a702a_19 conda-forge
libgfortran5 12.2.0 h337968e_19 conda-forge
libglib 2.74.1 h606061b_1 conda-forge
libgomp 12.2.0 h65d4601_19 conda-forge
libgoogle-cloud 2.1.0 h9ebe8e8_2 conda-forge
libgpg-error 1.45 hc0c96e0_0 conda-forge
libgsasl 1.10.0 h5b4c23d_0 conda-forge
libiconv 1.17 h166bdaf_0 conda-forge
libkml 1.3.0 h37653c0_1015 conda-forge
liblapack 3.9.0 16_linux64_openblas conda-forge
libllvm11 11.1.0 he0ac6c6_5 conda-forge
libnetcdf 4.8.1 nompi_h261ec11_106 conda-forge
libnghttp2 1.47.0 hdcd2b5c_1 conda-forge
libnsl 2.0.0 h7f98852_0 conda-forge
libntlm 1.4 h7f98852_1002 conda-forge
libopenblas 0.3.21 pthreads_h78a6416_3 conda-forge
libpng 1.6.39 h753d276_0 conda-forge
libpq 15.1 hd77ab85_0 conda-forge
libprotobuf 3.20.2 h6239696_0 conda-forge
libraft-distance 22.12.00a221129 cuda11_g11c5105_136 rapidsai-nightly
libraft-headers 22.12.00a221129 cuda11_g11c5105_136 rapidsai-nightly
libraft-nn 22.12.00a221129 cuda11_g11c5105_136 rapidsai-nightly
librdkafka 1.7.0 hc49e61c_1 conda-forge
librmm 22.12.00a221130 cuda11_gda7036aa_57 rapidsai-nightly
librttopo 1.1.0 ha49c73b_12 conda-forge
libsodium 1.0.18 h36c2ea0_1 conda-forge
libspatialindex 1.9.3 h9c3ff4c_4 conda-forge
libspatialite 5.0.1 h7c8129e_22 conda-forge
libsqlite 3.40.0 h753d276_0 conda-forge
libssh2 1.10.0 haa6b8db_3 conda-forge
libstdcxx-ng 12.2.0 h46fd767_19 conda-forge
libthrift 0.16.0 h491838f_2 conda-forge
libtiff 4.4.0 h55922b4_4 conda-forge
libutf8proc 2.8.0 h166bdaf_0 conda-forge
libuuid 2.32.1 h7f98852_1000 conda-forge
libuv 1.44.2 h166bdaf_0 conda-forge
libwebp 1.2.4 h522a892_0 conda-forge
libwebp-base 1.2.4 h166bdaf_0 conda-forge
libxcb 1.13 h7f98852_1004 conda-forge
libxgboost 1.6.2dev.rapidsai22.12 cuda_11_0 rapidsai-nightly
libxml2 2.10.3 h7463322_0 conda-forge
libzip 1.9.2 hc869a4a_1 conda-forge
libzlib 1.2.13 h166bdaf_4 conda-forge
libzopfli 1.0.3 h9c3ff4c_0 conda-forge
llvmlite 0.39.1 py39h7d9a04d_1 conda-forge
locket 1.0.0 pyhd8ed1ab_0 conda-forge
lz4 4.0.2 py39h029007f_0 conda-forge
lz4-c 1.9.3 h9c3ff4c_1 conda-forge
mapclassify 2.4.3 pyhd8ed1ab_0 conda-forge
markdown 3.4.1 pyhd8ed1ab_0 conda-forge
markupsafe 2.1.1 py39hb9d737c_2 conda-forge
matplotlib-base 3.6.2 py39hf9fd14e_0 conda-forge
matplotlib-inline 0.1.6 pyhd8ed1ab_0 conda-forge
mistune 2.0.4 pyhd8ed1ab_0 conda-forge
msgpack-python 1.0.4 py39hf939315_1 conda-forge
multidict 6.0.2 py39hb9d737c_2 conda-forge
multipledispatch 0.6.0 py_0 conda-forge
munch 2.5.0 py_0 conda-forge
munkres 1.1.4 pyh9f0ad1d_0 conda-forge
nbclassic 0.4.8 pyhd8ed1ab_0 conda-forge
nbclient 0.7.0 pyhd8ed1ab_0 conda-forge
nbconvert 7.2.5 pyhd8ed1ab_0 conda-forge
nbconvert-core 7.2.5 pyhd8ed1ab_0 conda-forge
nbconvert-pandoc 7.2.5 pyhd8ed1ab_0 conda-forge
nbformat 5.7.0 pyhd8ed1ab_0 conda-forge
nccl 2.14.3.1 h0800d71_0 conda-forge
ncurses 6.3 h27087fc_1 conda-forge
nest-asyncio 1.5.6 pyhd8ed1ab_0 conda-forge
networkx 2.6.3 pyhd8ed1ab_1 conda-forge
nodejs 18.12.1 h96d913c_0 conda-forge
notebook 6.5.2 pyha770c72_1 conda-forge
notebook-shim 0.2.2 pyhd8ed1ab_0 conda-forge
nspr 4.35 h27087fc_0 conda-forge
nss 3.82 he02c5a1_0 conda-forge
numba 0.56.4 py39h61ddf18_0 conda-forge
numpy 1.23.5 py39h3d75532_0 conda-forge
nvtx 0.2.3 py39hb9d737c_2 conda-forge
openblas 0.3.21 pthreads_h320a7e8_3 conda-forge
openjpeg 2.5.0 h7d73246_1 conda-forge
openslide 3.4.1 h71beb9a_5 conda-forge
openssl 1.1.1s h166bdaf_0 conda-forge
orc 1.7.6 h6c59b99_0 conda-forge
packaging 21.3 pyhd8ed1ab_0 conda-forge
pandas 1.5.2 py39h4661b88_0 conda-forge
pandoc 2.19.2 h32600fe_1 conda-forge
pandocfilters 1.5.0 pyhd8ed1ab_0 conda-forge
panel 0.12.7 pyhd8ed1ab_0 conda-forge
param 1.12.2 pyh6c4a22f_0 conda-forge
parquet-cpp 1.5.1 2 conda-forge
parso 0.8.3 pyhd8ed1ab_0 conda-forge
partd 1.3.0 pyhd8ed1ab_0 conda-forge
patsy 0.5.3 pyhd8ed1ab_0 conda-forge
pcre 8.45 h9c3ff4c_0 conda-forge
pcre2 10.40 hc3806b6_0 conda-forge
perl 5.32.1 2_h7f98852_perl5 conda-forge
pexpect 4.8.0 pyh1a96a4e_2 conda-forge
pickleshare 0.7.5 py_1003 conda-forge
pillow 9.2.0 py39hf3a2cdf_3 conda-forge
pip 22.3.1 pyhd8ed1ab_0 conda-forge
pixman 0.40.0 h36c2ea0_0 conda-forge
pkgutil-resolve-name 1.3.10 pyhd8ed1ab_0 conda-forge
platformdirs 2.5.2 pyhd8ed1ab_1 conda-forge
pluggy 1.0.0 pyhd8ed1ab_5 conda-forge
poppler 22.11.0 h92391eb_0 conda-forge
poppler-data 0.4.11 hd8ed1ab_0 conda-forge
postgresql 15.1 hdeef612_0 conda-forge
proj 9.1.0 h93bde94_0 conda-forge
prometheus_client 0.15.0 pyhd8ed1ab_0 conda-forge
prompt-toolkit 3.0.33 pyha770c72_0 conda-forge
protobuf 3.20.2 py39h5a03fae_1 conda-forge
psutil 5.9.4 py39hb9d737c_0 conda-forge
pthread-stubs 0.4 h36c2ea0_1001 conda-forge
ptxcompiler 0.7.0 py39h1eff087_2 conda-forge
ptyprocess 0.7.0 pyhd3deb0d_0 conda-forge
py-xgboost 1.6.2dev.rapidsai22.12 cuda_11_py39_0 rapidsai-nightly
pyarrow 9.0.0 py39hc0775d8_2_cpu conda-forge
pycparser 2.21 pyhd8ed1ab_0 conda-forge
pyct 0.4.6 py_0 conda-forge
pyct-core 0.4.6 py_0 conda-forge
pydeck 0.5.0 pyh9f0ad1d_0 conda-forge
pyee 8.1.0 pyhd8ed1ab_0 conda-forge
pygments 2.13.0 pyhd8ed1ab_0 conda-forge
pylibcugraph 22.12.00a221130 cuda11_py39_g60f5e7bc_145 rapidsai-nightly
pylibraft 22.12.00a221129 cuda11_py39_g11c5105_136 rapidsai-nightly
pynndescent 0.5.8 pyh1a96a4e_0 conda-forge
pynvml 11.4.1 pyhd8ed1ab_0 conda-forge
pyopenssl 22.1.0 pyhd8ed1ab_0 conda-forge
pyparsing 3.0.9 pyhd8ed1ab_0 conda-forge
pyppeteer 1.0.2 pyhd8ed1ab_0 conda-forge
pyproj 3.4.0 py39h14a8356_2 conda-forge
pyrsistent 0.19.2 py39hb9d737c_0 conda-forge
pysocks 1.7.1 pyha2e5f31_6 conda-forge
pytest 7.2.0 pyhd8ed1ab_2 conda-forge
python 3.9.15 h47a2c10_0_cpython conda-forge
python-confluent-kafka 1.7.0 py39h3811e60_2 conda-forge
python-dateutil 2.8.2 pyhd8ed1ab_0 conda-forge
python-fastjsonschema 2.16.2 pyhd8ed1ab_0 conda-forge
python_abi 3.9 3_cp39 conda-forge
pytz 2022.6 pyhd8ed1ab_0 conda-forge
pyviz_comms 2.2.1 pyhd8ed1ab_1 conda-forge
pywavelets 1.3.0 py39h2ae25f5_2 conda-forge
pyyaml 6.0 py39hb9d737c_5 conda-forge
pyzmq 24.0.1 py39headdf64_1 conda-forge
raft-dask 22.12.00a221129 cuda11_py39_g11c5105_136 rapidsai-nightly
rapids 22.12.00a221130 cuda11_py39_g7b4a0ef_51 rapidsai-nightly
rapids-xgboost 22.12.00a221130 cuda11_py39_g7b4a0ef_51 rapidsai-nightly
re2 2022.06.01 h27087fc_1 conda-forge
readline 8.1.2 h0f457ee_0 conda-forge
requests 2.28.1 pyhd8ed1ab_1 conda-forge
rmm 22.12.00a221130 cuda11_py39_gda7036aa_57 rapidsai-nightly
rtree 1.0.1 py39hb102c33_1 conda-forge
s2n 1.0.10 h9b69904_0 conda-forge
s3fs 2022.11.0 pyhd8ed1ab_0 conda-forge
scikit-image 0.19.3 py39h4661b88_2 conda-forge
scikit-learn 0.24.2 py39h7c5d8c9_1 conda-forge
scipy 1.6.0 py39hee8e79c_0 conda-forge
seaborn 0.12.1 hd8ed1ab_0 conda-forge
seaborn-base 0.12.1 pyhd8ed1ab_0 conda-forge
send2trash 1.8.0 pyhd8ed1ab_0 conda-forge
setuptools 60.10.0 py39hf3d152e_0 conda-forge
shapely 1.8.5 py39h76a96b7_2 conda-forge
simpervisor 0.4 pyhd8ed1ab_0 conda-forge
six 1.16.0 pyh6c4a22f_0 conda-forge
snappy 1.1.9 hbd366e4_2 conda-forge
sniffio 1.3.0 pyhd8ed1ab_0 conda-forge
sortedcontainers 2.4.0 pyhd8ed1ab_0 conda-forge
soupsieve 2.3.2.post1 pyhd8ed1ab_0 conda-forge
spdlog 1.8.5 h4bd325d_1 conda-forge
sqlite 3.40.0 h4ff8645_0 conda-forge
statsmodels 0.13.5 py39h2ae25f5_2 conda-forge
streamz 0.6.4 pyh6c4a22f_0 conda-forge
tbb 2021.7.0 h924138e_0 conda-forge
tblib 1.7.0 pyhd8ed1ab_0 conda-forge
terminado 0.17.0 pyh41d4057_0 conda-forge
threadpoolctl 3.1.0 pyh8a188c0_0 conda-forge
tifffile 2022.10.10 pyhd8ed1ab_0 conda-forge
tiledb 2.11.3 h1e4a385_1 conda-forge
tinycss2 1.2.1 pyhd8ed1ab_0 conda-forge
tk 8.6.12 h27826a3_0 conda-forge
tomli 2.0.1 pyhd8ed1ab_0 conda-forge
toolz 0.12.0 pyhd8ed1ab_0 conda-forge
tornado 6.1 py39hb9d737c_3 conda-forge
tqdm 4.64.1 pyhd8ed1ab_0 conda-forge
traitlets 5.5.0 pyhd8ed1ab_0 conda-forge
treelite 3.0.0 py39hc7ff369_1 conda-forge
treelite-runtime 3.0.0 pypi_0 pypi
typing-extensions 4.4.0 hd8ed1ab_0 conda-forge
typing_extensions 4.4.0 pyha770c72_0 conda-forge
tzcode 2022g h166bdaf_0 conda-forge
tzdata 2022f h191b570_0 conda-forge
ucx 1.13.1 h538f049_0 conda-forge
ucx-proc 1.0.0 gpu rapidsai
ucx-py 0.29.00a221129 py39_g707b335_22 rapidsai-nightly
umap-learn 0.5.3 py39hf3d152e_0 conda-forge
unicodedata2 15.0.0 py39hb9d737c_0 conda-forge
urllib3 1.26.13 pyhd8ed1ab_0 conda-forge
wcwidth 0.2.5 pyh9f0ad1d_2 conda-forge
webencodings 0.5.1 py_1 conda-forge
websocket-client 1.4.2 pyhd8ed1ab_0 conda-forge
websockets 10.4 py39hb9d737c_1 conda-forge
wheel 0.38.4 pyhd8ed1ab_0 conda-forge
widgetsnbextension 4.0.3 pyhd8ed1ab_0 conda-forge
wrapt 1.14.1 py39hb9d737c_1 conda-forge
xarray 2022.11.0 pyhd8ed1ab_0 conda-forge
xerces-c 3.2.4 h55805fa_1 conda-forge
xgboost 1.6.2dev.rapidsai22.12 cuda_11_py39_0 rapidsai-nightly
xorg-kbproto 1.0.7 h7f98852_1002 conda-forge
xorg-libice 1.0.10 h7f98852_0 conda-forge
xorg-libsm 1.2.3 hd9c2040_1000 conda-forge
xorg-libx11 1.7.2 h7f98852_0 conda-forge
xorg-libxau 1.0.9 h7f98852_0 conda-forge
xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge
xorg-libxext 1.3.4 h7f98852_1 conda-forge
xorg-libxrender 0.9.10 h7f98852_1003 conda-forge
xorg-renderproto 0.11.1 h7f98852_1002 conda-forge
xorg-xextproto 7.3.0 h7f98852_1002 conda-forge
xorg-xproto 7.0.31 h7f98852_1007 conda-forge
xyzservices 2022.9.0 pyhd8ed1ab_0 conda-forge
xz 5.2.6 h166bdaf_0 conda-forge
yaml 0.2.5 h7f98852_2 conda-forge
yarl 1.8.1 py39hb9d737c_0 conda-forge
zeromq 4.3.4 h9c3ff4c_1 conda-forge
zfp 1.0.0 h27087fc_3 conda-forge
zict 2.2.0 pyhd8ed1ab_0 conda-forge
zipp 3.11.0 pyhd8ed1ab_0 conda-forge
zlib 1.2.13 h166bdaf_4 conda-forge
zlib-ng 2.0.6 h166bdaf_0 conda-forge
zstd 1.5.2 h6239696_4 conda-forge
Hello,
I am trying to build a k-nearest-neighbors graph from 7M vectors of 768 dimensions with 8 A500 GPUs (24 GB of RAM each), but am facing issues implementing it.
Running cuml 22.12.00a+49.g07f0bc4ba, I start from a DataFrame of size 7079327 x 768 and compute a chunk size of 884916. Then I create a cupy array and fit a model, which completes in 1 min 28 s. Finally I compute the graph with kneighbors().
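The chunk size quoted is consistent with splitting the rows evenly across the 8 GPUs; this derivation is my reading of the numbers, not something stated explicitly in the thread:

```python
# Sketch: one row-chunk per GPU, rounded up.
import math

n_rows, n_gpus = 7079327, 8
chunk_size = math.ceil(n_rows / n_gpus)  # 884916
# A dask array would then be built with one chunk per worker, e.g.
# dask.array.from_array(cupy_array, chunks=(chunk_size, -1)).
```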
The GPUs run for a few minutes with high RAM and compute usage, then become completely idle, and the CPUs take over for hours, remaining in a sleep state most of the time (mostly S, occasionally D, rarely R; I killed the job before finding out how many hours this lasts).
I tried various chunk and batch sizes, but it did not really help.
First, is there a better way to extract the graph from the fitted model besides calling kneighbors()? Second, if not, can anyone provide guidance on how to set the chunk and batch sizes, or otherwise set up the calculation, to reduce the final CPU bottleneck? (Single-GPU nearest neighbors is fast enough, but the data is too big for 24 GB.)
The cuml API reference doc did not help solve the problem.