BiomedicalMachineLearning / stLearn

A novel machine learning pipeline to analyse spatial transcriptomics data

TypingError: Failed in nopython mode pipeline (step: nopython frontend) #225

Closed jmzhang1911 closed 1 year ago

jmzhang1911 commented 1 year ago

Hi,

Thank you very much for developing such excellent tools. However, I encountered an error while using them. Could you tell me what could be the reason for this?

Thanks!

Counting celltype-celltype interactions per LR and permutating 1000 times.:   0%|           [ time left: ? ]
---------------------------------------------------------------------------
TypingError                               Traceback (most recent call last)
Cell In[17], line 2
      1 # Running the counting of co-occurence of cell types and LR expression hotspots #
----> 2 st.tl.cci.run_cci(st_SM35, 'cell_type_interface', # Spot cell information either in data.obs or data.uns
      3                   min_spots=3, # Minimum number of spots for LR to be tested.
      4                   spot_mixtures=True, # If True will use the label transfer scores,
      5                                       # so spots can have multiple cell types if score>cell_prop_cutoff
      6                   cell_prop_cutoff=0.2, # Spot considered to have cell type if score>0.2
      7                   sig_spots=True, # Only consider neighbourhoods of spots which had significant LR scores.
      8                   n_perms=1000, # Permutations of cell information to get background, recommend ~100
      9                  )

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn/lib/python3.8/site-packages/stlearn/tools/microenv/cci/analysis.py:695, in run_cci(adata, use_label, spot_mixtures, min_spots, sig_spots, cell_prop_cutoff, p_cutoff, n_perms, n_cpus, verbose)
    692 lr_index = np.where(adata.uns["lr_summary"].index.values == best_lr)[0][0]
    693 sig_bool = adata.obsm[col][:, lr_index] > 0
--> 695 int_matrix = get_interaction_matrix(
    696     cell_data,
    697     neighbourhood_bcs,
    698     neighbourhood_indices,
    699     all_set,
    700     sig_bool,
    701     L_bool,
    702     R_bool,
    703     cell_prop_cutoff,
    704 ).astype(int)
    706 if n_perms > 0:
    707     int_pvals = get_interaction_pvals(
    708         int_matrix,
    709         n_perms,
   (...)
    717         cell_prop_cutoff,
    718     )

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn/lib/python3.8/site-packages/numba/core/dispatcher.py:420, in _DispatcherBase._compile_for_args(self, *args, **kws)
    414         msg = str(e).rstrip() + (
    415             "\n\nThis error may have been caused by the following argument(s):\n%s\n"
    416             % "\n".join("- argument %d: %s" % (i, err)
    417                         for i, err in failed_args))
    418         e.patch_message(msg)
--> 420     error_rewrite(e, 'typing')
    421 except errors.UnsupportedError as e:
    422     # Something unsupported is present in the user code, add help info
    423     error_rewrite(e, 'unsupported_error')

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn/lib/python3.8/site-packages/numba/core/dispatcher.py:361, in _DispatcherBase._compile_for_args.<locals>.error_rewrite(e, issue_type)
    359     raise e
    360 else:
--> 361     raise e.with_traceback(None)

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
non-precise type array(pyobject, 1d, C)
During: typing of argument at /share/nas1/zhangjm/software/miniconda3/envs/stlearn/lib/python3.8/site-packages/stlearn/tools/microenv/cci/het.py (270)

File "../../../../../software/miniconda3/envs/stlearn/lib/python3.8/site-packages/stlearn/tools/microenv/cci/het.py", line 270:
def get_interaction_matrix(
    <source elided>
    # (if bidirectional interaction between two spots, counts as two seperate interactions).
    LR_edges = get_interactions(
    ^
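For context, the `non-precise type array(pyobject, 1d, C)` message means a NumPy array with `dtype=object` reached a Numba-compiled function; nopython mode can only type arrays with concrete dtypes. A minimal illustration of the distinction (not stLearn's code — the variable names here are just for demonstration):

```python
import numpy as np

# An object-dtype array (e.g. string labels mixed with NaN) cannot be
# typed by Numba's nopython frontend, while a uniform string array can.
mixed = np.array(["T cell", np.nan, "B cell"], dtype=object)
clean = np.array(["T cell", "B cell"])
print(mixed.dtype, clean.dtype.kind)  # object U
```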
Package                      Version
---------------------------- ----------
absl-py                      1.3.0
anndata                      0.8.0
asttokens                    2.2.1
astunparse                   1.6.3
backcall                     0.2.0
bokeh                        3.0.3
cachetools                   5.2.1
certifi                      2022.12.7
charset-normalizer           2.1.1
click                        8.1.3
comm                         0.1.2
contourpy                    1.0.6
cycler                       0.11.0
debugpy                      1.6.6
decorator                    5.1.1
executing                    1.2.0
Flask                        2.2.2
Flask-WTF                    1.0.1
flatbuffers                  23.1.4
fonttools                    4.38.0
gast                         0.4.0
google-auth                  2.16.0
google-auth-oauthlib         0.4.6
google-pasta                 0.2.0
grpcio                       1.51.1
h5py                         3.7.0
idna                         3.4
igraph                       0.10.3
imageio                      2.24.0
importlib-metadata           6.0.0
ipykernel                    6.21.3
ipython                      8.11.0
itsdangerous                 2.1.2
jedi                         0.18.2
Jinja2                       3.1.2
joblib                       1.2.0
jupyter_client               8.0.3
jupyter_core                 5.2.0
keras                        2.11.0
kiwisolver                   1.4.4
leidenalg                    0.9.1
libclang                     15.0.6.1
llvmlite                     0.36.0
louvain                      0.8.0
Markdown                     3.4.1
MarkupSafe                   2.1.1
matplotlib                   3.6.2
matplotlib-inline            0.1.6
natsort                      8.2.0
nest-asyncio                 1.5.6
networkx                     3.0
numba                        0.53.0
numpy                        1.21.6
oauthlib                     3.2.2
opt-einsum                   3.3.0
packaging                    23.0
pandas                       1.5.2
parso                        0.8.3
patsy                        0.5.3
pexpect                      4.8.0
pickleshare                  0.7.5
Pillow                       9.4.0
pip                          22.3.1
platformdirs                 3.1.0
prompt-toolkit               3.0.38
protobuf                     3.19.6
psutil                       5.9.4
ptyprocess                   0.7.0
pure-eval                    0.2.2
pyasn1                       0.4.8
pyasn1-modules               0.2.8
Pygments                     2.14.0
pynndescent                  0.5.8
pyparsing                    3.0.9
python-dateutil              2.8.2
pytz                         2022.7
PyWavelets                   1.4.1
PyYAML                       6.0
pyzmq                        25.0.0
requests                     2.28.1
requests-oauthlib            1.3.1
rsa                          4.9
scanpy                       1.9.1
scikit-image                 0.19.3
scikit-learn                 1.2.0
scipy                        1.10.0
seaborn                      0.12.2
session-info                 1.0.0
setuptools                   65.6.3
six                          1.16.0
stack-data                   0.6.2
statsmodels                  0.13.5
stdlib-list                  0.8.0
stlearn                      0.4.11
tensorboard                  2.11.0
tensorboard-data-server      0.6.1
tensorboard-plugin-wit       1.8.1
tensorflow                   2.11.0
tensorflow-estimator         2.11.0
tensorflow-io-gcs-filesystem 0.29.0
termcolor                    2.2.0
texttable                    1.6.7
threadpoolctl                3.1.0
tifffile                     2022.10.10
tornado                      6.2
tqdm                         4.64.1
traitlets                    5.9.0
typing_extensions            4.4.0
umap-learn                   0.5.3
urllib3                      1.26.13
wcwidth                      0.2.6
Werkzeug                     2.2.2
wheel                        0.37.1
wrapt                        1.14.1
WTForms                      3.0.1
xyzservices                  2022.9.0
zipp                         3.11.0
duypham2108 commented 1 year ago

Can you try to update numba?

I just tested it and it works fine.

Package                 Version
----------------------- -----------
absl-py                 1.4.0
aiohttp                 3.8.4
aiosignal               1.3.1
anndata                 0.8.0
anyio                   3.5.0
argon2-cffi             21.3.0
argon2-cffi-bindings    21.2.0
asttokens               2.0.5
astunparse              1.6.3
async-timeout           4.0.2
attrs                   22.2.0
Babel                   2.11.0
backcall                0.2.0
beautifulsoup4          4.11.1
bleach                  4.1.0
blinker                 1.5
bokeh                   3.0.3
brotlipy                0.7.0
cached-property         1.5.2
cachetools              5.3.0
certifi                 2022.12.7
cffi                    1.15.1
charset-normalizer      2.1.1
click                   8.1.3
cloudpickle             2.2.1
colorama                0.4.6
comm                    0.1.2
contourpy               1.0.7
cryptography            38.0.4
cycler                  0.11.0
cytoolz                 0.12.0
dask                    2023.3.0
debugpy                 1.5.1
decorator               5.1.1
defusedxml              0.7.1
entrypoints             0.4
executing               0.8.3
fastjsonschema          2.16.2
flatbuffers             23.1.21
fonttools               4.39.0
frozenlist              1.3.3
fsspec                  2023.3.0
gast                    0.4.0
google-auth             2.16.2
google-auth-oauthlib    0.4.6
google-pasta            0.2.0
grpcio                  1.42.0
h5py                    3.8.0
idna                    3.4
igraph                  0.10.4
imagecodecs             2023.1.23
imageio                 2.26.0
importlib-metadata      6.0.0
importlib-resources     5.12.0
ipykernel               6.19.2
ipython                 8.10.0
ipython-genutils        0.2.0
jedi                    0.18.1
Jinja2                  3.0.3
joblib                  1.2.0
json5                   0.9.6
jsonschema              4.17.3
jupyter_client          7.4.9
jupyter_core            5.2.0
jupyter-server          1.23.4
jupyterlab              3.5.3
jupyterlab-pygments     0.1.2
jupyterlab_server       2.19.0
keras                   2.10.0
Keras-Preprocessing     1.1.2
kiwisolver              1.4.4
leidenalg               0.9.1
llvmlite                0.39.1
locket                  1.0.0
louvain                 0.8.0
Markdown                3.4.1
MarkupSafe              2.1.2
matplotlib              3.7.1
matplotlib-inline       0.1.6
mistune                 0.8.4
multidict               6.0.4
munkres                 1.1.4
natsort                 8.3.1
nbclassic               0.5.2
nbclient                0.5.13
nbconvert               6.4.4
nbformat                5.7.0
nest-asyncio            1.5.6
networkx                3.0
notebook                6.5.2
notebook_shim           0.2.2
numba                   0.56.4
numpy                   1.21.6
oauthlib                3.2.2
opt-einsum              3.3.0
packaging               23.0
pandas                  1.5.3
pandocfilters           1.5.0
parso                   0.8.3
partd                   1.3.0
patsy                   0.5.3
pickleshare             0.7.5
Pillow                  9.4.0
pip                     23.0.1
pkgutil_resolve_name    1.3.10
platformdirs            3.1.0
pooch                   1.7.0
prometheus-client       0.14.1
prompt-toolkit          3.0.36
protobuf                3.20.2
psutil                  5.9.0
pure-eval               0.2.2
pyasn1                  0.4.8
pyasn1-modules          0.2.7
pycparser               2.21
Pygments                2.11.2
PyJWT                   2.6.0
pynndescent             0.5.8
pyOpenSSL               23.0.0
pyparsing               3.0.9
pyrsistent              0.18.0
PySocks                 1.7.1
python-dateutil         2.8.2
pytz                    2022.7.1
pyu2f                   0.1.5
PyWavelets              1.4.1
pywin32                 305.1
pywinpty                2.0.10
PyYAML                  6.0
pyzmq                   23.2.0
requests                2.28.2
requests-oauthlib       1.3.1
rsa                     4.9
scanpy                  1.9.3
scikit-image            0.19.3
scikit-learn            1.2.1
scipy                   1.10.1
seaborn                 0.12.2
Send2Trash              1.8.0
session-info            1.0.0
setuptools              65.6.3
six                     1.16.0
sniffio                 1.2.0
soupsieve               2.3.2.post1
stack-data              0.2.0
statsmodels             0.13.5
stdlib-list             0.8.0
stlearn                 0.4.11
tensorboard             2.10.0
tensorboard-data-server 0.6.1
tensorboard-plugin-wit  1.8.1
tensorflow              2.10.0
tensorflow-estimator    2.10.0
termcolor               2.2.0
terminado               0.17.1
testpath                0.6.0
texttable               1.6.7
threadpoolctl           3.1.0
tifffile                2023.2.28
tomli                   2.0.1
toolz                   0.12.0
tornado                 6.2
tqdm                    4.65.0
traitlets               5.7.1
typing_extensions       4.4.0
umap-learn              0.5.3
unicodedata2            15.0.0
urllib3                 1.26.14
wcwidth                 0.2.5
webencodings            0.5.1
websocket-client        0.58.0
Werkzeug                2.2.3
wheel                   0.38.4
win-inet-pton           1.1.0
wincertstore            0.2
wrapt                   1.15.0
xlrd                    1.2.0
xyzservices             2023.2.0
yarl                    1.8.2
zipp                    3.15.0
jmzhang1911 commented 1 year ago

I upgraded numba to version 0.56.4, but I am still getting the same error message.

duypham2108 commented 1 year ago

Normally this is related to a numba or llvmlite issue. You can try upgrading llvmlite too.
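For reference, the upgrade in the issue's conda environment might look like this (a sketch, not a verified fix for this error; numba and llvmlite must be upgraded together, since each numba release requires a matching llvmlite — e.g. numba 0.56.x pairs with llvmlite 0.39.x):

```shell
# Inside the stlearn conda env: upgrade numba and its matching llvmlite,
# then confirm the versions that Python actually imports.
pip install --upgrade "numba==0.56.4" "llvmlite==0.39.1"
python -c "import numba, llvmlite; print(numba.__version__, llvmlite.__version__)"
```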

jmzhang1911 commented 1 year ago

Yes, I am using llvmlite==0.39.1, but still getting the same error.

duypham2108 commented 1 year ago

Can you reinstall stLearn in a fresh conda env using conda-forge? I just uploaded a new version, 0.4.12, a few minutes ago.

jmzhang1911 commented 1 year ago

Ok, thank you so much!

jmzhang1911 commented 1 year ago

> Can you reinstall stLearn in a fresh conda env using conda-forge? I just uploaded a new version, 0.4.12, a few minutes ago.

I created a new environment using conda and installed the latest stLearn with pip, but I still got the same error.

In [6]: st.__version__
Out[6]: '0.4.12'

In [10]: numba.__version__
Out[10]: '0.56.4'

In [11]: llvmlite.__version__
Out[11]: '0.39.1'

In [9]: sys.path
Out[9]:
['/share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/bin',
 '/share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python38.zip',
 '/share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8',
 '/share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/lib-dynload',
 '',
 '/share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages']
jmzhang1911 commented 1 year ago

Oh sorry, I received another error message.


In [9]: st.tl.cci.run_cci(st_SM35, 'cell_type', # Spot cell information either in data.obs or data.uns
...:                   min_spots=3, # Minimum number of spots for LR to be tested.
...:                   spot_mixtures=True, # If True will use the label transfer scores,
...:                                       # so spots can have multiple cell types if score>cell_prop_cutoff
...:                   cell_prop_cutoff=0.2, # Spot considered to have cell type if score>0.2
...:                   sig_spots=True, # Only consider neighbourhoods of spots which had significant LR scores.
...:                   n_perms=100 # Permutations of cell information to get background, recommend ~1000
...:                  )
Warning: specified spot_mixtures but no deconvolution data in adata.uns['cell_type'].
Falling back to discrete mode.
Getting cached neighbourhood information...
<class 'numpy.ndarray'>
<class 'numpy.ndarray'>
Getting information for CCI counting...
<class 'numpy.ndarray'>
<class 'numpy.ndarray'>
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[9], line 1
----> 1 st.tl.cci.run_cci(st_SM35, 'cell_type', # Spot cell information either in data.obs or data.uns
2                   min_spots=3, # Minimum number of spots for LR to be tested.
3                   spot_mixtures=True, # If True will use the label transfer scores,
4                                       # so spots can have multiple cell types if score>cell_prop_cutoff
5                   cell_prop_cutoff=0.2, # Spot considered to have cell type if score>0.2
6                   sig_spots=True, # Only consider neighbourhoods of spots which had significant LR scores.
7                   n_perms=100 # Permutations of cell information to get background, recommend ~1000
8                  )

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages/stlearn/tools/microenv/cci/analysis.py:642, in run_cci(adata, use_label, spot_mixtures, min_spots, sig_spots, cell_prop_cutoff, p_cutoff, n_perms, verbose)
    638     msg = msg + "Rows do not correspond to adata.obs_names.\n"
    639     raise Exception(msg)
    641 #### Checking for case where have cell types that are never dominant
--> 642 #### in a spot, so need to include these in all_set
    643 if len(all_set) < adata.uns[uns_key].shape[1]:
    644     all_set = adata.uns[uns_key].columns.values.astype(str)

ValueError: too many values to unpack (expected 2)

duypham2108 commented 1 year ago

Have you tried to input the deconvolution result to data.uns['cell_type']?

Please check this part: https://stlearn.readthedocs.io/en/latest/tutorials/stLearn-CCI.html#Data-Loading-&-Preprocessing

jmzhang1911 commented 1 year ago

I didn't have deconvolution results, so I used "cell_type" in st_SM35.obs, but it didn't work.

# Running the counting of co-occurence of cell types and LR expression hotspots #
st.tl.cci.run_cci(st_SM35, 'cell_type', # Spot cell information either in data.obs or data.uns
                  min_spots=3, # Minimum number of spots for LR to be tested.
                  spot_mixtures=False, # If True will use the label transfer scores,
                                      # so spots can have multiple cell types if score>cell_prop_cutoff
                  cell_prop_cutoff=0.2, # Spot considered to have cell type if score>0.2
                  sig_spots=True, # Only consider neighbourhoods of spots which had significant LR scores.
                  n_perms=1000 # Permutations of cell information to get background, recommend ~100
                 )

Getting cached neighbourhood information...
Getting information for CCI counting...
Counting celltype-celltype interactions per LR and permutating 1000 times.:   0%|           [ time left: ? ]
---------------------------------------------------------------------------
TypingError                               Traceback (most recent call last)
Cell In[15], line 2
      1 # Running the counting of co-occurence of cell types and LR expression hotspots #
----> 2 st.tl.cci.run_cci(st_SM35, 'cell_type', # Spot cell information either in data.obs or data.uns
      3                   min_spots=3, # Minimum number of spots for LR to be tested.
      4                   spot_mixtures=False, # If True will use the label transfer scores,
      5                                       # so spots can have multiple cell types if score>cell_prop_cutoff
      6                   cell_prop_cutoff=0.2, # Spot considered to have cell type if score>0.2
      7                   sig_spots=True, # Only consider neighbourhoods of spots which had significant LR scores.
      8                   n_perms=1000 # Permutations of cell information to get background, recommend ~100
      9                  )

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages/stlearn/tools/microenv/cci/analysis.py:695, in run_cci(adata, use_label, spot_mixtures, min_spots, sig_spots, cell_prop_cutoff, p_cutoff, n_perms, n_cpus, verbose)
    692 lr_index = np.where(adata.uns["lr_summary"].index.values == best_lr)[0][0]
    693 sig_bool = adata.obsm[col][:, lr_index] > 0
--> 695 int_matrix = get_interaction_matrix(
    696     cell_data,
    697     neighbourhood_bcs,
    698     neighbourhood_indices,
    699     all_set,
    700     sig_bool,
    701     L_bool,
    702     R_bool,
    703     cell_prop_cutoff,
    704 ).astype(int)
    706 if n_perms > 0:
    707     int_pvals = get_interaction_pvals(
    708         int_matrix,
    709         n_perms,
   (...)
    717         cell_prop_cutoff,
    718     )

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages/numba/core/dispatcher.py:468, in _DispatcherBase._compile_for_args(self, *args, **kws)
    464         msg = (f"{str(e).rstrip()} \n\nThis error may have been caused "
    465                f"by the following argument(s):\n{args_str}\n")
    466         e.patch_message(msg)
--> 468     error_rewrite(e, 'typing')
    469 except errors.UnsupportedError as e:
    470     # Something unsupported is present in the user code, add help info
    471     error_rewrite(e, 'unsupported_error')

File /share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages/numba/core/dispatcher.py:409, in _DispatcherBase._compile_for_args.<locals>.error_rewrite(e, issue_type)
    407     raise e
    408 else:
--> 409     raise e.with_traceback(None)

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
non-precise type array(pyobject, 1d, C)
During: typing of argument at /share/nas1/zhangjm/software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages/stlearn/tools/microenv/cci/het.py (270)

File "../../../../../software/miniconda3/envs/stlearn_v2/lib/python3.8/site-packages/stlearn/tools/microenv/cci/het.py", line 270:
def get_interaction_matrix(
    <source elided>
    # (if bidirectional interaction between two spots, counts as two seperate interactions).
    LR_edges = get_interactions(
    ^
duypham2108 commented 1 year ago

Sorry, I haven't tested this case yet and need to check it. For now, can you please try inputting a fake deconvolution result matrix, with 1 where a spot has that label and 0 for the rest?
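A minimal sketch of building such a one-hot matrix from discrete labels (the `labels`/`dummy` names and the spot barcodes here are illustrative, not stLearn API; with a real AnnData object the labels would come from `adata.obs['cell_type']`):

```python
import pandas as pd

# One-hot "deconvolution" matrix: 1 where a spot carries the label,
# 0 everywhere else (spot barcodes as the index, cell types as columns).
labels = pd.Series(["T cell", "B cell", "T cell"],
                   index=["spot1", "spot2", "spot3"], name="cell_type")
dummy = pd.get_dummies(labels).astype(float)
# With a real AnnData object this would then be stored as:
# adata.uns["cell_type"] = dummy
print(dummy.loc["spot1", "T cell"], dummy.loc["spot2", "T cell"])  # 1.0 0.0
```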

jmzhang1911 commented 1 year ago

I added a deconvolution matrix, but it's still throwing an error.

duypham2108 commented 1 year ago

It's difficult for me to debug it. I have some questions:

jmzhang1911 commented 1 year ago

I found that the error occurred because the clustering data contained NaN values. After removing them and re-running the analysis, I obtained the final result. Thank you for developing such an excellent analysis tool, and thank you for your patient assistance. I am very grateful.
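For anyone hitting the same TypingError: NaN entries in a label column leave the underlying array with `object` dtype, which Numba's nopython mode cannot type. A minimal sketch of the fix (the `labels`/`keep` names are illustrative; with a real AnnData object you would subset `adata` itself before calling `st.tl.cci.run_cci`):

```python
import numpy as np
import pandas as pd

# Labels with a NaN, as produced by e.g. an incomplete annotation step.
labels = pd.Series(["T cell", np.nan, "B cell", "T cell"],
                   index=["s1", "s2", "s3", "s4"])

# Drop observations with missing labels before running the CCI analysis:
keep = labels.notna().values
# adata = adata[keep, :].copy()   # with a real AnnData object
filtered = labels[keep]
print(len(filtered))  # 3 spots retained
```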