Closed: nickeener closed this pull request 2 years ago.
Currently passes the flake8 and mypy tests but the fast-test produces an error that I'm not sure how to address:
```
pytest -v -n 8 --cov starfish -m 'not (slow or napari)'
ERROR: usage: pytest [options] [file_or_dir] [file_or_dir] [...]
pytest: error: unrecognized arguments: -n --cov starfish
  inifile: None
  rootdir: /home/nick/projects/spacetx/decoder/starfish
```
There is also an error running the `make -C docs html` command that appears to be a versioning problem with Sphinx:
```
Running Sphinx v3.5.3
/home/nick/projects/spacetx/decoder/starfish/docs/source/conf.py:160: RemovedInSphinx40Warning: The app.add_stylesheet() is deprecated. Please use app.add_css_file() instead.
  app.add_stylesheet("my-styles.css")
generating gallery...

Configuration error:
Unknown key(s) in sphinx_gallery_conf:
'download_section_examples', did you mean 'download_all_examples'?
make[1]: *** [Makefile:25: html] Error 2
make[1]: Leaving directory '/home/nick/projects/spacetx/decoder/starfish/docs'
make: *** [Makefile:72: docs-html] Error 2
```
I could use some help addressing these problems so that we can get these new features added to starFISH. Thanks!
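For what it's worth, the Sphinx output above seems to point at two small config fixes. A sketch against `docs/source/conf.py` (not the actual repository file; the `setup()` wrapper is assumed from the warning's traceback line):

```python
# docs/source/conf.py (sketch): the two fixes the Sphinx output suggests.

def setup(app):
    # add_stylesheet() is deprecated (removed in Sphinx 4.0);
    # add_css_file() is the replacement available since Sphinx 1.8.
    app.add_css_file("my-styles.css")

sphinx_gallery_conf = {
    # 'download_section_examples' is not a key sphinx-gallery 0.10 knows;
    # 'download_all_examples' is the supported option the error hints at.
    "download_all_examples": False,
}
```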
@nickeener this is likely to be related to the antique python used in travis CI right now. can you give a pip freeze for your current environment where you're running into these problems?
also, I'm excited to see some new decoding algorithms- thanks for contributing!
@berl here's the output from pip freeze:
absl-py==0.14.1 aiohttp==3.7.4.post0 aiohttp-cors==0.7.0 aioredis==1.3.1 alabaster==0.7.12 anaconda-client @ file:///home/conda/feedstock_root/build_artifacts/anaconda-client_1619451397123/work anaconda-navigator==2.0.1 anaconda-project @ file:///home/conda/feedstock_root/build_artifacts/anaconda-project_1610202457240/work anndata==0.7.5 annoy==1.17.0 anyio @ file:///home/conda/feedstock_root/build_artifacts/anyio_1614388751160/work/dist appdirs @ file:///home/conda/feedstock_root/build_artifacts/appdirs_1603108395799/work argh @ file:///home/conda/feedstock_root/build_artifacts/argh_1595627874344/work argon2-cffi @ file:///home/conda/feedstock_root/build_artifacts/argon2-cffi_1610522574055/work arrow @ file:///home/conda/feedstock_root/build_artifacts/arrow_1619571844960/work asn1crypto @ file:///home/conda/feedstock_root/build_artifacts/asn1crypto_1595949944546/work astroid @ file:///home/conda/feedstock_root/build_artifacts/astroid_1619355087159/work astropy @ file:///home/conda/feedstock_root/build_artifacts/astropy_1617305418693/work astunparse==1.6.3 async-generator==1.10 async-timeout==3.0.1 atomicwrites @ file:///home/conda/feedstock_root/build_artifacts/atomicwrites_1588182545583/work attrs @ file:///home/conda/feedstock_root/build_artifacts/attrs_1605083924122/work autopep8 @ file:///home/conda/feedstock_root/build_artifacts/autopep8_1615918605177/work Babel @ file:///home/conda/feedstock_root/build_artifacts/babel_1619719576210/work backcall @ file:///home/conda/feedstock_root/build_artifacts/backcall_1592338393461/work backports.functools-lru-cache @ file:///home/conda/feedstock_root/build_artifacts/backports.functools_lru_cache_1618230623929/work backports.shutil-get-terminal-size==1.0.0 beautifulsoup4 @ file:///home/conda/feedstock_root/build_artifacts/beautifulsoup4_1601745390275/work binaryornot==0.4.4 biopython==1.78 bioservices==1.7.11 biothings-client==0.2.6 bitarray @ file:///home/conda/feedstock_root/build_artifacts/bitarray_1619012700560/work 
bkcharts==0.2 black @ file:///home/conda/feedstock_root/build_artifacts/black-recipe_1619703601174/work bleach @ file:///home/conda/feedstock_root/build_artifacts/bleach_1612213472466/work blessings==1.7 bokeh @ file:///home/conda/feedstock_root/build_artifacts/bokeh_1617882739326/work boto==2.49.0 boto3==1.17.41 botocore==1.20.41 Bottleneck @ file:///home/conda/feedstock_root/build_artifacts/bottleneck_1611195606760/work brotlipy==0.7.0 cached-property @ file:///home/conda/feedstock_root/build_artifacts/cached_property_1615209429212/work cachetools==4.2.2 cachey==0.2.1 certifi==2021.5.30 cffi @ file:///home/conda/feedstock_root/build_artifacts/cffi_1613413861439/work chardet @ file:///home/conda/feedstock_root/build_artifacts/chardet_1610093490430/work clang==5.0 click==7.1.2 cloudpickle @ file:///home/conda/feedstock_root/build_artifacts/cloudpickle_1598400192773/work clyent==1.2.2 colorama @ file:///home/conda/feedstock_root/build_artifacts/colorama_1602866480661/work colorful==0.5.4 colorlog==4.8.0 colormath==3.0.0 conda==4.10.3 conda-build==3.21.4 conda-package-handling @ file:///home/conda/feedstock_root/build_artifacts/conda-package-handling_1618231394280/work conda-repo-cli @ file:///tmp/build/80754af9/conda-repo-cli_1611347565866/work conda-token @ file:///tmp/build/80754af9/conda-token_1613597706833/work conda-verify @ file:///home/conda/feedstock_root/build_artifacts/conda-verify_1610223823319/work contextlib2==0.6.0.post1 cookiecutter==1.7.2 cryptography @ file:///home/conda/feedstock_root/build_artifacts/cryptography_1616851476134/work cycler==0.10.0 Cython @ file:///home/conda/feedstock_root/build_artifacts/cython_1618445283157/work cytoolz==0.11.0 dask @ file:///home/conda/feedstock_root/build_artifacts/dask-core_1619215945610/work dataclasses==0.6 decorator==4.4.2 defusedxml @ file:///home/conda/feedstock_root/build_artifacts/defusedxml_1615232257335/work deprecation @ file:///home/conda/feedstock_root/build_artifacts/deprecation_1589881437857/work 
desc==2.1.1 diff-match-patch @ file:///home/conda/feedstock_root/build_artifacts/diff-match-patch_1594679019945/work diskcache==5.2.1 distributed @ file:///home/conda/feedstock_root/build_artifacts/distributed_1619218690584/work docstring-parser==0.7.3 docutils @ file:///home/conda/feedstock_root/build_artifacts/docutils_1618676255808/work easydev==0.11.0 entrypoints @ file:///home/conda/feedstock_root/build_artifacts/entrypoints_1605121927639/work/dist/entrypoints-0.3-py2.py3-none-any.whl et-xmlfile==1.0.1 faiss==1.7.1 fastcache @ file:///home/conda/feedstock_root/build_artifacts/fastcache_1610325862128/work fastdist==1.1.3 fbpca==1.0 filelock @ file:///home/conda/feedstock_root/build_artifacts/filelock_1589994591731/work flake8 @ file:///home/conda/feedstock_root/build_artifacts/flake8_1601874335748/work Flask==1.1.2 flatbuffers==1.12 freetype-py==2.2.0 fsspec @ file:///home/conda/feedstock_root/build_artifacts/fsspec_1618779244974/work future @ file:///home/conda/feedstock_root/build_artifacts/future_1610147327521/work gast==0.4.0 geosketch==1.2 get_version==2.1 gevent @ file:///home/conda/feedstock_root/build_artifacts/gevent_1611197824320/work glob2==0.7 gmpy2==2.1.0b1 google==3.0.0 google-api-core==2.0.1 google-auth==1.35.0 google-auth-oauthlib==0.4.6 google-pasta==0.2.0 googleapis-common-protos==1.53.0 gpustat==0.6.0 greenlet @ file:///home/conda/feedstock_root/build_artifacts/greenlet_1615997404757/work grequests==0.6.0 grpcio==1.41.0 gseapy==0.10.4 h5py==3.1.0 harmonypy==0.0.5 HeapDict==1.0.1 hiredis==2.0.0 html5lib @ file:///home/conda/feedstock_root/build_artifacts/html5lib_1592930327044/work idna @ file:///home/conda/feedstock_root/build_artifacts/idna_1593328102638/work imagecodecs @ file:///home/conda/feedstock_root/build_artifacts/imagecodecs_1617408476261/work imageio @ file:///home/conda/feedstock_root/build_artifacts/imageio_1594044661732/work imagesize==1.2.0 importlib-metadata @ 
file:///home/conda/feedstock_root/build_artifacts/importlib-metadata_1619012197204/work inflection @ file:///home/conda/feedstock_root/build_artifacts/inflection_1598089801258/work iniconfig @ file:///home/conda/feedstock_root/build_artifacts/iniconfig_1603384189793/work intervaltree==2.1.0 ipykernel @ file:///home/conda/feedstock_root/build_artifacts/ipykernel_1617137062168/work/dist/ipykernel-5.5.3-py3-none-any.whl ipython @ file:///home/conda/feedstock_root/build_artifacts/ipython_1619827157034/work ipython-genutils==0.2.0 ipywidgets @ file:///home/conda/feedstock_root/build_artifacts/ipywidgets_1609995587151/work isort @ file:///home/conda/feedstock_root/build_artifacts/isort_1616314750031/work itsdangerous==1.1.0 jdcal==1.4.1 jedi @ file:///home/conda/feedstock_root/build_artifacts/jedi_1605054524035/work jeepney @ file:///home/conda/feedstock_root/build_artifacts/jeepney_1605811727098/work Jinja2 @ file:///home/conda/feedstock_root/build_artifacts/jinja2_1612119311452/work jinja2-time==0.2.0 jmespath==0.10.0 joblib @ file:///tmp/build/80754af9/joblib_1601912903842/work json-tricks==3.15.5 json5 @ file:///home/conda/feedstock_root/build_artifacts/json5_1600692310011/work jsonschema @ file:///home/conda/feedstock_root/build_artifacts/jsonschema_1614815863336/work jupyter @ file:///home/conda/feedstock_root/build_artifacts/jupyter_1611871900135/work jupyter-client @ file:///home/conda/feedstock_root/build_artifacts/jupyter_client_1615693636836/work jupyter-console @ file:///home/conda/feedstock_root/build_artifacts/jupyter_console_1616560109969/work jupyter-core @ file:///home/conda/feedstock_root/build_artifacts/jupyter_core_1612125275706/work jupyter-packaging @ file:///home/conda/feedstock_root/build_artifacts/jupyter-packaging_1618917196048/work jupyter-server @ file:///home/conda/feedstock_root/build_artifacts/jupyter_server_1619053241770/work jupyterlab @ file:///home/conda/feedstock_root/build_artifacts/jupyterlab_1618241231046/work jupyterlab-pygments @ 
file:///home/conda/feedstock_root/build_artifacts/jupyterlab_pygments_1601375948261/work jupyterlab-server @ file:///home/conda/feedstock_root/build_artifacts/jupyterlab_server_1619526846046/work jupyterlab-widgets @ file:///home/conda/feedstock_root/build_artifacts/jupyterlab_widgets_1609173350931/work keras==2.6.0 Keras-Preprocessing==1.1.2 keyring @ file:///home/conda/feedstock_root/build_artifacts/keyring_1616689217931/work kiwisolver @ file:///home/conda/feedstock_root/build_artifacts/kiwisolver_1610099769230/work lazy-object-proxy @ file:///home/conda/feedstock_root/build_artifacts/lazy-object-proxy_1616506784109/work legacy-api-wrap==1.2 libarchive-c @ file:///home/conda/feedstock_root/build_artifacts/python-libarchive-c_1610327409425/work llvmlite==0.36.0 locket==0.2.0 louvain==0.7.0 lxml @ file:///home/conda/feedstock_root/build_artifacts/lxml_1616532290492/work m2r2==0.3.1 magicgui==0.2.8 Markdown==3.3.4 MarkupSafe @ file:///home/conda/feedstock_root/build_artifacts/markupsafe_1610127565888/work matplotlib @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-suite_1617212793321/work matplotlib-inline @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-inline_1618935594181/work mccabe==0.6.1 mistune @ file:///home/conda/feedstock_root/build_artifacts/mistune_1610112875388/work mkl-fft==1.3.0 mkl-random==1.2.0 mkl-service==2.3.0 mock @ file:///home/conda/feedstock_root/build_artifacts/mock_1610094566888/work more-itertools @ file:///home/conda/feedstock_root/build_artifacts/more-itertools_1619297856442/work mpmath @ file:///home/conda/feedstock_root/build_artifacts/mpmath_1612895720168/work msgpack @ file:///home/conda/feedstock_root/build_artifacts/msgpack-python_1610121699837/work multidict==5.2.0 multipledispatch==0.6.0 multipy==0.16 mygene==3.2.2 mypy-extensions @ file:///home/conda/feedstock_root/build_artifacts/mypy_extensions_1610127166133/work napari==0.4.7 napari-console==0.0.3 napari-plugin-engine==0.1.9 napari-svg==0.1.4 
natsort==7.1.1 navigator-updater==0.2.1 nbclassic @ file:///home/conda/feedstock_root/build_artifacts/nbclassic_1617878621483/work nbclient @ file:///home/conda/feedstock_root/build_artifacts/nbclient_1614336084111/work nbconvert @ file:///home/conda/feedstock_root/build_artifacts/nbconvert_1605401836768/work nbencdec==0.0.10 nbformat @ file:///home/conda/feedstock_root/build_artifacts/nbformat_1617383142101/work nest-asyncio @ file:///home/conda/feedstock_root/build_artifacts/nest-asyncio_1617163391303/work networkx==2.3 nltk @ file:///home/conda/feedstock_root/build_artifacts/nltk_1618932666659/work nose @ file:///home/conda/feedstock_root/build_artifacts/nose_1602434998960/work notebook @ file:///home/conda/feedstock_root/build_artifacts/notebook_1618415081862/work numba @ file:///home/conda/feedstock_root/build_artifacts/numba_1616707886102/work numexpr @ file:///home/conda/feedstock_root/build_artifacts/numexpr_1614971340455/work numpy==1.19.5 numpydoc @ file:///home/conda/feedstock_root/build_artifacts/numpydoc_1601580905698/work nvidia-ml-py3==7.352.0 oauthlib==3.1.1 olefile @ file:///home/conda/feedstock_root/build_artifacts/olefile_1602866521163/work opencensus==0.8.0 opencensus-context==0.1.2 opencv-python==4.5.1.48 openpyxl @ file:///home/conda/feedstock_root/build_artifacts/openpyxl_1615404969632/work opt-einsum==3.3.0 packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1612459636436/work pandas==1.2.4 pandocfilters==1.4.2 parso==0.7.0 partd @ file:///home/conda/feedstock_root/build_artifacts/partd_1617910651905/work path @ file:///home/conda/feedstock_root/build_artifacts/path_1613949643513/work pathlib2 @ file:///home/conda/feedstock_root/build_artifacts/pathlib2_1610136800280/work pathspec @ file:///home/conda/feedstock_root/build_artifacts/pathspec_1605120834673/work pathtools==0.1.2 patsy==0.5.1 pep517==0.11.0 pep8==1.7.1 pexpect @ file:///home/conda/feedstock_root/build_artifacts/pexpect_1602535608087/work pickleshare @ 
file:///home/conda/feedstock_root/build_artifacts/pickleshare_1602536217715/work Pillow @ file:///home/conda/feedstock_root/build_artifacts/pillow_1617818063846/work pip-tools==6.4.0 pkginfo @ file:///home/conda/feedstock_root/build_artifacts/pkginfo_1611000265564/work pluggy @ file:///home/conda/feedstock_root/build_artifacts/pluggy_1610318077870/work ply==3.11 pooch @ file:///home/conda/feedstock_root/build_artifacts/pooch_1606467285986/work portpicker==1.2.0 poyo==0.5.0 prometheus-client @ file:///home/conda/feedstock_root/build_artifacts/prometheus_client_1617909193315/work prompt-toolkit @ file:///home/conda/feedstock_root/build_artifacts/prompt-toolkit_1616432837031/work protobuf==3.18.0 psutil @ file:///home/conda/feedstock_root/build_artifacts/psutil_1610127095720/work ptyprocess @ file:///home/conda/feedstock_root/build_artifacts/ptyprocess_1609419310487/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl PuLP==2.5.0 py @ file:///home/conda/feedstock_root/build_artifacts/py_1607783655754/work py-spy==0.3.10 pyasn1==0.4.8 pyasn1-modules==0.2.8 pycodestyle @ file:///home/conda/feedstock_root/build_artifacts/pycodestyle_1589305246696/work pycosat @ file:///home/conda/feedstock_root/build_artifacts/pycosat_1610094800877/work pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1593275161868/work pycurl==7.43.0.6 pydantic==1.8.1 pydocstyle @ file:///home/conda/feedstock_root/build_artifacts/pydocstyle_1616176997890/work pyerfa @ file:///home/conda/feedstock_root/build_artifacts/pyerfa_1619385955412/work pyflakes==2.2.0 Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1615243893546/work pylint @ file:///home/conda/feedstock_root/build_artifacts/pylint_1614725146766/work pyls-black @ file:///home/conda/feedstock_root/build_artifacts/pyls-black_1595615126037/work pyls-spyder @ file:///home/conda/feedstock_root/build_artifacts/pyls-spyder_1613487177406/work pynndescent==0.5.2 pyodbc @ 
file:///home/conda/feedstock_root/build_artifacts/pyodbc_1610633331323/work PyOpenGL==3.1.5 pyOpenSSL @ file:///home/conda/feedstock_root/build_artifacts/pyopenssl_1608055815057/work pyparsing==2.4.7 PyQt5==5.15.4 PyQt5-Qt5==5.15.2 PyQt5-sip==12.8.1 PyQtChart==5.12 PyQtWebEngine==5.12.1 pyrsistent @ file:///home/conda/feedstock_root/build_artifacts/pyrsistent_1610146798212/work pysam==0.16.0.1 PySocks @ file:///home/conda/feedstock_root/build_artifacts/pysocks_1610291447907/work pytest==6.2.3 python-dateutil==2.8.0 python-igraph==0.9.1 python-jsonrpc-server @ file:///home/conda/feedstock_root/build_artifacts/python-jsonrpc-server_1599827444631/work python-language-server @ file:///home/conda/feedstock_root/build_artifacts/python-language-server_1607720213724/work python-louvain==0.15 python-slugify @ file:///home/conda/feedstock_root/build_artifacts/python-slugify_1619833713638/work pytz @ file:///home/conda/feedstock_root/build_artifacts/pytz_1612179539967/work PyWavelets @ file:///home/conda/feedstock_root/build_artifacts/pywavelets_1607290812047/work pyxdg==0.26 PyYAML==5.4.1 pyzmq @ file:///home/conda/feedstock_root/build_artifacts/pyzmq_1614611703677/work QDarkStyle @ file:///home/conda/feedstock_root/build_artifacts/qdarkstyle_1617328841504/work qstylizer @ file:///home/conda/feedstock_root/build_artifacts/qstylizer_1619474581534/work/dist/qstylizer-0.2.0-py2.py3-none-any.whl QtAwesome @ file:///home/conda/feedstock_root/build_artifacts/qtawesome_1607463875213/work qtconsole @ file:///home/conda/feedstock_root/build_artifacts/qtconsole_1615946127199/work QtPy==1.9.0 ray==1.6.0 read-roi==1.6.0 redis==3.5.3 regex @ file:///home/conda/feedstock_root/build_artifacts/regex_1617644422046/work regional==1.1.2 requests @ file:///home/conda/feedstock_root/build_artifacts/requests_1608156231189/work requests-cache==0.5.2 requests-oauthlib==1.3.0 rope @ file:///home/conda/feedstock_root/build_artifacts/rope_1618861989517/work rsa==4.7.2 Rtree @ 
file:///home/conda/feedstock_root/build_artifacts/rtree_1610147130400/work ruamel-yaml-conda @ file:///home/conda/feedstock_root/build_artifacts/ruamel_yaml_1611943339799/work ruamel.yaml==0.17.4 ruamel.yaml.clib==0.2.2 s3transfer==0.3.6 scanorama==1.7.1 scanpy==1.7.1 scikit-criteria==0.2.11 scikit-image==0.18.1 scikit-learn @ file:///home/conda/feedstock_root/build_artifacts/scikit-learn_1619628586492/work scipy @ file:///home/conda/feedstock_root/build_artifacts/scipy_1619561901336/work seaborn @ file:///home/conda/feedstock_root/build_artifacts/seaborn-split_1611834504644/work SecretStorage @ file:///home/conda/feedstock_root/build_artifacts/secretstorage_1612911537506/work semantic-version==2.8.5 Send2Trash==1.5.0 showit==1.1.4 simplegeneric==0.8.1 sinfo==0.3.1 singledispatch @ file:///home/conda/feedstock_root/build_artifacts/singledispatch_1614388326733/work sip==4.19.25 six @ file:///home/conda/feedstock_root/build_artifacts/six_1590081179328/work slicedimage==4.1.1 sniffio @ file:///home/conda/feedstock_root/build_artifacts/sniffio_1610318319305/work snowballstemmer @ file:///home/conda/feedstock_root/build_artifacts/snowballstemmer_1611270869511/work sortedcollections @ file:///home/conda/feedstock_root/build_artifacts/sortedcollections_1611011426343/work sortedcontainers @ file:///home/conda/feedstock_root/build_artifacts/sortedcontainers_1605110889605/work soupsieve @ file:///home/conda/feedstock_root/build_artifacts/soupsieve_1597680516047/work spectra==0.0.11 Sphinx @ file:///home/conda/feedstock_root/build_artifacts/sphinx_1616256380770/work sphinx-autodoc-typehints==1.12.0 sphinx-bootstrap-theme==0.8.0 sphinx-gallery==0.10.0 sphinxcontrib-applehelp==1.0.2 sphinxcontrib-devhelp==1.0.2 sphinxcontrib-htmlhelp==1.0.3 sphinxcontrib-jsmath==1.0.1 sphinxcontrib-programoutput==0.17 sphinxcontrib-qthelp==1.0.3 sphinxcontrib-serializinghtml==1.1.4 sphinxcontrib-websupport @ 
file:///home/conda/feedstock_root/build_artifacts/sphinxcontrib-websupport_1597058540894/work spyder @ file:///home/conda/feedstock_root/build_artifacts/spyder_1618606882691/work spyder-kernels @ file:///home/conda/feedstock_root/build_artifacts/spyder-kernels_1617408058532/work SQLAlchemy @ file:///home/conda/feedstock_root/build_artifacts/sqlalchemy_1619745975326/work starfish==0.2.2 statsmodels @ file:///home/conda/feedstock_root/build_artifacts/statsmodels_1612273599609/work stdlib-list==0.8.0 suds-jurko==0.6 sympy==1.5.1 tables @ file:///home/conda/feedstock_root/build_artifacts/pytables_1610156075390/work tabulate==0.8.9 tblib @ file:///home/conda/feedstock_root/build_artifacts/tblib_1616261298899/work tensorboard==2.6.0 tensorboard-data-server==0.6.1 tensorboard-plugin-wit==1.8.0 tensorflow==2.6.0 tensorflow-estimator==2.6.0 termcolor==1.1.0 terminado @ file:///home/conda/feedstock_root/build_artifacts/terminado_1617048646775/work testpath==0.4.4 text-unidecode==1.3 textdistance @ file:///home/conda/feedstock_root/build_artifacts/textdistance_1611934751977/work texttable==1.6.3 threadpoolctl @ file:///tmp/tmp79xdzxkt/threadpoolctl-2.1.0-py3-none-any.whl three-merge @ file:///home/conda/feedstock_root/build_artifacts/three-merge_1595515817927/work tifffile @ file:///home/conda/feedstock_root/build_artifacts/tifffile_1617956128193/work tinycss2 @ file:///home/conda/feedstock_root/build_artifacts/tinycss2_1616270911900/work toml @ file:///home/conda/feedstock_root/build_artifacts/toml_1604308577558/work tomli==1.2.1 tomlkit @ file:///home/conda/feedstock_root/build_artifacts/tomlkit_1614274508290/work toolz @ file:///home/conda/feedstock_root/build_artifacts/toolz_1600973991856/work torch==1.9.0 tornado==6.1 tqdm @ file:///home/conda/feedstock_root/build_artifacts/tqdm_1617680194207/work trackpy==0.4.2 traitlets @ file:///home/conda/feedstock_root/build_artifacts/traitlets_1602771532708/work typed-ast @ 
file:///home/conda/feedstock_root/build_artifacts/typed-ast_1618337591135/work typing-extensions @ file:///home/conda/feedstock_root/build_artifacts/typing_extensions_1602702424206/work ujson @ file:///home/conda/feedstock_root/build_artifacts/ujson_1611250195259/work umap-learn==0.5.1 unicodecsv==0.14.1 Unidecode @ file:///home/conda/feedstock_root/build_artifacts/unidecode_1612533191922/work urllib3 @ file:///home/conda/feedstock_root/build_artifacts/urllib3_1615828766818/work validators==0.18.2 vispy==0.6.6 watchdog @ file:///home/conda/feedstock_root/build_artifacts/watchdog_1610210686147/work wcwidth @ file:///home/conda/feedstock_root/build_artifacts/wcwidth_1600965781394/work webencodings==0.5.1 Werkzeug==1.0.1 whichcraft==0.6.1 widgetsnbextension @ file:///home/conda/feedstock_root/build_artifacts/widgetsnbextension_1605475534911/work wrapt @ file:///home/conda/feedstock_root/build_artifacts/wrapt_1610094884173/work wurlitzer @ file:///home/conda/feedstock_root/build_artifacts/wurlitzer_1617142491918/work xarray==0.16.2 xlrd @ file:///home/conda/feedstock_root/build_artifacts/xlrd_1610224409810/work XlsxWriter @ file:///home/conda/feedstock_root/build_artifacts/xlsxwriter_1619169883177/work xlwt==1.3.0 xmltodict==0.12.0 yapf @ file:///home/conda/feedstock_root/build_artifacts/yapf_1618790300295/work yarl==1.6.3 zict==2.0.0 zipp @ file:///home/conda/feedstock_root/build_artifacts/zipp_1614945704755/work zope.event @ file:///home/conda/feedstock_root/build_artifacts/zope.event_1600479883063/work zope.interface @ file:///home/conda/feedstock_root/build_artifacts/zope.interface_1618486014315/work
@nickeener maybe `conda list` will be more interpretable for me, sorry! I'm looking for your Python version where you ran into the problems; we at least know it's not Python 3.6, because of the scikit-image version! @njmei has some work on removing Python 3.6 support, but I'm not sure it's ready for prime time.
@berl I believe it is v3.8.8 of Python. So I should try it from an environment that uses 3.6?
@nickeener making a new environment with 3.6 will get you to the current CI testing configuration so the existing tests should run. If your code breaks in 3.6, that will be another strong motivator for getting starfish up to full 3.8 support (and probably dropping 3.6 support). In the meantime, you could also try to put together some tests for your new decoding functionality.
I'm also curious to learn more about `ray`; thanks for the introduction!
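The Python 3.6 environment suggested above can be sketched as shell commands (the environment name is arbitrary; `make install-dev` is the repo's dev-dependency Makefile target):

```shell
# Sketch: reproduce the CI's Python 3.6 setup in a fresh conda environment.
conda create -n starfish-py36 python=3.6
conda activate starfish-py36

# From a clone of your starfish fork:
pip install -e .
make install-dev   # dev/test dependencies (flake8, mypy, pytest plugins)
```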
@berl I believe I found the source of the pytest error. I had to install `pytest-cov` in order for it to recognize the `--cov` flag and `pytest-xdist` for it to recognize the `-n` flag. It now runs the tests when I execute `make all`, but a number of them fail, even when I run on a fresh clone of the main starfish repo. Changing to a Python 3.6 environment also does not appear to make a difference. I still need to write some tests for my new additions (I'm new to writing official unit tests like this), but how should I handle these errors in scripts I haven't even touched?
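For anyone hitting the same `unrecognized arguments` error, the missing pieces sketched as shell commands:

```shell
# -n comes from the pytest-xdist plugin and --cov from pytest-cov;
# pytest rejects both flags until the plugins are installed.
pip install pytest-xdist pytest-cov

# The fast-test invocation from above should then parse its arguments.
pytest -v -n 8 --cov starfish -m 'not (slow or napari)'
```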
Hi @nickeener due to pretty large changes in upstream dependencies (Xarray, Numpy, Pandas, scikit-image, etc...) in the last ~1 year or so, a lot of tests and code in this repository ended up needing maintenance. I've put together a PR to address all those problems so it would be best if you could wait a bit until those changes can be merged into this main starfish repository.
looks like this PR is still waiting on travis checks... do you know how to get it to build against the new CI workflow @njmei ? does it just need a new commit?
Just needs a rebase.
@njmei no problem. Just let me know whenever it is ready so I can pull the changes and retry the tests.
@ttung is rebasing something I need to do? Sorry, not super familiar with github stuff.
I'm not sure how your local repo is configured, so it's hard to tell you exactly what commands to run. I copied your changes into #1964 though.
Typically, one would have two remotes configured, and then you'd need to pull the spacetx/starfish remote, rebase your changes on top of that, and then push to your remote (nickeener/starfish).
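A sketch of those commands, assuming the fork remote is named `origin`, the spacetx remote is added as `upstream`, and the default branch is `master` (adjust names to your setup):

```shell
# One-time: add the spacetx repo as a second remote.
git remote add upstream https://github.com/spacetx/starfish.git

# Fetch the latest upstream history and replay your commits on top of it.
git fetch upstream
git rebase upstream/master

# Push the rebased branch back to your fork; history was rewritten,
# so a (lease-protected) force push is needed.
git push --force-with-lease origin <your-branch-name>
```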
@neuromusic It looks like this is what is preventing tests from running. I don't have the permissions:
thanks for the ping @njmei ! approved.
@neuromusic sorry, I forgot to include updates for one file in the previous commit, which caused the linting to fail. It should work now. Could you please rerun?
hmm.. I'm not getting these errors when I run `make all`. @neuromusic are there additional mypy plugins being used here besides the default? I had to install a number for `flake8` and `pytest` that were not included in the default pip install, so I assume this is similar.
@nickeener This is what gets run during the linting step: https://github.com/spacetx/starfish/blob/8741af11de920bafbc2680047340509c2be0a059/.github/workflows/starfish-prod-ci.yml#L36
In terms of how test dependencies are installed: https://github.com/spacetx/starfish/blob/8741af11de920bafbc2680047340509c2be0a059/.github/workflows/starfish-prod-ci.yml#L62
@njmei Thank you so much; the `make install-dev` command was exactly what I needed, and I was able to fix those linting errors. `make all` runs all the way through without errors again. @neuromusic fingers crossed, but it should pass linting and tests now. Could you please rerun it? Thanks!
@nickeener Now that you've been approved to run tests, they should just auto-run anytime you add a new commit.
@njmei awesome thanks! That makes things simpler.
@njmei there appears to be some issue with my approval status, as the tests have not auto-run after my latest commit and there doesn't appear to be any option to manually start the workflow. It still says "First-time contributors need a maintainer to approve running workflows".
Also, I've added the new python library `ray` to the REQUIREMENTS.txt file in the main starfish directory. Do I also need to add it to the `requirements/REQUIREMENTS-CI.txt.in` and `requirements/REQUIREMENTS-CI.txt` files?
@nickeener Hmmm.... yeah, that's annoying that the test runner status looks like it reverted. Unfortunately, I'm a volunteer software engineer unaffiliated with CZI/spacetx, so there's not much I can do on my end. You should be able to get the tests to run from your fork, though; have you checked the `actions` menu in your fork of starfish?
Regarding the PR itself:
I didn't realize that `ray` was a necessary dependency for this method. Adding new dependencies gives me some pause, since starfish (to the best of my knowledge) is intended to work on Windows, Mac, and *nix systems. Reading up a bit more on `ray` (https://docs.ray.io/en/latest/installation.html#windows-support) makes me wonder whether the tests you currently have will play well with the Windows test runners...
There is also a question of maintenance: it looks like `ray` has its own set of complicated dependencies that for now seem okay (though maybe the `scipy` requirement could clash with what starfish now has pinned), but given that there is no dedicated engineering support for starfish, what I worry about the most is a repeat of https://github.com/spacetx/starfish/pull/1963 half a year or a year down the line.
Is there a way to replace `ray` with `multiprocessing`?
Keep in mind, though, that these are just my opinions, and the actual owners/contributors (@neuromusic) will probably need to weigh in...
sorry about the trouble here with the workflow approvals @nickeener ... thanks to @njmei, we are now using Actions for CI, but I didn't realize this was something that needed to be configured. I found the setting and changed the permissions, and hopefully 🤞 I won't need to "approve" your runs anymore
as for adding the `ray` requirement... in addition to `REQUIREMENTS.txt`, to get it to run on CI you'll also need to run `make requirements/REQUIREMENTS-CI.txt` to update that file, which is used to pin the requirements for CI
the maintenance question is definitely a concern... we're currently in a (very) slim "maintenance" mode, but I'm exploring ways to bring on more support for maintenance. as for `ray` specifically... I agree with @njmei that if `multiprocessing` will do the job, that's preferable, since we already use multiprocessing extensively.
however, it's not clear to me that that's even necessary... I'm not familiar with `ray`, but after a quick glance at the docs, it's actually not clear to me how ray is being used here... I see it initialized and shut down, but I don't see it used otherwise (no decorators etc.)? is it necessary?
@neuromusic ray is being initialized in the `CheckAll` class's `run` function and is then used by several of the functions that are imported from the `check_all_funcs.py` script. There are several points where it's advantageous to chunk the data and run it in parallel. I should be able to replace it with `multiprocessing`, though it may take a speed hit: `ray` was designed as a kind of optimized version of `multiprocessing`, so I assume it will be faster, but I haven't tested this. That isn't as much of a problem as it was a week ago, though, as I've made some significant speed improvements to the algorithm since making this PR, and the graph showing the run times is now out of date. It might take me some time to adjust everything, as I'm not familiar with `multiprocessing`.
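As a rough illustration of the swap being discussed, the standard library's `multiprocessing.Pool` covers the chunk-and-map pattern described above. Everything here (the chunking helper, the per-chunk spatial-variance function, the data layout) is a hypothetical stand-in, not code from this PR:

```python
# Sketch only: chunk a list of candidate barcodes (each a list of (z, y, x)
# spot coordinates) and compute a per-barcode spatial variance in parallel,
# replacing a ray-style parallel map with multiprocessing.Pool.
from multiprocessing import Pool

def spatial_variance(chunk):
    """Mean squared distance of each barcode's spots from their centroid."""
    results = []
    for coords in chunk:
        n = len(coords)
        mean = tuple(sum(c[i] for c in coords) / n for i in range(3))
        var = sum(sum((c[i] - mean[i]) ** 2 for i in range(3)) for c in coords) / n
        results.append(var)
    return results

def chunked(items, n_chunks):
    """Split items into roughly equal chunks, one batch per worker task."""
    size = max(1, len(items) // n_chunks)
    return [items[i:i + size] for i in range(0, len(items), size)]

def parallel_variances(barcodes, processes=4):
    chunks = chunked(barcodes, processes)
    with Pool(processes=processes) as pool:
        per_chunk = pool.map(spatial_variance, chunks)
    # Flatten per-chunk results back into one list, in the original order.
    return [v for chunk in per_chunk for v in chunk]
```

The same three-step shape (chunk, `pool.map`, flatten) should apply wherever ray was chunking work, with the usual caveat that the mapped function and its arguments must be picklable.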
> used by several of the functions that are imported from the `check_all_funcs.py` script
aha! that's why my CTRL-F failed me :P
thank you!!
@neuromusic I've updated this PR to replace `ray` with `multiprocessing`. It passes all checks now.
@neuromusic Hey just wanted to give a quick overview of some of the updates I've made to this method since I originally made this PR.
I changed the way it chooses between spot combinations that use the same spot. The previous method simply chose the code that had the minimum spatial variance for its spots; the updated method treats it like a maximum independent set problem, where the goal is to find a set of spot combinations that use each spot only once while using as many of the spots as possible. This is an NP-complete problem, but I was able to leverage the spatial variance of the spots in each possible combination to make it fast. This resulted in a ~30% increase in the total number of mRNA targets that can be identified (in my test data set) while also slightly increasing accuracy (by correlation with smFISH results).
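The selection idea can be sketched as a variance-ordered greedy pass. This is an illustrative approximation under assumed data structures, not this PR's actual code (an exact maximum independent set solver would be intractable in general; the greedy pass trades optimality for speed):

```python
# Hypothetical sketch: pick a conflict-free set of candidate barcodes,
# preferring candidates whose spots cluster tightly in space.
def assign_barcodes(candidates):
    """candidates: list of (spot_ids, spatial_variance, target) tuples,
    where spot_ids is a frozenset of spot identifiers."""
    chosen = []
    used = set()
    # Low variance first: tighter spot clusters are more trustworthy.
    for spot_ids, variance, target in sorted(candidates, key=lambda c: c[1]):
        # Accept a candidate only if none of its spots are already claimed,
        # so every spot ends up used at most once.
        if used.isdisjoint(spot_ids):
            chosen.append(target)
            used |= spot_ids
    return chosen
```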
The following figure is similar to the one in my original post, but I've added a new line for the updated results. The left figure shows the number of transcripts identified by the old and updated methods compared to starfish's nearest-neighbor decoder, while the center and right figures show the accuracy by correlation with values obtained using smFISH (the center figure is just a zoomed-in version of the right figure).
I've also made significant improvements to the run time and memory requirements of the method. This figure shows the run time improvements.
The previous version took over 6 hours to run using a search radius of 2.45, while the current version takes just 51 minutes to do the same while also using half as much memory (unfortunately I don't have figures for memory). I also discovered that the server I've been running many of my tests on is somewhat dated, and therefore slow, so I tested the decoder on my local system and saw another significant drop in run times, with that same run taking only 11 minutes. My local system isn't particularly powerful either, so I expect it would be even faster on a more modern server.
Multiprocessing now uses the python standard library module instead of ray. If you'd like me to make any other changes to get this approved please let me know.
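The ray-to-standard-library swap follows a common pattern: split the work into chunks and map them over a process pool. Below is a minimal sketch of that pattern only; the chunking helper and the per-chunk work are placeholders, not the real decoding functions.

```python
from multiprocessing import Pool

def make_chunks(items, chunk_size):
    # Split a flat list into fixed-size chunks for the worker pool.
    return [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]

def decode_chunk(chunk):
    # Placeholder for the per-chunk decoding work; squaring each
    # value just stands in for the real computation.
    return [x * x for x in chunk]

def decode_parallel(spots, n_procs=4, chunk_size=3):
    # Map chunks over a Pool, then flatten the per-chunk results,
    # mirroring how ray remote tasks map onto Pool.map.
    chunks = make_chunks(spots, chunk_size)
    with Pool(n_procs) as pool:
        results = pool.map(decode_chunk, chunks)
    return [r for chunk in results for r in chunk]

if __name__ == "__main__":
    print(decode_parallel(list(range(10))))
```

One practical difference from ray: `Pool.map` pickles arguments and results between processes, so large shared inputs (like the full spot table) are cheaper to pass once per worker than once per chunk.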
It appears to be failing one of the tests now (Docker Smoketest). Are there recent updates that I need to pull to my fork?
Replaced by #1978
This PR adds a new spot-based decoding method, the CheckAll decoder, based on the method described here (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6046268/). It is capable of detecting several times more true targets from seqFISH image data than current spot-based methods in starFISH (PerRoundMaxChannel), because barcodes are not restricted to exactly matching spots or only nearest-neighbor spots, and because it assembles barcodes from spots in every round instead of a single arbitrary anchor round. It is also capable of utilizing error-correction rounds in the codebook, which current starFISH methods do not consider.
Summary of algorithm:
Inputs:
- spots: starFISH SpotFindingResults object
- codebook: starFISH Codebook object
- filter_rounds: number of rounds a barcode must be identified in to pass filters
- error_rounds: number of error-correction rounds built into the codebook (i.e., the number of rounds that can be dropped from a barcode while still uniquely matching a single target in the codebook)
Tests of the CheckAll decoder vs. starFISH's PerRoundMaxChannel method (with the nearest neighbor trace-building strategy) show improved performance with the CheckAll decoder. All of the following tests used seqFISH image data from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6046268/.
Note: PRMC NN = PerRoundMaxChannel Nearest Neighbor
The x-axis of each figure above marks the search radius parameter used by either decoding method (the maximum distance spots can be from a reference spot and still form a potential barcode), in increments of increasing symmetric neighborhood size (in 3D). The left figure shows the total number of decoded transcripts assigned to a cell for each method (note: for the CheckAll decoder this includes partial barcodes, i.e. codes that did not use all rounds in decoding, which the PerRoundMaxChannel method does not consider). Depending on the search radius, the CheckAll decoder yields as much as a 442% increase in total decoded barcodes over PerRoundMaxChannel.
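The search radius amounts to collecting, for each reference spot, every spot within a fixed 3D distance; those neighbors form the candidate pool the decoder searches. A toy sketch of that neighborhood query in plain Python (the real implementation presumably uses a spatial index rather than a linear scan; spot coordinates and the 2.45 radius are illustrative):

```python
def neighbors_within(spots, center, radius):
    # Return indices of spots within `radius` of `center` in 3D.
    # Comparing squared distances avoids a sqrt per spot.
    r2 = radius ** 2
    return [
        i for i, (z, y, x) in enumerate(spots)
        if (z - center[0]) ** 2 + (y - center[1]) ** 2 + (x - center[2]) ** 2 <= r2
    ]

spots = [(0, 0, 0), (0, 1, 2), (1, 1, 1), (3, 3, 3)]
print(neighbors_within(spots, (0, 0, 0), 2.45))  # [0, 1, 2]
```

As the radius grows, each reference spot gains neighbors in every round, so the number of candidate barcodes (and hence run time) grows combinatorially.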
To assess the accuracy of either decoding method, I used orthologous smFISH data that was available from the same samples for several dozen of the same genes probed in the seqFISH experiment. Using this data, I calculated the Pearson correlation coefficient between the smFISH data and the results of decoding the seqFISH data with either method (note: because the targets in this dataset were introns (see paper), the values correlated were the calculated burst frequencies for each gene (how often/fast transcription is cycled on/off) rather than counts). The results are shown in the center figure above, with the right-hand figure showing the same data zoomed out to a 0-1 range. The starFISH PerRoundMaxChannel method does achieve a higher accuracy on this test, but the difference is not significant and comes at the cost of detecting far fewer barcodes. (Note: missing values at the lower end of the x-axis are due to not having enough results to calculate the burst frequency of the transcripts.)
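For reference, the Pearson correlation used in this comparison can be computed as below. This is a self-contained sketch with made-up toy values standing in for the per-gene burst frequencies, not the actual data.

```python
from math import sqrt

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length
    # sequences: covariance divided by the product of standard
    # deviations (computed without normalizing constants, which cancel).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy burst-frequency values for the same genes under both assays.
seqfish_vals = [0.2, 0.5, 0.9, 1.4]
smfish_vals = [0.25, 0.45, 1.0, 1.3]
print(round(pearson(seqfish_vals, smfish_vals), 3))
```

In practice `scipy.stats.pearsonr` gives the same coefficient plus a p-value; the hand-rolled version is shown only to make the formula explicit.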
Unlike current starFISH methods, the CheckAll decoder is capable of taking advantage of error-correction rounds built into the codebook. As an example, say an experiment is designed with a 5-round codebook whose codes are constructed so that any 4 of those rounds are enough to uniquely match a barcode to a target. The additional round is considered an error-correction round: you may be able to uniquely identify a barcode as a specific target with only 4 rounds, but if you can also use the fifth round you can be extra confident that the spot combination making up the barcode is correct. This method is based on a previous pull request made by a colleague of mine (https://github.com/ctcisar/starfish/pull/1).
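The property described above can be checked on a toy codebook: a codebook tolerates dropping `error_rounds` rounds exactly when every such partial code still maps to a unique target. A simplified sketch, where a plain dict of per-round channel indices stands in for starFISH's Codebook object:

```python
from itertools import combinations

def is_error_correcting(codebook, error_rounds=1):
    """Return True if every barcode still matches a unique target
    after dropping any `error_rounds` rounds.

    `codebook` maps target name -> tuple of channel indices per round
    (a simplified stand-in for starFISH's Codebook object).
    """
    n_rounds = len(next(iter(codebook.values())))
    keep = n_rounds - error_rounds
    # Try every way of keeping `keep` rounds; if two targets collide
    # on any partial code, the codebook cannot correct that dropout.
    for rounds in combinations(range(n_rounds), keep):
        partial = {}
        for target, code in codebook.items():
            key = tuple(code[r] for r in rounds)
            if key in partial:
                return False
            partial[key] = target
    return True

# Toy 3-round, 2-channel codebook that tolerates one dropped round.
codebook = {"geneA": (0, 0, 1), "geneB": (1, 1, 0)}
print(is_error_correcting(codebook, error_rounds=1))  # True
```

A codebook where two targets share a code on some round subset (e.g. `{"geneA": (0, 0, 1), "geneB": (0, 0, 0)}`) fails this check, since dropping the last round makes the two indistinguishable.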
The above figures show results similar to the first figure, except the results of the CheckAll decoder have been split between barcodes that were made using spots in all rounds (error correction) and those that only had a partial match (no correction). Even without considering error correction, the CheckAll decoder detects as many as 181% more barcodes than the PerRoundMaxChannel method. The smFISH correlations are as expected, with error-corrected barcodes achieving a higher correlation score with the smFISH data than uncorrected ones. Whether a barcode in the final DecodedIntensityTable uses an error-correction round can be read from the new "rounds_used" field, which gives the number of rounds used to build each barcode in the table. This allows easy separation of the data into higher- and lower-confidence calls. Additionally, the distance field of the DecodedIntensityTable is no longer based on the intensity of the spots in each barcode but is instead the sum of variances of the spatial coordinates of the spots in the barcode. This can also be used as a filter in many cases, as barcodes made of more tightly clustered spots may be more likely to be true targets.
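The new distance score (sum of per-axis variances of a barcode's spot coordinates) can be sketched as follows: tighter spot clusters get lower scores, which is what makes it usable as a confidence filter. This is a toy illustration, not the starfish implementation.

```python
def spatial_variance(coords):
    # Distance score in the spirit of the updated DecodedIntensityTable
    # field: the sum of per-axis (population) variances of the spot
    # coordinates making up one barcode.
    n = len(coords)
    total = 0.0
    for axis in range(len(coords[0])):
        vals = [c[axis] for c in coords]
        mean = sum(vals) / n
        total += sum((v - mean) ** 2 for v in vals) / n
    return total

# A tightly clustered barcode scores lower than a spread-out one.
tight = [(0, 0, 0), (0, 1, 0), (1, 0, 1)]
loose = [(0, 0, 0), (0, 5, 0), (5, 0, 5)]
print(spatial_variance(tight) < spatial_variance(loose))  # True
```

Filtering the decoded table then reduces to thresholding on this score together with "rounds_used", keeping, say, full-round barcodes with low spatial variance as the high-confidence set.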
The major downside to the CheckAll decoder is its speed. This is no surprise, as it searches the entire possible barcode space for every spot in all rounds instead of just the nearest neighbors of spots in a single round, and the possible barcode space grows quickly as the search radius increases, which can significantly increase run times. To address this, I've added the ability to parallelize the program and run multiple chunks simultaneously using the python module ray, though even with this parallelization, run times for CheckAll are much higher than for PerRoundMaxChannel. The above figure shows the run time in minutes for the CheckAll decoder (using 16 threads) vs. PerRoundMaxChannel with nearest neighbors (note: the seqFISH dataset used here is among the larger ones available, at 5 rounds, 12 channels, and over 10,000 barcodes in the codebook, so for most other seqFISH datasets I expect run times will be considerably lower than shown here; unfortunately I did not have access to another suitable seqFISH dataset to test on). Ongoing work is being done to optimize the method and bring run times down. I was unable to figure out how to correctly add ray to the requirements file, so that will still need to be done.