Closed: hadim closed this PR 2 years ago.
Hi! This is the friendly automated conda-forge-linting service.
I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.
@scopatz @jakirkham do you mind having a quick look whenever you have time? (this is not urgent at all)
I am trying to build this package against CUDA and PyTorch. Here is the setup.py file used:
import importlib

from setuptools import setup, find_packages

ext_modules = []
cmdclass = {}

# Check whether torch is available
if importlib.util.find_spec("torch") is not None:
    import torch

    # Check whether CUDA is available
    if torch.cuda.is_available():
        from torch.utils.cpp_extension import BuildExtension, CUDAExtension

        cuda_version = float(torch.version.cuda)
        print(f"Building with CUDA support for CUDA: {cuda_version}")

        nvcc_args = [
            "-gencode=arch=compute_50,code=sm_50",
            "-gencode=arch=compute_60,code=sm_60",
            "-gencode=arch=compute_61,code=sm_61",
            "-gencode=arch=compute_70,code=sm_70",
            "-Xptxas=-v",
            "--expt-extended-lambda",
            "-use_fast_math",
        ]

        if cuda_version >= 10:
            nvcc_args.append("-gencode=arch=compute_75,code=sm_75")
        if cuda_version >= 11:
            nvcc_args.append("-gencode=arch=compute_80,code=sm_80")
        if cuda_version >= 11.1:
            nvcc_args.append("-gencode=arch=compute_86,code=sm_86")

        ext_modules = [
            CUDAExtension(
                name="cuaev",
                sources=["torchani/extension/aev.cu"],
                extra_compile_args={"cxx": ["-std=c++14"], "nvcc": nvcc_args},
            )
        ]
        cmdclass = {"build_ext": BuildExtension}

with open("README.md", "r") as fh:
    long_description = fh.read()

setup(
    name="torchani",
    description="PyTorch implementation of ANI",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/aiqm/torchani",
    author="Xiang Gao",
    author_email="qasdfgtyuiop@gmail.com",
    license="MIT",
    packages=find_packages(),
    include_package_data=True,
    use_scm_version=True,
    setup_requires=["setuptools_scm"],
    install_requires=["torch", "lark-parser", "requests"],
    ext_modules=ext_modules,
    cmdclass=cmdclass,
)
The idea is to build the package both without CUDA (no extensions are built) and with CUDA; the CUDA kernel also needs to be built against PyTorch.
The build seems to work locally, but the PyTorch version selected during the build phase is the CPU-only build rather than the CUDA build, even when cuda_compiler_version != "None". As a consequence, pip install does not trigger any CUDA build (see setup.py for the logic).
Could you check that the CUDA logic in the recipe is correct?
Also, my understanding is that once the build is working, conda will automatically select the correct torchani variant according to whether the conda env has CUDA installed, falling back to the cuda=None variant if no cudatoolkit is detected (or if cpuonly is installed?). Can you confirm my understanding is correct here?
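For context, on conda-forge the CPU/CUDA split is typically driven by preprocessing selectors on cuda_compiler_version in meta.yaml. A hedged sketch of what the relevant fragment might look like (illustrative only, not the actual torchani recipe; the pytorch build-string pin is an assumption about how GPU builds are labeled):

```yaml
# Illustrative sketch only; not the actual torchani recipe.
requirements:
  build:
    - "{{ compiler('cxx') }}"
    # only pull in the CUDA compiler for the CUDA variants
    - "{{ compiler('cuda') }}"  # [cuda_compiler_version != "None"]
  host:
    - python
    - pytorch
    # assumption: request a GPU-enabled pytorch build string for CUDA variants
    - pytorch =*=cuda*  # [cuda_compiler_version != "None"]
  run:
    - pytorch
```

If something like this is in place, the solver should pick the variant whose pins match the environment at install time, which is what the question above is getting at.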
@conda-forge-admin please rerender
Hi! This is the friendly automated conda-forge-linting service.
I wanted to let you know that I linted all conda-recipes in your PR (recipe) and found some lint.
Here's what I've got...
For recipe:
The recipe must have a build/number section.

Hi! This is the friendly automated conda-forge-linting service.
I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.
@conda-forge-admin please rerender
Hi! This is the friendly automated conda-forge-webservice. I tried to rerender for you, but it looks like there was nothing to do.
@conda-forge-admin please rerender
Hi! This is the friendly automated conda-forge-webservice. I tried to rerender for you, but it looks like there was nothing to do.
I feel like it's not possible to build against torch/extension.h, since the "correct" pytorch packages are on neither the conda-forge nor the defaults channel.
@conda-forge-admin please rerender
Hi! This is the friendly automated conda-forge-webservice. I tried to rerender for you, but it looks like there was nothing to do.
OK, even adding the pytorch channel doesn't work. I am clueless at this point :-D
Try to follow what is being done for this recipe https://github.com/conda-forge/staged-recipes/pull/13659/files?
Hi! This is the friendly automated conda-forge-linting service.
I wanted to let you know that I linted all conda-recipes in your PR (recipe) and found some lint.
Here's what I've got...
For recipe:
Hi! This is the friendly automated conda-forge-linting service.
I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.
Closing in favour of #2.