-
Hi, I'm installing the tool to build the 3D mesh inside a conda environment where I've installed torch with CUDA 11.8:
torch 2.1.2+cu118
but when I run the command:
pip install g…
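As background for mismatches like this, it can help to confirm which CUDA toolkit the installed wheel was actually built against. The snippet below is a small illustrative helper (not part of any package) that parses a PyTorch local version tag such as `2.1.2+cu118`; in a live environment you would pass it `torch.__version__`.

```python
# Hypothetical helper: split a PyTorch version string like "2.1.2+cu118"
# into its base version and the CUDA toolkit it was built against.
import re

def parse_torch_version(version: str):
    """Return (base_version, cuda_version) for a torch version string."""
    match = re.fullmatch(r"([\d.]+)\+cu(\d+)", version)
    if match is None:
        return version, None          # CPU-only build: no +cuXXX suffix
    base, cu = match.groups()
    cuda = f"{cu[:-1]}.{cu[-1]}"      # "118" -> "11.8"
    return base, cuda

print(parse_torch_version("2.1.2+cu118"))  # ('2.1.2', '11.8')
```

If the CUDA version reported here disagrees with the toolkit the extension is compiled against, the build will typically fail or produce a broken module.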
-
AMD GPUs are seeing increased adoption. ROCm has nice compatibility layers with PyTorch, too. Plus, ROCm-SMI (apparently) has all the energy-related management APIs we need -- measuring power and ener…
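Whatever tool supplies the power readings, turning sampled power into energy is the same arithmetic everywhere. Below is a minimal sketch, independent of the actual ROCm-SMI API, that integrates periodic (timestamp, watts) samples into joules with the trapezoidal rule; the sample pairs stand in for whatever the management tool reports.

```python
# Sketch (assumptions): integrate periodic GPU power samples (watts) into
# energy (joules) with the trapezoidal rule. In practice the samples would
# come from a management tool such as rocm-smi; here they are plain tuples.
def energy_joules(samples):
    """samples: list of (time_seconds, power_watts), sorted by time."""
    total = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        total += (p0 + p1) / 2.0 * (t1 - t0)
    return total

# 1 second at a constant 200 W -> 200 J
print(energy_joules([(0.0, 200.0), (0.5, 200.0), (1.0, 200.0)]))  # 200.0
```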
-
### Describe the issue
According to the instructions on binding a PyTorch tensor in the official documentation, I tried to bind a tensor residing on a CUDA device, and then interrupted directly after runni…
-
root@localhost:/home/ocrtrain/train/ocr/warp-ctc/pytorch_binding# python setup.py install
running install
running bdist_egg
running egg_info
creating warpctc_pytorch.egg-info
writing warpctc_pyto…
-
### System Info
While working on [GPTQModel](https://github.com/modelcloud/gptqmodel), which does GPTQ quantization of HF models and loads each layer onto the GPU, quantizes it, and then moves the layer back to …
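The layer-by-layer pattern described here can be sketched as follows. This is illustrative only, not GPTQModel's actual code: a stand-in `Layer` class and a toy round-to-nearest step replace real torch modules and the GPTQ solver, but the move-quantize-move loop is the same shape.

```python
# Illustrative sketch of layer-by-layer quantization: stage one layer on the
# GPU at a time, quantize it, and move it back to free GPU memory.
class Layer:
    def __init__(self, weights):
        self.weights = list(weights)
        self.device = "cpu"

    def to(self, device):
        self.device = device          # real code would copy tensors here
        return self

def quantize(layer, scale=0.5):
    """Toy round-to-nearest quantization standing in for the GPTQ solver."""
    layer.weights = [round(w / scale) * scale for w in layer.weights]
    return layer

def quantize_model(layers):
    for layer in layers:
        layer.to("cuda")              # only one layer resident on the GPU
        quantize(layer)
        layer.to("cpu")               # release it before the next layer
    return layers

model = [Layer([0.30, 1.20]), Layer([-0.74, 0.26])]
quantize_model(model)
print([layer.weights for layer in model])
```

The point of the pattern is peak-memory control: at any moment only one layer occupies GPU memory, which is what makes quantizing models far larger than VRAM feasible.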
-
Torch was not built with CUDA support, not building warp-ctc GPU extensions.
running install
running bdist_egg
running egg_info
creating warpctc_pytorch.egg-info
writing warpctc_pytorch.egg-info/…
-
Inference time for ONNX Runtime GPU starts reversing (increasing) from batch size 128 onwards
**System information**
OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04
…
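A minimal batch-size sweep of the kind used to spot such a reversal can be sketched like this. The `run_batch` function is a pure-Python stand-in (an assumption, not the reporter's setup); substituting a real `onnxruntime` session call would reproduce the measurement.

```python
# Sketch (assumptions): time a workload at several batch sizes and report
# per-sample latency. run_batch is a dummy standing in for session.run.
import time

def run_batch(batch_size):
    # Dummy workload standing in for model inference on one batch.
    return sum(i * i for i in range(batch_size * 1000))

def latency_per_sample(batch_sizes, repeats=3):
    results = {}
    for bs in batch_sizes:
        start = time.perf_counter()
        for _ in range(repeats):
            run_batch(bs)
        elapsed = time.perf_counter() - start
        results[bs] = elapsed / (repeats * bs)   # seconds per sample
    return results

for bs, t in latency_per_sample([32, 64, 128, 256]).items():
    print(f"batch {bs:4d}: {t * 1e6:8.2f} us/sample")
```

Healthy batching shows per-sample latency falling as batch size grows; the reversal in the report is the point where that curve turns back up.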
-
### Checklist
- [X] I have searched for [similar issues](https://github.com/isl-org/Open3D/issues).
- [x] For Python issues, I have tested with the [latest development wheel](https://www.open3d.org/d…
-
I am a PyTorch user, and I am curious whether this issue generally occurs in every CTC implementation.
Originally, this issue was posted in https://github.com/SeanNaren/deepspeech.pytorch/issues/250.
I am …
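The excerpt is truncated, but for context, the behavior all CTC implementations should share is the label-collapse rule: merge consecutive repeats, then drop blanks. A minimal sketch of that rule (not tied to any one CTC library) is:

```python
# Standard CTC collapse rule: remove consecutive duplicate labels,
# then remove blanks (label 0 here, a common convention).
def ctc_collapse(labels, blank=0):
    """Greedy CTC decode step for one sequence of argmax labels."""
    out = []
    prev = None
    for label in labels:
        if label != prev and label != blank:
            out.append(label)
        prev = label
    return out

print(ctc_collapse([0, 1, 1, 0, 1, 2, 2, 0]))  # [1, 1, 2]
```

Differences between CTC backends usually come down to conventions around this rule, such as which index is reserved for the blank label.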
-
I'm looking to build an automatic differentiation library for TPUs without using high-level front-ends like TensorFlow/JAX/PyTorch-XLA, but I'm finding information about lower-level TPU usage is pract…
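Independent of the TPU specifics, the core of such a library is reverse-mode autodiff, which is small enough to sketch in plain Python. The scalar `Var` class below is a minimal illustration (no hardware backend, naive recursive backward pass), showing the tape-free parent-pointer formulation a from-scratch implementation might start from.

```python
# Minimal reverse-mode autodiff sketch: scalars only, gradients accumulated
# by walking parent pointers backward from the output.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents        # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x                          # z = x*y + x
z.backward()
print(x.grad, y.grad)                  # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```

A production version would replace the recursion with a topological-order sweep and the scalar ops with calls into the accelerator's linear-algebra primitives, but the chain-rule bookkeeping stays the same.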