-
### Your current environment
```text
The output of `python collect_env.py`
Collecting environment information...
PyTorch version: 2.3.0+cu121
Is debug build: False
CUDA used to build PyTorch: 12…
-
Some llama integration tests (e.g. `test_llamacpp_various_regexes`) fail with llama-cpp-python >= 0.2.38; this needs further investigation.
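A minimal reproduction sketch, assuming the failing test is run with pytest from the repository root; the package name and version pin come from the note above, everything else is an assumption:
```bash
# Pin the first release reported to break the integration tests.
pip install "llama-cpp-python==0.2.38"
# Run only the affected test; -k selects by test name, -x stops at the first failure.
pytest -k test_llamacpp_various_regexes -x -vv
```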
-
### Your current environment
PyTorch version: 2.3.0+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A
OS: Ubuntu 22.04.4 LTS (x86_64)
GCC version: (U…
-
### Your current environment
```text
Collecting environment information...
WARNING 06-13 12:05:09 _custom_ops.py:14] Failed to import from vllm._C with ModuleNotFoundError("No module named 'vllm._C…
-
### Your current environment
H100 (but I believe it happens on any machine)
### 🐛 Describe the bug
```
--enable-chunked-prefill --max-num-batched-tokens 2048 --kv-cache-dtype "fp8"
```
S…
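For context, a sketch of how these flags might be combined into a full server launch; only the three flags come from the excerpt above, while the entrypoint and model name are placeholder assumptions:
```bash
# Hypothetical invocation; substitute the model actually used in the report.
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Meta-Llama-3-8B-Instruct \
    --enable-chunked-prefill \
    --max-num-batched-tokens 2048 \
    --kv-cache-dtype fp8
```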
-
### Reminder
- [X] I have read the README and searched the existing issues.
### System Info
- `llamafactory` version: 0.8.2.dev0
- Platform: Linux-5.15.0-69-generic-x86_64-with-glibc2.31
- …
-
Application Basics
------------------
Name: Firefox
Version: 129.0
Build ID: 20240801122119
Distribution ID: mozilla-flatpak
Update Channel: release
User Agent: Mozilla/5.0 (X11; Linux x86_64…
-
### Your current environment
```
PyTorch version: 2.1.2
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A
OS: Ubuntu 20.04.6 LTS (x86_64)
GCC version: (Ub…
-
## Motivation
Currently, extension methods do not support adding static methods/factory constructors. But this is a missed opportunity!
There are many situations where semantically we want a sta…
-
I have [a long comment](https://users.rust-lang.org/t/rust-as-a-high-level-language/4644/72) at the Rust forum (some of the inspiration came from @keean linking me to Sean Parent's [video](https://you…