marella/ctransformers
Python bindings for Transformer models implemented in C/C++ using the GGML library.
MIT License · 1.8k stars · 137 forks
Issues
#213 Link in readme broken · by KansaiUser, opened 1 month ago · 0 comments
#212 model not loading on GPU · by kot197, opened 2 months ago · 0 comments
#211 Problem accessing libctransformers.so · by Ajayvenki, opened 2 months ago · 1 comment
#210 Support for Llama3 · by gultar, opened 5 months ago · 2 comments
#209 OSError: .......cannot open shared object file: No such file or directory · by Saurabh11811, opened 5 months ago · 3 comments
#208 Error when trying to run on kali linux · by Kuro0911, opened 5 months ago · 0 comments
#207 Add Support for Google/Gemma-2b-it · by Arya920, closed 5 months ago · 0 comments
#206 GGUF MODEL INFERENCE · by Bamore01, opened 7 months ago · 0 comments
#205 Inputting embeddings directly · by liechtym, opened 7 months ago · 0 comments
#204 Does ctransformers support ollama models? · by PriyaranjanMarathe, opened 7 months ago · 1 comment
#203 Add support for Google's Gemma models · by gultar, opened 7 months ago · 0 comments
#202 Does ctransformers boost the inference speed in llm inference? · by pradeepdev-1995, opened 7 months ago · 0 comments
#201 How to load the finetuned model in safetensors format (not in gguf) · by pradeepdev-1995, opened 7 months ago · 0 comments
#200 fix: context_length has no effect · by chosen-ox, opened 8 months ago · 0 comments
#199 Cannot generate text on GPU · by congson1293, opened 9 months ago · 3 comments
#198 Not working with gpu_layers · by MNekoRain, closed 9 months ago · 2 comments
#197 Add support for Microsoft Phi-2 · by niutech, opened 9 months ago · 0 comments
#196 Unsupported Model : Zephyr 'stablelm' GGUF · by Jonathanjordan21, opened 9 months ago · 0 comments
#195 Pulling models outside of hf? · by shell-skrimp, opened 9 months ago · 0 comments
#194 Multimodal models compatibility · by ParisNeo, opened 9 months ago · 0 comments
#193 precompiled rocm and metal wheels · by ParisNeo, opened 9 months ago · 0 comments
#192 Infinite token generation · by yukiarimo, closed 9 months ago · 1 comment
#191 GPTQ models are not respecting context_length or max_seq_len settings · by chrsbats, opened 10 months ago · 0 comments
#190 OSError: [WinError 1114] A dynamic link library (DLL) initialization routine failed · by saurabhbluebenz, opened 10 months ago · 0 comments
#189 Unclear error: GGML_ASSERT: D:\a\ctransformers\ctransformers\models\ggml/llama.cpp:453: data · by deveolper, opened 10 months ago · 3 comments
#188 request feature: RAG of local docs · by nimzodisaster, opened 10 months ago · 0 comments
#187 Fine-tuning option? · by yukiarimo, opened 10 months ago · 0 comments
#186 OSError: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found · by luohao123, opened 10 months ago · 1 comment
#185 Build error with GCC12: /usr/lib/gcc/aarch64-linux-gnu/12/include/arm_neon.h:29182:1: error: inlining failed in call to ‘always_inline’ ‘vfmaq_f16’: target specific option mismatch · by denverdino, opened 10 months ago · 0 comments
#184 NotImplementedError when using CTransformers AutoTokenizer · by NasonZ, opened 10 months ago · 3 comments
#183 Segfault with DeepSeek GGUF models · by freckletonj, closed 10 months ago · 3 comments
#182 Request: `stopping_criteria` · by freckletonj, opened 10 months ago · 0 comments
#181 Mistral Sliding Window Attention (SWA) · by stygmate, opened 10 months ago · 1 comment
#180 Something wrong with a generator · by yukiarimo, closed 10 months ago · 1 comment
#179 [Question] Run CTransformer with oracle linux server hits error with libctransformers.so · by guanw, closed 10 months ago · 1 comment
#178 Adapt to Upcoming pip Behavior Change for --no-binary Option · by wenboown, opened 10 months ago · 0 comments
#177 Model not loading on GPU · by AndreaLombax, opened 10 months ago · 1 comment
#176 core dumped / segmentation fault · by lysa324, opened 11 months ago · 1 comment
#175 Everything OK? Abandoned? · by TheBloke, opened 11 months ago · 10 comments
#174 Will rwkv be supported? · by lin-calvin, opened 11 months ago · 0 comments
#173 Out of memory exits process · by kczimm, opened 11 months ago · 0 comments
#172 Slow + No config options · by yukiarimo, closed 10 months ago · 2 comments
#171 Support for cuda 11.8 and above · by sujeendran, opened 11 months ago · 0 comments
#170 Cuda 11.8 · by JeanChristopheMorinPerso, opened 11 months ago · 1 comment
#169 Mistral-7b Error: AttributeError: 'str' object has no attribute 'tolist' · by JannikSchneider12, opened 11 months ago · 1 comment
#168 Requesting support for 'CausalLM' models · by SixftOne, opened 11 months ago · 0 comments
#167 AutoModelForCausalLM.from_pretrained(.., gpu_layers=..) gives Windows Error 0xc000001d · by JeremyBickel, opened 11 months ago · 1 comment
#166 logprobs are greater than 0 · by RevanthRameshkumar, opened 11 months ago · 0 comments
#165 CUDA error 222 at D:\a\ctransformers\ctransformers\models\ggml\ggml-cuda.cu:6045: the provided PTX was compiled with an unsupported toolchain. · by AnhNgDo, opened 11 months ago · 2 comments
#164 How to handle the token limitation for a LLM response? · by phoenixthinker, opened 11 months ago · 2 comments