janhq/cortex.llamacpp
cortex.llamacpp is a high-efficiency C++ inference engine for edge computing. It is a dynamic library that can be loaded by any server at runtime.
License: GNU Affero General Public License v3.0 · 22 stars · 3 forks
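Because the engine ships as a dynamic library rather than a standalone server, a host process typically loads it at runtime through the platform's dynamic loader. The C++ snippet below is a minimal sketch of that pattern on Linux using dlopen/dlsym; the library file name (libengine.so) and the get_engine entry point are illustrative assumptions, not documented cortex.llamacpp symbols.

```cpp
// Minimal sketch: load an engine shared library at runtime and resolve an
// entry point. Library name and symbol are assumptions for illustration.
#include <dlfcn.h>
#include <cstdio>

int main() {
  // Assumed file name; a real host would use the path it installed the engine to.
  void* handle = dlopen("./libengine.so", RTLD_LAZY);
  if (!handle) {
    std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
    return 1;
  }

  // Hypothetical C entry point exported by the library.
  using CreateFn = void* (*)();
  auto create = reinterpret_cast<CreateFn>(dlsym(handle, "get_engine"));
  if (!create) {
    std::fprintf(stderr, "dlsym failed: %s\n", dlerror());
    dlclose(handle);
    return 1;
  }

  void* engine = create();  // the host server would drive this engine object
  (void)engine;

  dlclose(handle);
  return 0;
}
```

Compile with `g++ main.cpp -ldl`; the same idea applies on other platforms via their respective loaders (LoadLibrary on Windows, dlopen on macOS).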
Issues
#296 · Update llama.cpp submodule to latest release b4132 · jan-service-account · opened 5 hours ago · 0 comments
#295 · Update llama.cpp submodule to latest release b4122 · jan-service-account · opened 1 day ago · 0 comments
#294 · Update llama.cpp submodule to latest release b4118 · jan-service-account · closed 2 days ago · 0 comments
#293 · Update llama.cpp submodule to latest release b4082 · jan-service-account · closed 4 days ago · 0 comments
#292 · Update llama.cpp submodule to latest release b4079 · jan-service-account · closed 4 days ago · 0 comments
#291 · Update llama.cpp submodule to latest release b4071 · jan-service-account · closed 4 days ago · 0 comments
#290 · Update llama.cpp submodule to latest release b4067 · jan-service-account · closed 4 days ago · 0 comments
#289 · Fix/log probs · nguyenhoangthuan99 · closed 1 week ago · 0 comments
#288 · Update llama.cpp submodule to latest release b4066 · jan-service-account · closed 1 week ago · 0 comments
#287 · Update llama.cpp submodule to latest release b4062 · jan-service-account · closed 1 week ago · 0 comments
#286 · Update llama.cpp submodule to latest release b4050 · jan-service-account · closed 1 week ago · 0 comments
#285 · Update llama.cpp submodule to latest release b4042 · jan-service-account · closed 1 week ago · 0 comments
#284 · Update llama.cpp submodule to latest release b4038 · jan-service-account · closed 1 week ago · 0 comments
#283 · Add openai compatible embedding · nguyenhoangthuan99 · closed 1 week ago · 0 comments
#282 · Update llama.cpp submodule to latest release b4034 · jan-service-account · closed 1 week ago · 0 comments
#281 · Update llama.cpp submodule to latest release b4027 · jan-service-account · closed 1 week ago · 0 comments
#280 · Update llama.cpp submodule to latest release b4019 · jan-service-account · closed 1 week ago · 0 comments
#279 · fix: backward compatible for nprobs · nguyenhoangthuan99 · closed 1 week ago · 0 comments
#278 · Update llama.cpp submodule to latest release b4007 · jan-service-account · closed 2 weeks ago · 0 comments
#277 · Update llama.cpp submodule to latest release b3998 · jan-service-account · closed 1 week ago · 0 comments
#276 · support OpenAI compatible log probs · nguyenhoangthuan99 · closed 2 weeks ago · 0 comments
#275 · Update llama.cpp submodule to latest release b3995 · jan-service-account · closed 2 weeks ago · 0 comments
#274 · feat: support multiple choices · nguyenhoangthuan99 · closed 2 weeks ago · 0 comments
#273 · fix: check nullptr for async_file_logger_ · vansangpfiev · closed 2 weeks ago · 0 comments
#272 · Update llama.cpp submodule to latest release b3989 · jan-service-account · closed 2 weeks ago · 0 comments
#271 · Update llama.cpp submodule to latest release b3985 · jan-service-account · closed 3 weeks ago · 0 comments
#270 · Feat/support logit bias · nguyenhoangthuan99 · closed 3 weeks ago · 0 comments
#269 · feat: support stream_option for OpenAI API compatible · vansangpfiev · closed 3 weeks ago · 0 comments
#268 · Update llama.cpp submodule to latest release b3982 · jan-service-account · closed 3 weeks ago · 0 comments
#267 · Update llama.cpp submodule to latest release b3976 · jan-service-account · closed 3 weeks ago · 0 comments
#266 · Update llama.cpp submodule to latest release b3972 · jan-service-account · closed 3 weeks ago · 0 comments
#265 · feat: [support stream_option for OpenAI API compatible] · nguyenhoangthuan99 · closed 1 week ago · 1 comment
#264 · feat: [support return multiple choices] · nguyenhoangthuan99 · closed 2 weeks ago · 3 comments
#263 · feat: [support logit_bias for OpenAI API compatible] · nguyenhoangthuan99 · closed 3 weeks ago · 1 comment
#262 · feat: [support log prob like OpenAI API] · nguyenhoangthuan99 · closed 2 weeks ago · 5 comments
#261 · Update llama.cpp submodule to latest release b3967 · jan-service-account · closed 3 weeks ago · 0 comments
#260 · Update llama.cpp submodule to latest release b3962 · jan-service-account · closed 3 weeks ago · 0 comments
#259 · fix: warm-up · vansangpfiev · closed 4 weeks ago · 1 comment
#258 · Update llama.cpp submodule to latest release b3950 · jan-service-account · closed 4 weeks ago · 0 comments
#257 · Update llama.cpp submodule to latest release b3943 · jan-service-account · closed 3 weeks ago · 0 comments
#256 · Update llama.cpp submodule to latest release b3943 · jan-service-account · closed 4 weeks ago · 0 comments
#255 · fix: nightly workflow · vansangpfiev · closed 4 weeks ago · 0 comments
#254 · fix: pr sync workflow · vansangpfiev · closed 4 weeks ago · 0 comments
#253 · fix: support chat completion content array type · vansangpfiev · closed 1 month ago · 0 comments
#252 · Update llama.cpp submodule to latest release b3849 · jan-service-account · closed 1 month ago · 0 comments
#251 · chore: return vram usage · namchuai · closed 1 month ago · 1 comment
#250 · Update llama.cpp submodule to latest release b3841 · jan-service-account · closed 1 month ago · 0 comments
#249 · Update llama.cpp submodule to latest release b3829 · jan-service-account · closed 1 month ago · 0 comments
#248 · Update llama.cpp submodule to latest release b3828 · jan-service-account · closed 1 month ago · 0 comments
#247 · fix/pipeline-update model.yml fail · nguyenhoangthuan99 · closed 1 month ago · 0 comments