ngxson/wllama
WebAssembly binding for llama.cpp, enabling on-browser LLM inference
https://huggingface.co/spaces/ngxson/wllama
MIT License · 444 stars · 23 forks
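For orientation, a minimal TypeScript usage sketch assembled from API names that recur in the issues below (loadModelFromUrl, createCompletion, exit()); the package name, constructor configuration, and option names are assumptions, not documented facts from this page:

```ts
// Minimal sketch, not an official quick start. Only loadModelFromUrl,
// createCompletion, and exit() are attested by issue titles on this page;
// the package name, config shape, and option names are assumptions.
import { Wllama } from '@wllama/wllama';

async function main(): Promise<void> {
  // Map of wasm asset paths; the exact keys here are an assumption.
  const wllama = new Wllama({
    'single-thread/wllama.wasm': './esm/single-thread/wllama.wasm',
  });

  // Download (and cache) a GGUF model, then run inference in the browser.
  await wllama.loadModelFromUrl('https://example.com/model.gguf'); // hypothetical URL

  const output = await wllama.createCompletion('The capital of France is', {
    nPredict: 16, // option name is an assumption
  });
  console.log(output);

  await wllama.exit(); // see issues #121 and #84 for edge cases around exit()
}

main();
```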
Issues (sorted by: newest)
#132 · sync to latest upstream source code · ngxson · closed 5 days ago · 0 comments
#131 · llm_load_vocab: special_eos_id is not in special_eog_ids - the tokenizer config may be incorrect · flatsiedatsie · closed 3 weeks ago · 2 comments
#130 · Add WllamaError class, fix llama_decode hangs on long input text · ngxson · closed 3 weeks ago · 0 comments
#129 · sync to latest upstream source code · ngxson · closed 1 month ago · 0 comments
#128 · papeg.is ready! · flatsiedatsie · opened 1 month ago · 7 comments
#127 · Differences in template application · flatsiedatsie · opened 1 month ago · 6 comments
#126 · Just a heads up: Wllama crashes on Mobile Chrome DEV · flatsiedatsie · opened 1 month ago · 0 comments
#125 · sync to latest upstream source code · ngxson · closed 1 month ago · 0 comments
#124 · loadModelFromUrl error: SyntaxError: Unexpected token '', "�2�r���f"... is not valid JSON · flatsiedatsie · closed 1 month ago · 3 comments
#123 · Error: Module is already initialized · flatsiedatsie · opened 1 month ago · 0 comments
#122 · How to best use allow_offline? · flatsiedatsie · closed 1 month ago · 1 comment
#121 · calling .exit() has unexpected result: wllamaExit is not a function · flatsiedatsie · opened 1 month ago · 6 comments
#120 · cannot find tokenizer merges in model file · flatsiedatsie · opened 2 months ago · 18 comments
#119 · Update to latest llama.cpp source code · ngxson · closed 2 months ago · 0 comments
#118 · decode/encode : do not fail on empty batch · ngxson · closed 2 months ago · 0 comments
#117 · table_index is out of bounds · flatsiedatsie · opened 2 months ago · 2 comments
#116 · RangeError: Array buffer allocation failed · flatsiedatsie · opened 2 months ago · 0 comments
#115 · Firefox: Error in input stream · flatsiedatsie · opened 2 months ago · 4 comments
#114 · [bug]: llama_decode error when send the same prompt twice · cbh778899 · closed 2 months ago · 1 comment
#113 · v1.16.1 · ngxson · closed 2 months ago · 0 comments
#112 · Feature: list the available local models from the cache · synw · opened 2 months ago · 2 comments
#111 · Unable to import in React.js · cbh778899 · closed 2 months ago · 3 comments
#110 · Bug: `createCompletion` stuck when it runs out of context · ngxson · opened 3 months ago · 0 comments
#109 · ability to use custom cacheManager · ngxson · closed 3 months ago · 0 comments
#108 · [Feature Request] Allow setting our own Cache Manager · felladrin · closed 3 months ago · 1 comment
#107 · What does `noTEE` do? · flatsiedatsie · closed 3 months ago · 2 comments
#106 · Phi-3: error loading model hyperparameters · flatsiedatsie · closed 3 months ago · 5 comments
#105 · [Feature request] LoRA support · OKUA1 · opened 3 months ago · 2 comments
#104 · v1.15.0 · ngxson · closed 3 months ago · 0 comments
#103 · implement KV cache reuse · ngxson · closed 3 months ago · 0 comments
#102 · Improve main UI example · ngxson · closed 3 months ago · 0 comments
#101 · implement KV cache reuse for completion · ngxson · closed 3 months ago · 0 comments
#100 · fix log print and `downloadModel` · ngxson · closed 3 months ago · 0 comments
#99 · Add `main` example (chat UI) · ngxson · closed 3 months ago · 0 comments
#98 · Add prettier · ngxson · closed 3 months ago · 0 comments
#97 · ci: add e2e test · ngxson · opened 3 months ago · 0 comments
#96 · main: initialize main example · ngxson · closed 3 months ago · 2 comments
#95 · Add `downloadModel` function · ngxson · closed 3 months ago · 0 comments
#94 · v1.14.2 · ngxson · closed 4 months ago · 0 comments
#93 · v1.14.1, update to latest upstream source code · ngxson · closed 4 months ago · 0 comments
#92 · v1.14.0 · ngxson · closed 4 months ago · 0 comments
#91 · Support llama_encode (WIP) · ngxson · closed 4 months ago · 4 comments
#90 · save ETag metadata, add allowOffline · ngxson · closed 4 months ago · 1 comment
#89 · Add support for control vectors · ngxson · opened 4 months ago · 0 comments
#88 · Force use of the cache if there is no internet connection · flatsiedatsie · closed 4 months ago · 6 comments
#87 · Model caching with new download manager? · flatsiedatsie · closed 4 months ago · 1 comment
#86 · T5 and Flan-T5 models support (llama_encode) · felladrin · closed 4 months ago · 1 comment
#85 · v1.13.0 · ngxson · closed 4 months ago · 0 comments
#84 · Fix exit() function crash if model is not loaded · flatsiedatsie · closed 4 months ago · 1 comment
#83 · Add support for `AbortController` on downloading model · flatsiedatsie · opened 4 months ago · 1 comment