elixir-nx/bumblebee
Pre-trained Neural Network models in Axon (+ 🤗 Models integration)
Apache License 2.0 · 1.26k stars · 90 forks
Issues
#329 · Change image size to maps in image featurizers · jonatanklosko · closed 4 months ago · 0 comments
#328 · Less restrictive unzip dependency · sonic182 · closed 4 months ago · 1 comment
#327 · Downcase label before checking equality · preciz · closed 4 months ago · 0 comments
#326 · Add zero-shot vision · seanmor5 · opened 5 months ago · 1 comment
#325 · Error when using TinyLlama · trickster · closed 4 months ago · 2 comments
#324 · Weird behaviour with progress status · hickscorp · closed 5 months ago · 2 comments
#323 · Featurizer different from Python? · sonic182 · closed 5 months ago · 10 comments
#322 · Pass PRNG key to schedulers in Stable Diffusion · jonatanklosko · closed 6 months ago · 0 comments
#321 · Using image.file_ref · jlxq0 · closed 6 months ago · 0 comments
#320 · Add LCM scheduler · wtedw · closed 6 months ago · 5 comments
#319 · Remove Bumblebee.Utils.Nx.to_list/1 in favour of Nx.to_list/1 · thiagopromano · closed 6 months ago · 0 comments
#318 · Update Stable Diffusion notebook · jonatanklosko · closed 6 months ago · 0 comments
#317 · Reduce memory used by :preallocate_params · jonatanklosko · closed 6 months ago · 0 comments
#316 · Improve slow tests check · jonatanklosko · closed 6 months ago · 0 comments
#315 · `TextEmbedding` crashes when both Mean Pooling and `compile` opts are specified · thiagopromano · closed 6 months ago · 2 comments
#314 · Handle repository redirects and skip authorization header for LFS · jonatanklosko · closed 6 months ago · 0 comments
#313 · Improve the error message when a Hugging Face resource isn't found at the referenced location · meanderingstream · closed 6 months ago · 1 comment
#312 · Cannot use `whisper-*.en` models in Bumblebee · John-Goff · closed 6 months ago · 3 comments
#311 · Add :type option to load model under specific precision · jonatanklosko · closed 6 months ago · 0 comments
#310 · Group all tokenizers under a single module and configure upfront · jonatanklosko · closed 6 months ago · 0 comments
#309 · Streamline loading for params variants · jonatanklosko · closed 6 months ago · 0 comments
#308 · Remove conversational serving · jonatanklosko · closed 6 months ago · 0 comments
#307 · Apply tokenizer truncation before post-processor · jonatanklosko · closed 6 months ago · 0 comments
#306 · Token truncation differs from Transformers implementation · thiagopromano · closed 6 months ago · 0 comments
#305 · Fix loading more recent VaeKl checkpoints · jonatanklosko · closed 6 months ago · 0 comments
#304 · Load special tokens from tokenizer_config.json · jonatanklosko · closed 6 months ago · 0 comments
#303 · Make randomization seed a serving input, rather than a compile option · jonatanklosko · closed 6 months ago · 0 comments
#302 · Return only new text from text generation · jonatanklosko · closed 6 months ago · 0 comments
#301 · Automatically detect diffusers params files · jonatanklosko · closed 6 months ago · 0 comments
#300 · Refactor attention implementation · jonatanklosko · closed 6 months ago · 1 comment
#299 · Fix cache offset casting with low precision policies · jonatanklosko · closed 6 months ago · 0 comments
#298 · Apply cross attention spec unet · robinmonjo · closed 4 months ago · 4 comments
#297 · Rewrite tests to use tiny model checkpoints · jonatanklosko · closed 7 months ago · 0 comments
#296 · Text completion behavior is different when streaming vs not streaming · brainlid · closed 7 months ago · 1 comment
#295 · Allow text completion streaming true/false option on a per-call basis · brainlid · closed 7 months ago · 2 comments
#294 · Add Starcoder · jonatanklosko · closed 7 months ago · 0 comments
#293 · Bumblebee fails to load GPT models on main · ityonemo · closed 7 months ago · 1 comment
#292 · Running `elixir speech_to_text.exs` raises UndefinedFunctionError · bartonhammond · closed 7 months ago · 5 comments
#291 · Add annotations to QKV layers · seanmor5 · opened 7 months ago · 0 comments
#290 · Add temperature to generation options · jonatanklosko · closed 7 months ago · 0 comments
#289 · Use tokenizer_config.json as the primary source of metadata · jonatanklosko · closed 6 months ago · 0 comments
#288 · Halting Nx Serving streams with a stop token · zblanco · closed 7 months ago · 4 comments
#287 · Support returning token count information · brainlid · closed 4 months ago · 9 comments
#286 · Support temperature in generation options · jonatanklosko · closed 7 months ago · 0 comments
#285 · Support more rotary embedding options for Llama · jonatanklosko · closed 7 months ago · 0 comments
#284 · Support non-deterministic output in text generation serving · jonatanklosko · closed 6 months ago · 0 comments
#283 · Compute BLIP image embeddings only once during generation · jonatanklosko · closed 7 months ago · 0 comments
#282 · Transfer serving computation result to binary backend upfront · jonatanklosko · closed 7 months ago · 0 comments
#281 · [in-progress] Add credo to project for better code readability and performance fixes · imahmedismail · closed 7 months ago · 3 comments
#280 · Add op-name to rms norm · seanmor5 · closed 7 months ago · 1 comment