abelriboulot / onnxt5
Summarization, translation, sentiment-analysis, text-generation and more at blazing speed using a T5 version implemented in ONNX.
Apache License 2.0 · 252 stars · 30 forks
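For readers unfamiliar with the library, the usage pattern that several of the issues below refer to (for example the GenerativeT5 class in #13) looks roughly like the sketch here. This is a minimal example based on the project's quick-start style; the helper get_encoder_decoder_tokenizer and the exact GenerativeT5 call signature are assumptions and may differ between versions.

    # Minimal onnxt5 generation sketch (assumed API; verify against the repo's README)
    from onnxt5 import GenerativeT5
    from onnxt5.api import get_encoder_decoder_tokenizer

    # Load the pre-exported ONNX encoder/decoder sessions and the T5 tokenizer
    decoder_sess, encoder_sess, tokenizer = get_encoder_decoder_tokenizer()

    # Wrap the ONNX sessions in the GenerativeT5 helper (the class discussed in issue #13)
    generative_t5 = GenerativeT5(encoder_sess, decoder_sess, tokenizer, onnx=True)

    # Any T5 task prefix works: summarization, translation, and so on
    prompt = 'translate English to French: I was a victim of a series of accidents.'
    output_text, output_logits = generative_t5(prompt, max_length=100, temperature=0.)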
Issues
#22 Can this be used with Flan-T5? (gianlucascoccia, opened 1 year ago, 0 comments)
#21 CVE-2007-4559 Patch (TrellixVulnTeam, opened 2 years ago, 0 comments)
#20 Add dtype to new_tokens tensor to avoid an error when decoding (jambran, opened 2 years ago, 0 comments)
#19 Repeat variables assignment both (anapple00, opened 2 years ago, 0 comments)
#18 cpu only inferencing (seekingdeep, closed 3 years ago, 0 comments)
#17 Running example "export_pretrained_model.py" as-is fails (See details) (PrithivirajDamodaran, opened 3 years ago, 3 comments)
#16 Inference time on gpu vs onnxt5-gpu (priyanksonis, closed 3 years ago, 1 comment)
#15 How to suppress output (127, opened 3 years ago, 0 comments)
#14 quantized models (marsupialtail, opened 3 years ago, 1 comment)
#13 fixed bug in GenerativeT5 class (Ki6an, closed 3 years ago, 0 comments)
#12 Can this model suitable for multilingual-t5 accelerate? (williamwong91, opened 3 years ago, 2 comments)
#11 Use OnnxRuntime IO Binding to improve GPU inference performance (tianleiwu, opened 3 years ago, 3 comments)
#10 int() argument must be a string, when running example (AZE38, closed 3 years ago, 3 comments)
#9 Cosine similarity between embeddings (ankitkr3, closed 3 years ago, 7 comments)
#8 Limit input ingestion to context length of the model (abelriboulot, closed 4 years ago, 0 comments)
#7 Default T5 summary contains <extra_id_2>.<extra_id_3>.<extra_id_4> (vladislavkoz, closed 4 years ago, 5 comments)
#6 Given model could not be parsed while creating inference session. Error message: Protobuf parsing failed. (vladislavkoz, closed 4 years ago, 6 comments)
#5 Add progress bar (brymck, closed 4 years ago, 0 comments)
#4 Add download progress bar (brymck, closed 4 years ago, 0 comments)
#3 Build a progress bar for the download of the initial files of the model (abelriboulot, closed 4 years ago, 0 comments)
#2 Implement beam search (abelriboulot, opened 4 years ago, 2 comments)
#1 Fix packages (brymck, closed 4 years ago, 0 comments)