bytedance / lightseq
LightSeq: A High Performance Library for Sequence Processing and Generation
3.22k stars · 329 forks

Issues
- #329 [CUDA][ERROR]: misaligned address (fc20567, closed 2 years ago, 6 comments)
- #328 Running the examples fails with an error on CUDA 11.6 (FeixLiu, opened 2 years ago, 7 comments)
- #327 How to use the lightseq inference engine (lyzKF, opened 2 years ago, 7 comments)
- #326 I have trained a ViT model for classification; how can I speed it up with LightSeq? (henbucuoshanghai, opened 2 years ago, 1 comment)
- #325 Transformer decoder Triton server (dave-rtzr, closed 2 years ago, 1 comment)
- #324 module "lightseq" has no attribute "training" (princelisjtu, opened 2 years ago, 9 comments)
- #323 support trainable positional embedding (aachong, closed 2 years ago, 0 comments)
- #322 GPT2 training example (panyan96, opened 2 years ago, 1 comment)
- #321 Fix test unit (hexisyztem, closed 2 years ago, 0 comments)
- #320 [WIP] New arch (hexisyztem, closed 2 years ago, 0 comments)
- #319 Is this a CUDA out-of-bounds problem? (13354236170, opened 2 years ago, 3 comments)
- #318 Model conversion problem: encode_output_project_kernel_kv (13354236170, opened 2 years ago, 1 comment)
- #317 add export and test for xglm, add extra_decode_length for gpt inference (godweiyang, closed 2 years ago, 0 comments)
- #316 hf bart training and inference (aachong, closed 2 years ago, 0 comments)
- #315 ModuleNotFoundError: No module named 'export' (pkuwudi, opened 2 years ago, 4 comments)
- #314 Compiling lightseq 2.2.0 and running the gpt2-large example raises an error (BucherLi, closed 2 years ago, 3 comments)
- #313 add encTdecT tagging (multilg_type=3) for multilingual translation (xian8, closed 2 years ago, 0 comments)
- #312 The new ViT launch script does not use torch.distributed.launch, so running it raises an error (h2bit, opened 2 years ago, 1 comment)
- #311 from lightseq.training import export_pb2hdf5 (Shijihao, closed 2 years ago, 2 comments)
- #310 How to use the Transformer encoder and decoder separately for inference (13354236170, opened 2 years ago, 1 comment)
- #309 Speed-up CLIP model training with LightSeq (DenisKoposov, closed 2 years ago, 0 comments)
- #308 Confused about CUDA versions: are CUDA 10.2 and CUDA 11.6 supported by lightseq inference? (lilyzlt, opened 2 years ago, 2 comments)
- #307 LightSeq QAT (godweiyang, closed 2 years ago, 0 comments)
- #306 error in compilation: can't import protobuf header file when protobuf is installed in a different location (frankang, opened 2 years ago, 2 comments)
- #305 modify Dockerfile to compile tritonbackend (hexisyztem, closed 2 years ago, 0 comments)
- #304 triton docker image build failed (dave-rtzr, closed 2 years ago, 3 comments)
- #303 publish tritonserver image & update README (hexisyztem, closed 2 years ago, 0 comments)
- #302 a glitch fix in inference (qmpzzpmq, opened 2 years ago, 1 comment)
- #301 Triton backend rebase (hexisyztem, closed 2 years ago, 0 comments)
- #300 Triton backend rebase (hexisyztem, closed 2 years ago, 0 comments)
- #299 support ViT model (zjersey, closed 2 years ago, 0 comments)
- #298 [WIP] support Tritonserver-22.01 (hexisyztem, closed 2 years ago, 0 comments)
- #297 Can lightseq convert translation models? (DidaDidaDidaD, opened 2 years ago, 2 comments)
- #296 Why is inference with the converted HDF5 model slower than Hugging Face? (DidaDidaDidaD, opened 2 years ago, 1 comment)
- #295 How to set a different max_batch_tokens during inference? (frankang, closed 2 years ago, 2 comments)
- #294 Win10: has anyone else succeeded in running the examples? (DidaDidaDidaD, opened 2 years ago, 2 comments)
- #293 A good project, but some directory issues must be fixed before it works (DidaDidaDidaD, opened 2 years ago, 1 comment)
- #292 No module named 'lightseq_layers' (DidaDidaDidaD, opened 2 years ago, 1 comment)
- #291 Is post_ln=1 supported for inference? (melody-rain, opened 2 years ago, 2 comments)
- #290 How to run inference with fp16? (dingjingzhen, opened 2 years ago, 6 comments)
- #289 Please help look into these two problems (kevinmgyu, opened 2 years ago, 0 comments)
- #288 examples/inference/cpp transformer_example requires extremely large memory (kevinmgyu, closed 2 years ago, 2 comments)
- #287 unable to allocate memory in function AllocateCudaBuffers: out of memory (E0331 16:32:37.715184 45 dynamic_batch_scheduler.cc:162) (kevinmgyu, opened 2 years ago, 8 comments)
- #286 Running the ls_bert example fails with TypeError: infer(): incompatible function arguments (zhouyonglong, opened 2 years ago, 2 comments)
- #285 support QAT, export and inference for quantized BERT, GPT2 (godweiyang, closed 2 years ago, 0 comments)
- #284 Wrong output (bino282, opened 2 years ago, 1 comment)
- #283 Support various sampling configurations (repetition_penalty, length_penalty, etc.) (codertimo, opened 2 years ago, 3 comments)
- #282 QuantTransformer inference error (dearchill, closed 2 years ago, 2 comments)
- #281 finalize (zhr01, closed 2 years ago, 0 comments)
- #280 Support MoE Inference (zjersey, closed 2 years ago, 0 comments)