dingjingzhen opened this issue 2 years ago
I'd like to ask: does lightseq use fp16 by default for inference internally? The outputs don't look quite right to me. Could you open up the source so that inference can be compiled directly, e.g. installed via pip install -e ., to make it part of the package and easier to debug?
pip install lightseq should give you inference in fp16.
What do you mean by "default fp16"? In your demo I see a comparison with transformers, and transformers runs in fp32. Does that mean the comparison is between lightseq in fp16 and transformers in fp32?
That's right
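For context on why fp16 results can look slightly "off" next to fp32, here is a minimal NumPy sketch (not lightseq-specific; NumPy dtypes merely stand in for the engine's internal precision):

```python
import numpy as np

# fp16 keeps roughly 3 decimal digits of precision, so values computed
# in half precision can visibly diverge from their fp32 counterparts.
sum32 = np.float32(0.1) + np.float32(0.2)  # accumulated in fp32
sum16 = np.float16(0.1) + np.float16(0.2)  # accumulated in fp16

print(sum32)                             # very close to 0.3
print(sum16)                             # rounded more coarsely
print(abs(float(sum32) - float(sum16)))  # the fp16 rounding gap
```

Small per-element gaps like this accumulate over a deep network, which is why fp16 inference output is expected to differ slightly from an fp32 baseline without being wrong.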
You can check docs/inference/build.md to build inference from source.
OK, thanks for your reply. lightseq works well, but this problem really bothered me too. I'll try to build inference from source. Thanks again!
May I ask how to run inference with the fp16 type?