Tencent / TurboTransformers

a fast and user-friendly runtime for transformer inference (Bert, Albert, GPT2, Decoders, etc) on CPU and GPU.

Does it support a Transformer model I build myself? #248

Open wapping opened 3 years ago

wapping commented 3 years ago

Does TurboTransformers only support the models listed in the README, i.e. the following? If I build my own transformer model with PyTorch, can TurboTransformers accelerate it?

- BERT [Python] [C++]
- ALBERT [Python]
- Roberta [Python]
- Transformer Decoder [Python]
- GPT2 [Python]

feifeibear commented 3 years ago

How do you define a "transformer you build yourself"? As long as the model is composed of transformer blocks, you can reuse the transformer interfaces we have already written: https://github.com/Tencent/TurboTransformers/blob/master/turbo_transformers/python/tests/bert_layer_test.py. A BERT layer is itself one transformer block.
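
For reference, a minimal sketch of the pattern that bert_layer_test.py demonstrates: converting a single HuggingFace PyTorch `BertLayer` into a TurboTransformers layer with `from_torch` and calling it on the same inputs. The `BertLayer` import path and the exact call signature of the converted layer are assumptions here and may differ by version; check the test file for the authoritative usage.

```python
# A minimal sketch, assuming the turbo_transformers.BertLayer.from_torch
# conversion used in bert_layer_test.py; import paths and the converted
# layer's call signature should be verified against that test file.
import torch
import transformers
import turbo_transformers
from transformers.models.bert.modeling_bert import BertLayer  # path varies across transformers versions

cfg = transformers.BertConfig()
torch_layer = BertLayer(cfg)   # one standard PyTorch transformer (BERT) block
torch_layer.eval()

# Reuse the PyTorch layer's weights inside a TurboTransformers layer.
turbo_layer = turbo_transformers.BertLayer.from_torch(torch_layer)

batch_size, seq_len = 2, 40
hidden_states = torch.rand(batch_size, seq_len, cfg.hidden_size)

# Extended attention mask in the usual BERT convention:
# 0.0 for positions to attend to, -10000.0 for masked positions.
mask = torch.ones(batch_size, seq_len)
extended_mask = (1.0 - mask[:, None, None, :]) * -10000.0

# Call the accelerated layer in place of the original layer's forward pass.
turbo_output = turbo_layer(hidden_states, extended_mask)
```

The idea is that a custom PyTorch model built from standard BERT-style blocks can be accelerated block by block: each transformer layer is swapped for its `from_torch`-converted TurboTransformers counterpart while the rest of the model stays unchanged.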