BlinkDL / RWKV-LM

RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embedding.
Apache License 2.0

License? #112

djstrong closed this issue 1 year ago

djstrong commented 1 year ago

Why do you claim this is under the Apache license? It is trained using, e.g., Alpaca, which is non-commercial... Could you add a section about licensing to the README (preferably with the licenses of the data/models used)?

BlinkDL commented 1 year ago

RWKV-4 is trained 100% on the Pile

djstrong commented 1 year ago

I am referring to (on top of the readme):

Raven 14B (finetuned on Alpaca+ShareGPT+...) Demo: https://huggingface.co/spaces/BlinkDL/ChatRWKV-gradio

Raven 7B (finetuned on Alpaca+ShareGPT+...) Demo: https://huggingface.co/spaces/BlinkDL/Raven-RWKV-7B