erfanzar / EasyDeL

Accelerate your training with this open-source library. Optimize performance with streamlined training and serving options in JAX. 🚀
https://easydel.readthedocs.io/en/latest/
Apache License 2.0

How to run llama in the examples #47

Closed zigzagcai closed 6 months ago

zigzagcai commented 7 months ago

Hi, I am trying to use EasyDel to democratize LLaMA, but I cannot find a guide on how to run this model. Could you please give me some hints about the EasyDel LLaMA launch script, such as repo_id and dataset_name? Thanks!

erfanzar commented 7 months ago

Yes, for sure:

Llama Models Serving Documents

And in case you want to pre-train or fine-tune a model, or maybe use RLHF, I'm creating a tutorial for that right now.
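For a rough idea of what serving a LLaMA checkpoint in JAX looks like, here is a minimal sketch. It uses the Flax LLaMA port in Hugging Face transformers rather than EasyDel's own serving stack (whose API may differ), and the checkpoint name is purely illustrative:

```python
# Minimal sketch: greedy generation with a LLaMA checkpoint in JAX.
# Assumptions: this uses transformers' Flax LLaMA port, not EasyDel's server,
# and "openlm-research/open_llama_3b" is just an example open checkpoint.
from transformers import AutoTokenizer, FlaxLlamaForCausalLM

tokenizer = AutoTokenizer.from_pretrained("openlm-research/open_llama_3b")
model = FlaxLlamaForCausalLM.from_pretrained(
    "openlm-research/open_llama_3b",
    from_pt=True,  # convert the PyTorch weights to Flax on load
)

inputs = tokenizer("The capital of France is", return_tensors="np")
outputs = model.generate(inputs["input_ids"], max_new_tokens=32, params=model.params)
print(tokenizer.decode(outputs.sequences[0], skip_special_tokens=True))
```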

erfanzar commented 7 months ago

here you go

Training Example
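To make the linked example more concrete, here is a minimal, generic fine-tuning step in JAX/optax. This is not the EasyDel Trainer's API (the Trainer wraps a loop like this for you); it assumes `model` is a Flax causal LM such as the one loaded in the serving sketch above, and that batches are dicts with an "input_ids" array:

```python
import jax
import optax

# Illustrative only: a bare-bones causal-LM training step in JAX/optax.
optimizer = optax.adamw(learning_rate=2e-5)
opt_state = optimizer.init(model.params)

def loss_fn(params, batch):
    # Causal LM objective: predict token t+1 from tokens up to t.
    logits = model(input_ids=batch["input_ids"], params=params).logits
    return optax.softmax_cross_entropy_with_integer_labels(
        logits[:, :-1, :], batch["input_ids"][:, 1:]
    ).mean()

@jax.jit
def train_step(params, opt_state, batch):
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    return optax.apply_updates(params, updates), opt_state, loss
```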

zigzagcai commented 7 months ago

> here you go
>
> Training Example

Thanks! Just another question about the dataset. I see in the docs `dataset_train = load_dataset('REPO_ID_PATH_TO_DATASET')`, but could you share an example of such a REPO_ID_PATH_TO_DATASET?

erfanzar commented 7 months ago

Yes, here you can see an example of how to preprocess data for the EasyDel Trainer, but you need to install the library from Git, since I added this just now and it's not yet available when you install the package via pip install EasyDel:

pip install git+https://github.com/erfanzar/EasyDeL.git
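And to make the REPO_ID_PATH_TO_DATASET question concrete: any dataset repo id on the Hugging Face Hub works there. A minimal sketch, using the public tatsu-lab/alpaca dataset purely as an example and assuming a tokenizer is already loaded (field names depend on the dataset you pick):

```python
from datasets import load_dataset

# "tatsu-lab/alpaca" is just an example public instruction-tuning dataset;
# substitute any Hub repo id (or a local path) for REPO_ID_PATH_TO_DATASET.
dataset_train = load_dataset("tatsu-lab/alpaca", split="train")
print(dataset_train[0])  # fields: 'instruction', 'input', 'output', 'text'

def tokenize(batch):
    # Truncate to a fixed maximum length; adjust to your model's context size.
    return tokenizer(batch["text"], truncation=True, max_length=2048)

dataset_train = dataset_train.map(
    tokenize, batched=True, remove_columns=dataset_train.column_names
)
```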
zigzagcai commented 7 months ago

I am sorry, I still have some confusion. Since the readme indicates that flash_attn is implemented, I want to know how flash attention is implemented for LLaMA. I know there is a JAX implementation of flash attention called flash-attention-jax, but I cannot find any code that imports flash_attn or implements it.

erfanzar commented 7 months ago

Actually, in some parts EasyDel is only a front-end; the underlying computation lives in its backend.

Many of its operations are processed in FJFormer,

and here's EfficientAttention, which I guess is what you're looking for.
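As a rough illustration of the idea behind such memory-efficient ("flash"-style) attention, here is a toy JAX sketch that processes keys and values in chunks with a running (online) softmax, so the full seq x seq attention matrix is never materialized. This is not FJFormer's EfficientAttention, just the underlying technique:

```python
import jax
import jax.numpy as jnp

def chunked_attention(q, k, v, chunk_size=128):
    """Single-head attention over k/v chunks with an online softmax.

    q, k, v: (seq_len, head_dim); assumes seq_len is a multiple of chunk_size.
    """
    seq, dim = q.shape
    q = q * dim ** -0.5

    def scan_fn(carry, kv_chunk):
        out, row_max, row_sum = carry
        k_c, v_c = kv_chunk
        scores = q @ k_c.T                          # (seq, chunk)
        new_max = jnp.maximum(row_max, scores.max(axis=-1))
        correction = jnp.exp(row_max - new_max)     # rescale earlier chunks
        p = jnp.exp(scores - new_max[:, None])      # unnormalized weights
        out = out * correction[:, None] + p @ v_c
        row_sum = row_sum * correction + p.sum(axis=-1)
        return (out, new_max, row_sum), None

    init = (
        jnp.zeros((seq, dim)),       # running weighted sum of values
        jnp.full((seq,), -jnp.inf),  # running row-wise max of scores
        jnp.zeros((seq,)),           # running softmax denominator
    )
    kv = (k.reshape(-1, chunk_size, dim), v.reshape(-1, chunk_size, dim))
    (out, _, row_sum), _ = jax.lax.scan(scan_fn, init, kv)
    return out / row_sum[:, None]
```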

zigzagcai commented 7 months ago

I got it. Thanks @erfanzar! You did awesome work!