EricFillion / happy-transformer

Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
http://happytransformer.com
Apache License 2.0
517 stars 66 forks source link

Can i use t5 model to fine tune on text-to-code generation? #315

Open mdnihal29 opened 1 year ago

mdnihal29 commented 1 year ago

I want to train a model that generates code from natural-language text as input. I need some guidance on this, and also information on how the dataset should be formatted.

Thanks!

Sukii commented 1 year ago

I'm not an expert and have never used it for generating code, but you can try training Happy Transformer with flan-t5 to see if you can fine-tune it for that task. You can probably create a new prefix like "code:" just like "grammar:", but you need a training dataset to start with.
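A minimal sketch of what such a dataset could look like, assuming Happy Transformer's text-to-text trainer (`HappyTextToText.train`) accepts a CSV with "input" and "target" columns, as it does for its grammar-correction example. The "code:" prefix and the two training pairs below are hypothetical illustrations, not from any real dataset:

```python
import csv

# Hypothetical examples: natural-language description -> code.
# The "code:" prefix mirrors the "grammar:" prefix used for T5 grammar tasks.
examples = [
    ("code: return the sum of two numbers a and b",
     "def add(a, b):\n    return a + b"),
    ("code: reverse a string s",
     "def reverse(s):\n    return s[::-1]"),
]

with open("train.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    # Column names Happy Transformer's text-to-text trainer expects.
    writer.writerow(["input", "target"])
    writer.writerows(examples)

# Training would then look roughly like this (not run here; model name
# and hyperparameters are placeholders):
# from happytransformer import HappyTextToText, TTTrainArgs
# happy_tt = HappyTextToText("T5", "google/flan-t5-base")
# happy_tt.train("train.csv", args=TTTrainArgs(batch_size=4, num_train_epochs=3))
```

In practice you would want thousands of such pairs; two rows only show the column layout.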

Suki

