A recipe that walks you through using either Meta Llama 3.1 405B or GPT-4o deployed on Azure AI to generate a synthetic dataset with the RAFT method from UC Berkeley's Gorilla project.
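
To illustrate the overall flow, here is a minimal sketch of a RAFT-style generation loop against a model deployed on Azure AI. It assumes an OpenAI-compatible chat completions endpoint; the environment variable names, deployment name, and helper functions are placeholders, not the recipe's actual code.

```python
# Minimal RAFT-style synthetic data generation sketch.
# Assumes an OpenAI-compatible chat endpoint for a model deployed on Azure AI
# (e.g. a GPT-4o or Llama 3.1 405B deployment); the endpoint URL, API key
# variable, and model name below are placeholders.
import os
import json
import random

from openai import OpenAI

client = OpenAI(
    base_url=os.environ["AZURE_AI_ENDPOINT"],  # placeholder endpoint
    api_key=os.environ["AZURE_AI_API_KEY"],    # placeholder key
)
MODEL = "gpt-4o"  # or the Llama 3.1 405B deployment name


def generate_questions(chunk: str, n: int = 3) -> list[str]:
    """Ask the model for n questions answerable from the given chunk."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": "Generate questions answerable using only the provided context. "
                        "Return one question per line."},
            {"role": "user", "content": f"Context:\n{chunk}\n\nGenerate {n} questions."},
        ],
    )
    return [q.strip() for q in resp.choices[0].message.content.splitlines() if q.strip()]


def generate_cot_answer(question: str, oracle: str) -> str:
    """Ask the model for a reasoning-style answer grounded in the oracle chunk."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system",
             "content": "Answer the question using the context. Explain your reasoning, "
                        "quoting the context, then state the final answer."},
            {"role": "user", "content": f"Context:\n{oracle}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content


def build_raft_dataset(chunks: list[str], num_distractors: int = 3) -> list[dict]:
    """For each chunk, create (question, context with distractors, answer) records."""
    records = []
    for i, oracle in enumerate(chunks):
        distractor_pool = chunks[:i] + chunks[i + 1:]
        for question in generate_questions(oracle):
            distractors = random.sample(distractor_pool,
                                        min(num_distractors, len(distractor_pool)))
            context = distractors + [oracle]
            random.shuffle(context)  # mix the oracle chunk in with the distractors
            records.append({
                "question": question,
                "context": context,
                "oracle_context": oracle,
                "cot_answer": generate_cot_answer(question, oracle),
            })
    return records


if __name__ == "__main__":
    chunks = ["...document chunk 1...", "...document chunk 2...", "...document chunk 3..."]
    with open("raft_dataset.jsonl", "w") as f:
        for record in build_raft_dataset(chunks):
            f.write(json.dumps(record) + "\n")
```

The resulting JSONL records (question, mixed oracle/distractor context, grounded answer) can then be used to fine-tune a smaller model, which is the core idea behind RAFT.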