Open EnzoAndree opened 3 months ago
Problem Description: We cannot use the new Llama 3.1 models via the Groq API.
Proposed Solution: Add the models as options, and perhaps allow a custom model name as well.
Additional Context:
- Llama 3.1 405B (Preview) — Model ID: `llama-3.1-405b-reasoning`, Developer: Meta, Context Window: 131,072 tokens
- Llama 3.1 70B (Preview) — Model ID: `llama-3.1-70b-versatile`, Developer: Meta, Context Window: 131,072 tokens
- Llama 3.1 8B (Preview) — Model ID: `llama-3.1-8b-instant`, Developer: Meta, Context Window: 131,072 tokens
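To make the request concrete, here is a minimal sketch of how a configurable model name could work against Groq's OpenAI-compatible chat-completions endpoint. The model IDs come from this issue; the endpoint URL and payload shape follow the OpenAI chat-completions convention that Groq exposes. The helper name `build_request` and the `LLAMA31_MODELS` mapping are illustrative, not from any existing client.

```python
import json

# Assumption: Groq serves an OpenAI-compatible chat-completions endpoint.
GROQ_ENDPOINT = "https://api.groq.com/openai/v1/chat/completions"

# Model IDs as listed in this issue (all preview models from Meta).
LLAMA31_MODELS = {
    "405b": "llama-3.1-405b-reasoning",
    "70b": "llama-3.1-70b-versatile",
    "8b": "llama-3.1-8b-instant",
}

def build_request(prompt: str, model: str = LLAMA31_MODELS["70b"]) -> dict:
    """Build a chat-completion payload.

    `model` accepts any string, so a custom model name (the second part
    of the proposed solution) is just a matter of passing it through.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Example: select the 8B model explicitly.
payload = build_request("Hello", model=LLAMA31_MODELS["8b"])
print(json.dumps(payload))
```

Sending the payload would then be an ordinary authenticated POST to `GROQ_ENDPOINT` with the user's Groq API key; no model-specific client changes are needed beyond not rejecting unknown model strings.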
Same feature request!
ME TOO!
+1 Hope to customize the model name
Please add this feature!