Closed · simplexityware closed 2 weeks ago
url = "https://api.groq.com/openai/v1/models" — that is the correct URL if you are trying to list models.
Can you provide more information about what you are trying to do? The base url is for converting applications built with the OpenAI SDK to groq.
For example, you can pass it to the OpenAI SDK as an environment variable:
OPENAI_BASE_URL="https://api.groq.com/openai/v1/"
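To be clear, the base URL is a prefix that the SDK joins with an endpoint path such as `models` or `chat/completions`; the bare base URL is not itself an endpoint. A minimal stdlib sketch of how those full URLs are formed (the endpoint names are the standard OpenAI-compatible routes):

```python
from urllib.parse import urljoin

# The base URL is a prefix, not an endpoint; the SDK appends a route to it.
# The trailing slash matters: urljoin drops the last path segment without it.
BASE_URL = "https://api.groq.com/openai/v1/"

models_url = urljoin(BASE_URL, "models")            # list available models
chat_url = urljoin(BASE_URL, "chat/completions")    # chat completions endpoint

print(models_url)  # https://api.groq.com/openai/v1/models
```

Requesting `BASE_URL` on its own will not resolve to a route, which is why only the joined URLs return a response.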
I am trying to replace the Claude API with the Groq/Llama 3 API in the custom LM client described here: https://dspy-docs.vercel.app/docs/deep-dive/language_model_clients/custom-lm-client
We are OpenAI compatible, but not anthropic compatible. I was able to get a client initialized as follows:
import dspy
import os

client = dspy.OpenAI(
    model="llama3-8b-8192",
    api_base="https://api.groq.com/openai/v1/",
    api_key=os.getenv("GROQ_API_KEY"),
    model_type="chat",
)
print(client.basic_request(prompt="hello")["choices"][0]["message"]["content"])
Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat?
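Under the hood, a call like `basic_request` above sends an OpenAI-style chat-completions request. A minimal sketch of the JSON body such a request carries (field names follow the standard OpenAI chat-completions format; this is an illustration, not Groq-specific documentation):

```python
import json

# OpenAI-compatible chat-completions request body: a model name plus a
# list of role/content message dicts.
payload = {
    "model": "llama3-8b-8192",
    "messages": [{"role": "user", "content": "hello"}],
}

# Serialized as JSON for the POST to <base_url>/chat/completions.
body = json.dumps(payload)
print(body)
```

Because the body is plain JSON in this shared format, pointing an OpenAI SDK at the Groq base URL is all the conversion that is needed.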
I see. I was using the custom LM client pattern recommended by DSPy. I will try your method soon and let you know. Thank you a ton!
Thank you. I have repeated the above successfully.
What model name should I use above for llama-3.1-405b?
Sorry, but llama-3.1-405b is not currently available: https://console.groq.com/docs/models#llama-31-405b
Hi,
Is the following base URL still valid? I am getting:
HTTPError: 404 Client Error: Not Found for url: https://api.groq.com/openai/v1/
Thank you.
Simplexity