john-adeojo / custom_websearch_agent

Custom Websearch Agent Built with Local Models, vLLM, and OpenAI

Groq API Integration #3

Open beezisback opened 3 months ago

beezisback commented 3 months ago

Can you also integrate the Groq API? A request example could be:

    import json
    import requests

    # GROQ_API_KEY and API_URL are assumed to be defined elsewhere
    # (API_URL being the Groq chat completions endpoint).
    def interact_with_groq_api(message_content):
        headers = {
            'Content-Type': 'application/json',
            'Authorization': f'Bearer {GROQ_API_KEY}'
        }

        data = {
            'messages': [
                {
                    'role': 'user',
                    'content': message_content
                }
            ],
            'model': 'llama3-8b-8192',
            'temperature': 1,
            'max_tokens': 1024,
            'top_p': 1,
            'stream': False,
            'stop': None
        }

        response = requests.post(API_URL, headers=headers, data=json.dumps(data))

        if response.status_code == 200:
            return response.json()
        else:
            return {
                'error': f"Request failed with status code {response.status_code}",
                'details': response.text
            }

And an answer looks like this:

    Enter your message: test
    {
      "id": "chatcmpl-3d79b5fe-9728-4c80-a26a-b6a0b539cce4",
      "object": "chat.completion",
      "created": 1716974410,
      "model": "llama3-8b-8192",
      "choices": [
        {
          "index": 0,
          "message": {
            "role": "assistant",
            "content": "It looks like you're trying to test something! Is there something specific I can help you with or should I just say \"Hello!\""
          },
          "logprobs": null,
          "finish_reason": "stop"
        }
      ],
      "usage": {
        "prompt_tokens": 11,
        "prompt_time": 0.003151141,
        "completion_tokens": 27,
        "completion_time": 0.021647777,
        "total_tokens": 38,
        "total_time": 0.024798918
      },
      "system_fingerprint": "fp_af05557ca2",
      "x_groq": {
        "id": "req_01hz1tcp5kf3fr5zpprzyza69c"
      }
    }
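
Since Groq also exposes an OpenAI-compatible chat completions endpoint, another possible way to wire this into the project's existing OpenAI client is sketched below. The `base_url`, the model name, and the `get_groq_response` helper are assumptions for illustration, not part of this repo:

    from openai import OpenAI

    # Hypothetical helper (not part of this repo): points the standard OpenAI
    # client at Groq's OpenAI-compatible endpoint (assumed base_url) and returns
    # just the assistant's reply text.
    def get_groq_response(message_content: str, groq_api_key: str) -> str:
        client = OpenAI(
            api_key=groq_api_key,
            base_url="https://api.groq.com/openai/v1",  # assumed Groq endpoint
        )
        completion = client.chat.completions.create(
            model="llama3-8b-8192",
            messages=[{"role": "user", "content": message_content}],
            temperature=1,
            max_tokens=1024,
            top_p=1,
            stream=False,
        )
        # The reply text sits in choices[0].message.content,
        # matching the JSON response shown above.
        return completion.choices[0].message.content

Calling `get_groq_response("test", GROQ_API_KEY)` should return the same kind of assistant message as the raw requests example, just without the manual header and JSON handling.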