Open MorphSeur opened 3 weeks ago
I think this can be handled by defining a custom API endpoint URL with an API key. The Together API endpoint is OpenAI-compatible, so it should not cause any problems. The same goes for the Groq API.
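The suggestion above can be sketched with nothing but the standard library: build an OpenAI-style chat-completion request pointed at Together's endpoint. This is a minimal, hedged sketch — the base URL `https://api.together.xyz/v1`, the model slug, and the `TOGETHER_API_KEY` environment variable are assumptions to verify against Together's own docs:

```python
import json
import os
import urllib.request

def build_chat_request(model, messages, api_key):
    """Build (but do not send) a chat-completion request against
    Together's OpenAI-compatible endpoint (URL assumed)."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        "https://api.together.xyz/v1/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "meta-llama/Llama-3-70b-chat-hf",  # assumed slug; check Together's model list first
    [{"role": "user", "content": "Hello"}],
    os.environ.get("TOGETHER_API_KEY", "sk-placeholder"),
)
# Actually sending it would be: urllib.request.urlopen(req)
```

Any OpenAI-compatible client (the `openai` package, LiteLLM) should be able to target the same base URL instead of hand-rolling requests like this.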
Hello,
Thanks for your reply!
Can you please provide a Python code snippet so I can try what you have proposed?
The problem is that even with the current documentation, I got the following error for CodeLlama-34b, which is listed in the LiteLLM documentation:
NotFoundError: Together_aiException - Error code: 404 - {'error': {'message': 'Unable to access model togethercomputer/CodeLlama-34b. Please visit https://api.together.xyz to see the list
of supported models or contact the owner to request access.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
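The 404 suggests the `togethercomputer/CodeLlama-34b` slug is stale or inaccessible to this key. As the error message itself advises, one way to find a valid slug is to query the model list; a sketch below assumes an OpenAI-compatible `/v1/models` route on the same base URL:

```python
import json
import urllib.request

def build_models_request(api_key):
    """Build (but do not send) a GET request for the model list on
    Together's OpenAI-compatible endpoint (path assumed)."""
    return urllib.request.Request(
        "https://api.together.xyz/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def list_models(api_key):
    """Send the request and return the parsed JSON (requires a valid key)."""
    with urllib.request.urlopen(build_models_request(api_key)) as resp:
        return json.load(resp)

req = build_models_request("sk-placeholder")
# With a real key: print(list_models(os.environ["TOGETHER_API_KEY"]))
```

The exact shape of the response body is not guaranteed here; inspect the parsed JSON for the current model identifiers and use one of those in place of the failing slug.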
Is your feature request related to a problem? Please describe.
Hello!
The issue is related to the use of Together AI models, such as CodeLlama-34b and Llama-3-70b-chat-hf. Even though CodeLlama-34b exists in this LiteLLM documentation, I got the following issue:

Here is the issue related to Llama-3-70b-chat-hf:

My question: is Llama-3-70b-chat-hf supported in Open Interpreter? If yes, can you please provide me with its Function Call?

Describe the solution you'd like
Integrate the latest Llama models on Together AI into Open Interpreter.

Describe alternatives you've considered
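A hedged sketch of how the first comment's suggestion could look with Open Interpreter's Python API. The `interpreter.llm.model` / `api_base` / `api_key` settings are taken from Open Interpreter's documentation as I recall it, and the model slug is an assumption to verify against Together's model list — treat this as a configuration sketch, not a confirmed recipe:

```python
import os
from interpreter import interpreter  # requires the open-interpreter package

# Route Open Interpreter through Together's OpenAI-compatible endpoint.
# The "openai/" prefix tells the underlying LiteLLM layer to treat
# api_base as an OpenAI-compatible server; the slug after it is assumed.
interpreter.llm.model = "openai/meta-llama/Llama-3-70b-chat-hf"
interpreter.llm.api_base = "https://api.together.xyz/v1"
interpreter.llm.api_key = os.environ["TOGETHER_API_KEY"]

# Then chat as usual, e.g.:
# interpreter.chat("Write a hello-world script.")
```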
No response
Additional context
No response