tazztone opened this issue 2 months ago
It already supports the OpenAI cloud. Just change the API URL and token key.
OpenAI setup example:
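For instance, pointing the extension at OpenAI's chat completions endpoint amounts to a request like this (a minimal sketch; the model name `gpt-4o-mini` and `$OPENAI_API_KEY` are placeholders, substitute your own):

```sh
# minimal OpenAI cloud test; same OpenAI-style request shape the
# extension sends to LM Studio / Ollama, just with a cloud URL and key
curl https://api.openai.com/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [
          {"role": "system", "content": "You are a text prompt enhancer for AI Image generation."},
          {"role": "user", "content": "beautiful otherworldly place"}
        ],
        "max_tokens": 150
      }'
```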
✅ successfully tested with OpenAI gpt-4o-mini; ✅ Groq also worked (even though I have no access in my region; somehow I still got an API key, but I forgot how exactly I requested it :D):
❌ Google Gemini failed... it seems to need an OAuth token:
PS: a dropdown menu to switch between LM Studio, Ollama, OpenAI, Groq, etc., with the API keys and URLs saved per entry, would also be nice. (I know it can already be done via the export/import config feature.)
Let me try: https://console.groq.com/docs/quickstart
I checked that API request page and didn't see any OAuth setting.
The dropdown menu you asked for is on the way.
Good morning. Just to clarify: Groq worked fine; it was Gemini that didn't work. Maybe it's my Google settings, I don't know. But I think the Groq models are better than Gemini anyway; llama-3.2-90b-chat-preview is new as of yesterday.
Okay. So the Google API call fails, right? I'll check according to https://ai.google.dev/gemini-api/docs/api-key?hl=zh-tw
```sh
curl \
  -H 'Content-Type: application/json' \
  -d '{"contents":[{"parts":[{"text":"Explain how AI works"}]}]}' \
  -X POST 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash-latest:generateContent?key=YOUR_API_KEY'
```
It seems to use a different format from OpenAI / LM Studio / Ollama; I'll try calling Gemini later.
Maybe it's just some OAuth authentication token issue.
Dear @tazztone, it's not an OAuth issue on your side; Google just uses a different request format than OpenAI does. So I rewrote that part. It now supports Google gemini-text and gemini-vision.
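For reference, the vision call uses the same `contents`/`parts` structure as the text example above rather than OpenAI's `messages` array; a hedged sketch, with `<BASE64_IMAGE>` standing in for the base64-encoded image bytes:

```sh
# sketch of a gemini-vision request against the same v1beta
# generateContent endpoint; the image goes in as inline_data
curl \
  -H 'Content-Type: application/json' \
  -d '{
        "contents": [{
          "parts": [
            {"text": "Describe this image for an image-generation prompt"},
            {"inline_data": {"mime_type": "image/png", "data": "<BASE64_IMAGE>"}}
          ]
        }]
      }' \
  -X POST 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash-latest:generateContent?key=YOUR_API_KEY'
```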
Well, Gemini seems a bit overly censored :D
But anyway, I tried to set up Groq again... did your update maybe break Groq support?
```
WARNING:[auto-llm]:[][AutoLLM][getReq][Json]{'model': 'llama-3.1-70b-versatile', 'messages': [{'role': 'system', 'content': 'You are a text prompt enhancer for AI Image generation.\n'}, {'role': 'user', 'content': 'beautiful otherwordly place'}], 'max_tokens': 150, 'temperature': 0.5, 'top_p': 0.9, 'top_k': 8, 'stream': False}
WARNING:[auto-llm]:[][AutoLLM][getReq][Header]{'Content-Type': 'application/json', 'Authorization': 'Bearer APIKEY'}
WARNING:[auto-llm]:[Auto-LLM][][]Req URL=> https://api.groq.com/openai/v1/chat/completions
WARNING:[auto-llm]:[Auto-LLM][][]Server Ans=> {"error":{"message":"property 'top_k' is unsupported, did you mean 'top_p'?","type":"invalid_request_error"}}
WARNING:[auto-llm]:[Auto-LLM][][]Missing LLM Server?'choices'
```
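The error in that log is the server rejecting `top_k`: strictly OpenAI-compatible endpoints (Groq, OpenAI itself) only accept parameters the OpenAI chat completions API defines, and `top_k` isn't one of them, so it has to be dropped from the request body rather than renamed. A sketch of the same request without it:

```sh
# same request as the log above, with 'top_k' removed so the
# OpenAI-compatible Groq endpoint accepts it
curl https://api.groq.com/openai/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -d '{
        "model": "llama-3.1-70b-versatile",
        "messages": [
          {"role": "system", "content": "You are a text prompt enhancer for AI Image generation."},
          {"role": "user", "content": "beautiful otherworldly place"}
        ],
        "max_tokens": 150,
        "temperature": 0.5,
        "top_p": 0.9,
        "stream": false
      }'
```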
Groq is amazing and deserves a preset as well :D
Just tested the OpenAI ChatGPT API: it's broken as well, with the same error: "top_k invalid request".
And I'm also getting a new warning now, even when the extension is disabled.
"quick URL" button is good but it should change "LLM-URL", "API key", and "model name" with one click to save more time.
Hi. Would adding support for cloud LLMs via API (ChatGPT, Claude, Gemini, Grok) be easy?
PS: thanks for the cool extension!