Closed: Shake-Shifter closed this issue 4 months ago
Try "lmstudio" (lower case)...
-jjg
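A guess at why the lower-case value matters: if config.py selects API_URL from LLM_PROVIDER with exact, case-sensitive string comparisons (an assumption -- the actual selection code isn't shown in this thread), then "LMSTUDIO" matches no branch, API_URL is never defined, and the later import fails. A minimal sketch:

```python
# Hypothetical sketch of config.py's provider selection (not AutoGroq's
# actual code). With LLM_PROVIDER = "LMSTUDIO", neither comparison below
# would match, API_URL would never be assigned, and
# `from config import API_URL` would raise the ImportError in the traceback.
GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"
LMSTUDIO_API_URL = "http://localhost:1234/v1/chat/completions"

LLM_PROVIDER = "lmstudio"  # lower case, as suggested above

if LLM_PROVIDER == "groq":
    API_URL = GROQ_API_URL
elif LLM_PROVIDER == "lmstudio":
    API_URL = LMSTUDIO_API_URL

print(API_URL)
```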
On Sun, Jun 2, 2024 at 2:45 AM Shake-Shifter wrote:
Long-time listener, first-time caller. I just installed AutoGroq, set config_local.py for LM Studio, and got an error that API_URL couldn't be imported when I tried to run AutoGroq. This is the error:
ImportError: cannot import name 'API_URL' from 'config' (C:\Users\shake\Ag\Autogroq\AutoGroq\config.py)

Traceback:
File "C:\Users\shake\.conda\envs\Ag\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
File "C:\Users\shake\Ag\AutoGroq\AutoGroq\main.py", line 5, in <module>
    from agent_management import display_agents
File "C:\Users\shake\Ag\Autogroq\AutoGroq\agent_management.py", line 10, in <module>
    from utils.ui_utils import get_llm_provider, regenerate_json_files_and_zip, update_discussion_and_whiteboard
File "C:\Users\shake\Ag\Autogroq\AutoGroq\utils\ui_utils.py", line 13, in <module>
    from config import API_URL, LLM_PROVIDER, MAX_RETRIES, MODEL_TOKEN_LIMITS, RETRY_DELAY
This is my config_local.py, but all I changed was the LLM provider:

# User-specific configurations
LLM_PROVIDER = "LMSTUDIO"
GROQ_API_URL = "https://api.groq.com/openai/v1/chat/completions"
LMSTUDIO_API_URL = "http://localhost:1234/v1/chat/completions"
OLLAMA_API_URL = "http://127.0.0.1:11434/api/generate"
OPENAI_API_KEY = "your_openai_api_key"
OPENAI_API_URL = "https://api.openai.com/v1/chat/completions"
Not sure what I could be doing wrong. LM Studio is up and running as a server, but I don't think that should really matter for this error. Love what you're doing here, by the way; keep it up.
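The failure mode in the traceback above -- importing a name that a config module only defines when a provider string matches exactly -- can be reproduced in miniature. The module and variable names below are hypothetical stand-ins, not AutoGroq's code:

```python
import sys
import types

# Build a stand-in "config" module whose API_URL is only defined when the
# provider string matches exactly, mirroring the suspected bug: "LMSTUDIO"
# fails the lower-case comparison, so API_URL never gets assigned.
cfg = types.ModuleType("demo_config")
exec(
    'LLM_PROVIDER = "LMSTUDIO"\n'
    'if LLM_PROVIDER == "lmstudio":\n'
    '    API_URL = "http://localhost:1234/v1/chat/completions"\n',
    cfg.__dict__,
)
sys.modules["demo_config"] = cfg

try:
    from demo_config import API_URL  # fails: the name was never defined
    error = None
except ImportError as exc:
    error = str(exc)

print(error)
```

Changing the embedded provider string to "lmstudio" makes the import succeed, which matches the one-line fix suggested in the reply.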