Closed jainamsethia closed 3 months ago
Same issues... is it because of Windows?
24.03.31 11:39:46: root: INFO : /api/model-list GET
24.03.31 11:39:46: root: DEBUG : /api/model-list GET - Response: {"models":[["Claude 3 Opus","claude-3-opus-20240229"],["Claude 3 Sonnet","claude-3-sonnet-20240229"],["Claude 3 Haiku","claude-3-haiku-20240307"],["GPT-4 Turbo","gpt-4-0125-preview"],["GPT-3.5","gpt-3.5-turbo-0125"],["Gemini 1.0 Pro","gemini-1.0-pro"],["Gemini 1.5 Pro","gemini-1.5-pro"],["GROQ Mixtral","mixtral-8x7b-32768"],["GROQ LLAMA2-70B","llama2-70b-4096"],["GROQ GEMMA-7B-IT","gemma-7b-it"],["codellama","codellama:7b-code"],["mistral","mistral:latest"],["openhermes","openhermes:latest"]]}
24.03.31 11:39:46: root: INFO : /api/get-agent-state POST
24.03.31 11:39:46: root: DEBUG : /api/get-agent-state POST - Response: {"state":{"agent_is_active":true,"browser_session":{"screenshot":null,"url":null},"completed":false,"internal_monologue":null,"message":null,"step":null,"terminal_session":{"command":null,"output":null,"title":null},"timestamp":"2024-03-31 11:36:21","token_usage":326}}
24.03.31 11:39:46: root: INFO : /api/get-messages POST
24.03.31 11:39:46: root: DEBUG : /api/get-messages POST - Response: {"messages":[{"from_devika":false,"message":"create a snake game in python","timestamp":"2024-03-31 11:36:21"}]}
I notice that it works with Groq. Ollama is too slow on my computer, I think. I don't have a GPU.
No space..
Okay, I agree with Machiuka... it's very slow with Ollama and DuckDuckGo. I used Mistral, which could also be the reason for the slowness. It's better to use a quantized model and download it to Ollama.
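For anyone trying this, pulling a quantized build is just a matter of choosing a quantized tag when you pull the model. A minimal sketch (the exact tag names are illustrative and may change; check the Ollama model library for the tags that are actually published):

```shell
# Pull a 4-bit quantized Mistral build instead of the default :latest tag.
# Smaller quantizations (q4_0, q4_K_M, ...) trade some quality for much
# lower memory use and faster CPU inference.
ollama pull mistral:7b-instruct-q4_0

# Confirm the model is available locally.
ollama list

# Quick smoke test before pointing devika at it.
ollama run mistral:7b-instruct-q4_0 "Say hello in one word."
```

Then select the quantized tag in devika's model dropdown (or wherever the Ollama model name is configured) instead of `mistral:latest`.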
Good idea!
Fetch the latest changes; this is already fixed.
24.03.30 20:24:39: root: DEBUG : /api/project-list GET - Response: {"projects":["test project"]}
24.03.30 20:24:39: root: INFO : /api/model-list GET
24.03.30 20:24:39: root: DEBUG : /api/model-list GET - Response: {"models":[["Claude 3 Opus","claude-3-opus-20240229"],["Claude 3 Sonnet","claude-3-sonnet-20240229"],["Claude 3 Haiku","claude-3-haiku-20240307"],["GPT-4 Turbo","gpt-4-0125-preview"],["GPT-3.5","gpt-3.5-turbo-0125"],["Gemini 1.0 Pro","gemini-1.0-pro"],["Gemini 1.5 Pro","gemini-1.5-pro"],["GROQ Mixtral","mixtral-8x7b-32768"],["GROQ LLAMA2-70B","llama2-70b-4096"],["GROQ GEMMA-7B-IT","gemma-7b-it"]]}
24.03.30 20:24:39: root: INFO : /api/get-agent-state POST
24.03.30 20:24:39: root: DEBUG : /api/get-agent-state POST - Response: {"state":{"agent_is_active":true,"browser_session":{"screenshot":null,"url":null},"completed":false,"internal_monologue":null,"message":null,"step":null,"terminal_session":{"command":null,"output":null,"title":null},"timestamp":"2024-03-30 20:06:56","token_usage":327}}
24.03.30 20:24:39: root: INFO : /api/get-messages POST
24.03.30 20:24:39: root: DEBUG : /api/get-messages POST - Response: {"messages":[{"from_devika":false,"message":"write snake game in python using pygame","timestamp":"2024-03-30 20:06:55"},{"from_devika":false,"message":"continue","timestamp":"2024-03-30 20:17:33"}]}
Can someone please help!!
Also, just a suggestion: could this be offered as a hosted website? Installing and running it locally is very difficult.