raju11011967 opened this issue 1 month ago
Are you using Ollama (https://ollama.com/)?
Yes, and it's not working; a frustrating effort.
The API key error, I've found, resolves when I enter the API keys in Devika's config.toml file and make sure to remove the angle brackets (< and >) at either end of each key, leaving only the quotation marks. That sorted the incorrect-API-key error for me. Please also make sure you've copied and pasted the entire, accurate API key into the correctly labeled field.

I'm not sure about your 401 error. I think the Python version installed in the venv is crucial for the front end to function: whenever I create the venv with any version other than 3.10.1 through 3.10.9, the front end errors out in some way. Vexing. I've used both uv and conda to create these virtual environments, and I personally have yet to get a local LLM to run through Devika effectively; mine stays active on the back end but produces no response on the front end.

When in doubt, it sometimes helps to delete the venv and create a new one with a different Python version, or even with the same version, just to test whether everything installed properly across environments.
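The angle-bracket mistake described above can be caught mechanically before Devika ever reads the key. Below is a minimal, illustrative sketch; the function name and the example key values are hypothetical and not part of Devika itself:

```python
def has_placeholder_brackets(key: str) -> bool:
    """Return True if an API key string still carries the placeholder
    angle brackets (e.g. "<sk-abc123>") that should have been deleted
    before the key was saved in config.toml."""
    # Strip surrounding whitespace and the TOML-style double quotes,
    # which are supposed to remain around the key.
    cleaned = key.strip().strip('"')
    return cleaned.startswith("<") or cleaned.endswith(">")


# A key pasted straight from a "<YOUR-KEY-HERE>" template is flagged:
print(has_placeholder_brackets('"<sk-abc123>"'))  # True
# A correctly entered key (quotes only, no brackets) passes:
print(has_placeholder_brackets('"sk-abc123"'))    # False
```

The same check could be run against each key field in config.toml after editing, as a quick sanity test before restarting the back end.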
OK, I'll check it out. I'd also like to know about any recent updates to the Devika repository.
So many help videos show Devika running very easily, but in reality it is not so: I keep getting a 401 error and an API key error. I have been searching for a solution but haven't found one. Any guidance regarding the error would be appreciated. Thanks in advance.