jgravelle / AutoGrok


Ollama Provider not correctly actioned #1

Open · Photize opened 1 week ago

Photize commented 1 week ago

I ran AutoGrok, selected Ollama, and got this error: "ModuleNotFoundError: No module named 'providers.ollamaprovider'", along with a list of traceback references. I renamed the provider file to ollamaprovider.py and it was then found, but it produced a new error on its own: "Error retrieving available models: 'OllamaProvider' object has no attribute 'get_available_models'". I then tried a simple AI command and got this reply as a heading: "AttributeError: st.session_state has no attribute "selected_model". Did you forget to initialize it? More info: https://docs.streamlit.io/library/advanced-features/session-state#initialization"
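For reference on the second error: the missing method presumably needs to ask Ollama which models are installed locally. Below is a minimal sketch, assuming Ollama's standard GET /api/tags listing endpoint; the method name get_available_models comes from the error message above, but nothing here is AutoGrok's actual code.

import requests

class OllamaProvider:  # sketch only; the real class subclasses BaseLLMProvider
    def get_available_models(self):
        # Assumption: Ollama's GET /api/tags endpoint lists locally pulled
        # models as {"models": [{"name": "phi3:latest", ...}, ...]}.
        response = requests.get("http://127.0.0.1:11434/api/tags")
        response.raise_for_status()
        return [model["name"] for model in response.json().get("models", [])]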

I fiddled around for a while putting the model name in directly, but I've never used Streamlit, so I'll leave it there. Good luck, guys.

FOLLOW-UP: Well, Copilot assisted and I managed a fudge which may help later. Here are the first few lines:

START

import json
import requests
import streamlit as st

# Force defaults into session state so the AttributeError above goes away.
if 'selected_model' not in st.session_state:
    st.session_state.selected_model = "Phi3"
st.write(f"Selected model: {st.session_state.selected_model}")

if 'temperature' not in st.session_state:
    st.session_state.temperature = 0.1  # Set an appropriate initial value
st.write(f"Temperature: {st.session_state.temperature}")

from providers.base_provider import BaseLLMProvider

class OllamaProvider(BaseLLMProvider):
    def __init__(self, api_url, api_key=None):
        # Hard-code the local Ollama generate endpoint.
        self.api_url = "http://127.0.0.1:11434/api/generate"

END

As you can see, it forces the model and temperature, and then we get some action :-)
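The snippet cuts off before the request itself, so for completeness, here is a minimal sketch of how the rest of the class might call the /api/generate endpoint it points at. The payload shape follows Ollama's documented non-streaming API; the method name generate_response is an assumption, not confirmed AutoGrok code. It would continue the OllamaProvider class above:

    def generate_response(self, prompt):
        # Assumption: Ollama's standard /api/generate payload.
        payload = {
            "model": st.session_state.selected_model,
            "prompt": prompt,
            "stream": False,  # return one JSON object rather than chunks
            "options": {"temperature": st.session_state.temperature},
        }
        response = requests.post(self.api_url, json=payload)
        response.raise_for_status()
        return response.json().get("response", "")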

Photize commented 1 week ago

It doesn't seem to persist and randomly resets; I presume the add-ons are in the wrong file :)
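A likely explanation for the resets: Python executes a module's top-level code only once per process, while st.session_state is per browser session, so guards placed at the top of providers/ollamaprovider.py initialize the first session only and later sessions start with the keys missing. A minimal sketch of the usual fix is to put the guards in a function called at the top of the main script, so they run on every rerun; the file name and defaults below are assumptions.

# main.py -- assumed name for AutoGrok's Streamlit entry-point script
import streamlit as st

def init_session_state():
    # Executed on every Streamlit rerun, so each new browser session
    # gets its defaults before any widget or provider touches them.
    defaults = {"selected_model": "Phi3", "temperature": 0.1}
    for key, value in defaults.items():
        if key not in st.session_state:
            st.session_state[key] = value

init_session_state()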