TheBlewish / Automated-AI-Web-Researcher-Ollama

A Python program that turns an LLM running on Ollama into an automated researcher: given a single query, it determines focus areas to investigate, performs web searches, scrapes content from relevant websites, conducts the research on its own, and saves the findings for you.
MIT License

Summary gets truncated #17

Closed · gghfez closed this issue 23 hours ago

gghfez commented 1 day ago

The tool ran for about 2 hours, checking 27 sources. It came to an accurate conclusion but then abruptly stopped here:

here are some potential benefits:

See screenshot: [screenshot of the truncated summary output]

This is my Ollama config, using a converted model with a longer context:

LLM_CONFIG_OLLAMA = {
    "llm_type": "ollama",
    "base_url": "http://localhost:11434",  # default Ollama server URL
    "model_name": "research-phi3:latest",  # Replace with your Ollama model name
    "temperature": 0.7,
    "top_p": 0.9,
    "n_ctx": 55000,
    "context_length": 55000,
    "stop": ["User:", "\n\n"]
}
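For reference, here is a minimal sketch of how a config like this maps onto a request to Ollama's standard `/api/generate` endpoint (where `n_ctx` corresponds to the `num_ctx` option). The `build_generate_payload` helper is hypothetical, not part of the tool itself:

```python
import json

# Config from the issue above; adjust model_name for your own setup.
LLM_CONFIG_OLLAMA = {
    "llm_type": "ollama",
    "base_url": "http://localhost:11434",
    "model_name": "research-phi3:latest",
    "temperature": 0.7,
    "top_p": 0.9,
    "n_ctx": 55000,
    "context_length": 55000,
    "stop": ["User:", "\n\n"],
}

def build_generate_payload(config, prompt):
    """Map the config keys onto the options accepted by Ollama's
    /api/generate endpoint (num_ctx, temperature, top_p, stop)."""
    return {
        "model": config["model_name"],
        "prompt": prompt,
        "stream": False,
        "options": {
            "num_ctx": config["n_ctx"],
            "temperature": config["temperature"],
            "top_p": config["top_p"],
            "stop": config["stop"],
        },
    }

payload = build_generate_payload(LLM_CONFIG_OLLAMA, "Summarize the findings.")
print(json.dumps(payload["options"], indent=2))
```

Note that any string in `stop` ends generation as soon as the model emits it, so it is worth double-checking what your stop list contains when output ends abruptly.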

Great tool by the way.

TheBlewish commented 1 day ago

Thanks very much! Not sure why it truncated the summary, assuming you converted the model per the GitHub instructions with the modelfile.
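For anyone following along, a modelfile conversion like the README describes looks roughly like this (the context size is taken from the config above; the base model name here is an assumption, so substitute your own):

```
# Modelfile — hypothetical sketch; base model is an assumption
FROM phi3
PARAMETER num_ctx 55000
```

Then build the custom model with `ollama create research-phi3 -f Modelfile` and reference `research-phi3:latest` in the config.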

Fortunately, that's one of the reasons I put in conversation mode: after the summary, you can ask it what the potential benefits are (maybe explain the context of what you're asking a bit more). It still has access to all the research in that mode, so it should be able to give you what it should have originally. That's just my advice, though; I'm not quite sure why it got truncated.