bmf7777 closed this issue 21 hours ago
@bmf7777 you can add this (imports shown in case they aren't already present in Web_LLM.py):

```python
from colorama import Fore, Style  # likely already imported in Web_LLM.py
import logging

logger = logging.getLogger(__name__)  # likely already defined in Web_LLM.py

def handle_search_mode(search_engine, query):
    """Handles web search operations."""
    print(f"{Fore.CYAN}Initiating web search...{Style.RESET_ALL}")
    try:
        results = search_engine.search_and_improve(query)
        print(f"\n{Fore.GREEN}Search Results:{Style.RESET_ALL}")
        print(results)
    except Exception as e:
        logger.error(f"Search error: {str(e)}")
        print(f"{Fore.RED}Search failed: {str(e)}{Style.RESET_ALL}")
```
Add it before the handle_research_mode function in Web_LLM.py.
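For anyone trying this out of context, here is a minimal, self-contained sketch of how such a handler behaves. The `DummySearchEngine` class is an assumption made purely for illustration; the real search engine in the project is only assumed to expose a `search_and_improve(query)` method, as used in the snippet above. Colour codes and logging are omitted here so the sketch runs without colorama:

```python
# Sketch only: DummySearchEngine is a hypothetical stand-in for the
# project's real search engine, assumed to expose search_and_improve().

class DummySearchEngine:
    def search_and_improve(self, query):
        # The real engine would perform a web search and refine results;
        # here we just echo the query back.
        return f"results for: {query}"

def handle_search_mode(search_engine, query):
    """Handles web search operations (colour codes omitted for brevity)."""
    print("Initiating web search...")
    try:
        results = search_engine.search_and_improve(query)
        print("\nSearch Results:")
        print(results)
    except Exception as e:
        print(f"Search failed: {e}")

if __name__ == "__main__":
    handle_search_mode(DummySearchEngine(), "latest news on AI")
```

The try/except mirrors the snippet above: any failure inside the engine is caught and reported to the user instead of crashing the main loop.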
It's implemented here: https://github.com/hafeezhmha/Automated-AI-Web-Researcher-Ollama.git
The reason that command doesn't work is that it's a leftover from the predecessor program I made. All it did was let LLMs running via llama.cpp or Ollama do internet searches by searching and then selecting pages to web-scrape, which, by the way, is basically what this version does, only better. Here's the link to that if you're interested, but honestly it's not great: https://github.com/TheBlewish/Web-LLM-Assistant-Llamacpp-Ollama
I've now removed the command from this program's main startup screen after realizing it was still in there. Sorry for the confusion!
When issuing `/latest new on AI` ... I get the above error message ... the @ commands work fine.