LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
Apache License 2.0 · 5.67k stars · 362 forks
Quick answer mode using search provider summaries #78
Is your feature request related to a problem? Please describe.
It currently takes an unreasonably long time to answer a simple question (one that returns something like a single value or a binary answer).
Describe the solution you'd like
Use the short website snippets surfaced by the search engines to produce a preliminary result.
This result can then be refined using the entire website's context.
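A minimal sketch of this two-stage flow, for illustration only: the class, function, and field names below are hypothetical and not LLocalSearch's actual API. Stage 1 builds a prompt from the snippets the search provider already returned (no page fetches), and stage 2 refines the preliminary answer once full page content is available.

```python
from dataclasses import dataclass

@dataclass
class SearchResult:
    url: str
    snippet: str  # short summary already returned by the search engine

def build_quick_prompt(question: str, results: list[SearchResult],
                       max_snippets: int = 5) -> str:
    """Stage 1: prompt the LLM with snippets only, for a fast preliminary answer."""
    context = "\n".join(f"- {r.snippet} ({r.url})" for r in results[:max_snippets])
    return (
        "Answer briefly using only these search snippets.\n"
        f"Snippets:\n{context}\n"
        f"Question: {question}\n"
    )

def build_refine_prompt(question: str, preliminary: str, page_text: str) -> str:
    """Stage 2: refine the preliminary answer using the full page content."""
    return (
        f"Preliminary answer: {preliminary}\n"
        f"Full page content:\n{page_text}\n"
        f"Refine the answer to: {question}\n"
    )

results = [SearchResult("https://example.com", "Mount Everest is 8,849 m tall.")]
quick = build_quick_prompt("How tall is Mount Everest?", results)
refine = build_refine_prompt("How tall is Mount Everest?", "8,849 m",
                             "Full article text fetched from the page...")
```

The quick prompt could be streamed to the user immediately, with the refined answer replacing it once the full pages have been fetched and processed.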