LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
Apache License 2.0 · 5.67k stars · 362 forks · source link
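The "chain of LLMs" described above can be pictured as a short pipeline of calls to a locally running model server. The Go sketch below is only an illustration under assumptions, not LLocalSearch's actual code: it assumes an Ollama instance on its default port (http://localhost:11434) and a model tagged `llama3`, and it stubs out the web-search step the real system performs (via a local metasearch engine such as SearXNG) between the model calls.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

// generate sends a single non-streaming prompt to a local Ollama instance
// and returns the model's reply. The endpoint and payload follow Ollama's
// /api/generate API; the model name is an assumption for this sketch.
func generate(model, prompt string) (string, error) {
	payload, err := json.Marshal(map[string]any{
		"model":  model,
		"prompt": prompt,
		"stream": false,
	})
	if err != nil {
		return "", err
	}
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(payload))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	var out struct {
		Response string `json:"response"`
	}
	if err := json.Unmarshal(body, &out); err != nil {
		return "", err
	}
	return out.Response, nil
}

func main() {
	question := "How does photosynthesis work?"

	// Step 1: ask the model to turn the user's question into a web search query.
	query, err := generate("llama3", "Rewrite this question as a short web search query: "+question)
	if err != nil {
		panic(err)
	}
	fmt.Println("search query:", query)

	// Step 2: in the real system, search results would be fetched here
	// (e.g. from a local SearXNG instance) and passed to the next call.
	results := "placeholder search results"

	// Step 3: ask the model to answer the question using the gathered context.
	answer, err := generate("llama3", "Using these search results:\n"+results+"\n\nAnswer the question: "+question)
	if err != nil {
		panic(err)
	}
	fmt.Println("answer:", answer)
}
```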
Search cards hide LLM response in mobile browser #101
Describe the bug
On mobile browsers, the search result cards are rendered over the LLM response, so part of the answer is hidden.
To Reproduce
Expected behavior
The LLM response is shown in its entirety and is not covered by the search result cards.
Screenshots