LLocalSearch is a completely locally running search aggregator using LLM agents. The user asks a question and the system uses a chain of LLMs to find the answer; the user can follow the agents' progress and see the final answer. No OpenAI or Google API keys are needed.
Is your feature request related to a problem? Please describe.
Mac users (and probably users on other ARM machines) get errors when running the containers, because the published images are amd64-only.
Describe the solution you'd like
Release the containers for ARM (arm64) as well, in addition to amd64.
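A possible sketch of how this could be done with GitHub Actions and Docker Buildx. The action names and versions are real, but the workflow name, image name, and registry secrets below are placeholders, not taken from this repo:

```yaml
# Hypothetical release workflow sketch — image name and secrets are assumptions.
name: release-multiarch
on:
  push:
    tags: ["v*"]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-qemu-action@v3     # emulate arm64 on the amd64 runner
      - uses: docker/setup-buildx-action@v3   # enable multi-platform builds
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          platforms: linux/amd64,linux/arm64  # publish both architectures in one manifest
          push: true
          tags: example/llocalsearch:latest   # placeholder image name
```

With a multi-arch manifest published this way, `docker pull` on an Apple Silicon Mac would automatically select the arm64 variant, with no changes needed on the user's side.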