LLocalSearch is a completely locally running search aggregator using LLM agents. The user asks a question, and the system uses a chain of LLMs to find the answer. The user can follow the agents' progress and see the final answer. No OpenAI or Google API keys are needed.
Is your feature request related to a problem? Please describe.
It looks like the current networking setup (using different interfaces and bridges) is not universally supported and is confusing for many people without Docker experience.
Describe the solution you'd like
Switch to the default Docker networking approach, exposing services through explicit port mappings instead of custom interfaces and bridges.
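A minimal sketch of what the default approach could look like in the compose file, relying on Compose's default bridge network and plain `ports` mappings. Service names, images, and port numbers here are illustrative assumptions, not the project's actual configuration:

```yaml
# Hypothetical docker-compose excerpt: no custom networks or bridges.
# All services join Compose's default network and are exposed to the
# host via explicit port mappings. Names and ports are placeholders.
services:
  backend:
    image: llocalsearch-backend   # assumed image name
    ports:
      - "8080:8080"   # API reachable on the host at localhost:8080
  frontend:
    image: llocalsearch-frontend  # assumed image name
    ports:
      - "3000:3000"   # web UI reachable on the host at localhost:3000
  searxng:
    image: searxng/searxng
    ports:
      - "8081:8080"   # remap to avoid clashing with the backend port
```

Services still reach each other by service name (e.g. `http://searxng:8080`) on the default network, so only the host-facing ports need to be chosen to avoid conflicts.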
Additional context
Depends on a solution for #21