mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more model architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference
https://localai.io
MIT License

[Feature Request] - GPT-Researcher #2112

Closed: HyperCriSiS closed this issue 1 month ago

HyperCriSiS commented 6 months ago

I think this tool would be a great addition to LocalAI :)

It can be configured to use a local LLM and any search engine.

GPT Researcher is an autonomous agent designed for comprehensive online research on a variety of tasks.

The agent can produce detailed, factual and unbiased research reports, with customization options for focusing on relevant resources, outlines, and lessons. Inspired by the recent Plan-and-Solve and RAG papers, GPT Researcher addresses issues of speed, determinism and reliability, offering a more stable performance and increased speed through parallelized agent work, as opposed to synchronous operations.

https://github.com/assafelovic/gpt-researcher

localai-bot commented 6 months ago

Hello! Thank you for suggesting "GPT Researcher" for inclusion in LocalAI. I (as an auditing bot, an experiment of @mudler) will log this suggestion for possible review by the maintainers of the LocalAI repository. We will take into consideration the potential contributions and benefits of incorporating this tool into the project.

Before going further, I would like to mention that the maintainers might need more information or a code example to properly assess the feasibility of integrating GPT Researcher into LocalAI. If you have any additional details or insights to share, please don't hesitate to provide them.

As you may already know, LocalAI is an experimental project that aims to explore the potential of large language models for various tasks. The inclusion of a new tool depends on its alignment with the project's objectives and the potential it holds for contributing to the research.

Thank you for your suggestion, and I'll pass it on to the appropriate team for further evaluation.

(I, the bot, will log this issue 2112 with your suggestion and reference to the GPT Researcher repository in the LocalAI project.)

jtwolfe commented 6 months ago

I expect that this is not within the scope of the LocalAI project. GPT-Researcher, it appears, can already be configured to use self-hosted models, so this would just be a matter of configuration. These apps usually have an `OPENAI_BASE_URL` variable that can be set to `http://localhost:8080/v1` (or the like); set the API key to a dummy value and that should work. If you run into 'model' issues like I did with chat-dev, the workaround is to find the name of the model the tool requests (e.g. `gpt-3.5-turbo-1106`) and use that as the name of the model you want to serve in the .yaml file, which should satisfy the tool. Be aware that the response time for API queries is also often limited to 60s for these sorts of tools; this should be configurable in the tool's source.
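For illustration, a minimal sketch of that model-aliasing workaround, assuming a LocalAI model definition file placed in the models directory. The filename, GGUF model, and backend below are placeholders, not values verified against GPT-Researcher:

```yaml
# models/gpt-3.5-turbo-1106.yaml (hypothetical example)
# Expose a local GGUF model under the name the external tool expects,
# so requests for "gpt-3.5-turbo-1106" are served by this local model.
name: gpt-3.5-turbo-1106
backend: llama-cpp        # backend name may differ depending on LocalAI version
context_size: 4096
parameters:
  model: mistral-7b-instruct.Q4_K_M.gguf   # placeholder: any local GGUF file
  temperature: 0.2
```

With an alias like that in place, pointing the tool at LocalAI with something like `OPENAI_BASE_URL=http://localhost:8080/v1` and `OPENAI_API_KEY=dummy` (as described above) should be enough; the exact variable names depend on the tool and its version.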