Closed: kevinlu1248 closed this 7 months ago
> [!TIP]
> I can email you next time I complete a pull request if you set up your email here!
I found the following snippets in your repository. I will now analyze these snippets and come up with a plan.
docs/pages/usage/extra-self-host.mdx
✓ https://github.com/sweepai/sweep/commit/fd0640ea99307189481685bfc242f603887bc808
Modify docs/pages/usage/extra-self-host.mdx with contents:
• Remove the section starting from the header "### 4. Test other LLMs" down to the code block that sets the OPENAI_API_BASE and OPENAI_API_KEY environment variables. This includes the header, the LiteLLM installation and setup instructions, and the environment variable setup instructions.
• Specifically, delete lines 35 through 65 from the extra-self-host.mdx file.
• Ensure that the removal of this section does not affect the formatting or flow of the surrounding content. If necessary, adjust the surrounding text to maintain a coherent structure in the document.
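As a hypothetical aside, the "delete lines 35 through 65" step in the plan is a plain line-range delete; outside Sweep it could be sketched with `sed` (the actual edit was made by the bot's commit above). Demonstrated on a synthetic stand-in file, since the real `extra-self-host.mdx` lives in the sweepai/sweep repository:

```shell
# Stand-in for the doc: 80 numbered lines.
seq 80 > demo.mdx
# Delete lines 35-65 inclusive (31 lines), in place, mirroring the plan.
sed -i '35,65d' demo.mdx
# 80 - 31 = 49 lines remain.
wc -l < demo.mdx
```

Note that `-i` here is GNU sed's in-place flag; BSD/macOS sed would need `sed -i ''`.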
````diff
--- 
+++ 
@@ -33,34 +33,6 @@
 ```
 OPENAI_DO_HAVE_32K_MODEL_ACCESS=true
 ```
-### 4. Test other LLMs
-#### Huggingface, Palm, Ollama, TogetherAI, AI21, Cohere etc. [Full List](https://docs.litellm.ai/docs/providers)
-##### Create OpenAI-proxy
-We'll use [LiteLLM](https://docs.litellm.ai/docs/) to create an OpenAI-compatible endpoint, that translates OpenAI calls to any of the [supported providers](https://docs.litellm.ai/docs/providers).
-Example to use a local CodeLLama model from Ollama.ai with Sweep:
-
-Let's spin up a proxy server to route any OpenAI call from Sweep to Ollama/CodeLlama
-```python
-pip install litellm
-```
-```python
-$ litellm --model ollama/codellama
-
-#INFO: Ollama running on http://0.0.0.0:8000
-```
-
-[Docs](https://docs.litellm.ai/docs/proxy_server)
-
-### Update Sweep
-
-Update your .env
-
-```shell
-os.environ["OPENAI_API_BASE"] = "http://0.0.0.0:8000"
-os.environ["OPENAI_API_KEY"] = "my-fake-key"
-```
-
-Note: All of Sweep's testing has been done on GPT-4. We've tested and on most of our prompts, even GPT-3.5 and Claude v2 breaks most of the time.
````
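For context, the final snippet in the removed section pointed OpenAI-style calls at the local LiteLLM proxy by overriding two environment variables; it was fenced as `shell` but written in Python syntax. A minimal Python sketch of that same override (host, port, and values taken from the deleted snippet; the key is a placeholder the proxy ignores):

```python
import os

# Route any OpenAI SDK call in this process to the local LiteLLM proxy
# instead of api.openai.com, as the removed docs section described.
os.environ["OPENAI_API_BASE"] = "http://0.0.0.0:8000"
os.environ["OPENAI_API_KEY"] = "my-fake-key"

# Equivalent .env / shell form (presumably what the `shell` fence intended):
#   export OPENAI_API_BASE=http://0.0.0.0:8000
#   export OPENAI_API_KEY=my-fake-key
print(os.environ["OPENAI_API_BASE"])
```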
docs/pages/usage/extra-self-host.mdx
✓
Check docs/pages/usage/extra-self-host.mdx with contents:
Ran GitHub Actions for fd0640ea99307189481685bfc242f603887bc808:
I have finished reviewing the code for completeness. I did not find errors for `sweep/remove_the_litellm_section_in_httpsdocss`.
💡 To recreate the pull request, edit the issue title or description.
This is an automated message generated by Sweep AI.
Branch
No response
Checklist
- [X] Modify `docs/pages/usage/extra-self-host.mdx` ✓ https://github.com/sweepai/sweep/commit/fd0640ea99307189481685bfc242f603887bc808 [Edit](https://github.com/sweepai/sweep/edit/sweep/remove_the_litellm_section_in_httpsdocss/docs/pages/usage/extra-self-host.mdx)
- [X] Running GitHub Actions for `docs/pages/usage/extra-self-host.mdx` ✓ [Edit](https://github.com/sweepai/sweep/edit/sweep/remove_the_litellm_section_in_httpsdocss/docs/pages/usage/extra-self-host.mdx)