sweepai / sweep

Sweep: open-source AI-powered Software Developer for small features and bug fixes.
https://sweep.dev

Sweep: Remove the LiteLLM section in https://docs.sweep.dev/usage/extra-self-host#4-test-other-llms #3264

Closed · kevinlu1248 closed this issue 7 months ago

kevinlu1248 commented 7 months ago

Branch

No response

Checklist

- [X] Modify `docs/pages/usage/extra-self-host.mdx` ✓ https://github.com/sweepai/sweep/commit/fd0640ea99307189481685bfc242f603887bc808 [Edit](https://github.com/sweepai/sweep/edit/sweep/remove_the_litellm_section_in_httpsdocss/docs/pages/usage/extra-self-host.mdx)
- [X] Running GitHub Actions for `docs/pages/usage/extra-self-host.mdx` ✓ [Edit](https://github.com/sweepai/sweep/edit/sweep/remove_the_litellm_section_in_httpsdocss/docs/pages/usage/extra-self-host.mdx)
sweep-nightly[bot] commented 7 months ago

🚀 Here's the PR! #3265

See Sweep's progress at the progress dashboard!
💎 Sweep Pro: I'm using GPT-4. You have unlimited GPT-4 tickets. (tracking ID: None)





Step 1: 🔎 Searching

I found the following snippets in your repository. I will now analyze these snippets and come up with a plan.

Some code snippets I think are relevant, in decreasing order of relevance. If a file is missing from here, you can mention its path in the ticket description. https://github.com/sweepai/sweep/blob/47caf9787d47e95299be7e85704499eb167bd3c1/docs/pages/usage/extra-self-host.mdx#L35-L65

Step 2: ⌨️ Coding

````diff
--- 
+++ 
@@ -33,34 +33,6 @@
 ```
 OPENAI_DO_HAVE_32K_MODEL_ACCESS=true
 ```
-### 4. Test other LLMs
-
-#### Huggingface, Palm, Ollama, TogetherAI, AI21, Cohere etc. [Full List](https://docs.litellm.ai/docs/providers)
-
-##### Create OpenAI-proxy
-We'll use [LiteLLM](https://docs.litellm.ai/docs/) to create an OpenAI-compatible endpoint that translates OpenAI calls to any of the [supported providers](https://docs.litellm.ai/docs/providers).
-
-Example to use a local CodeLlama model from Ollama.ai with Sweep:
-
-Let's spin up a proxy server to route any OpenAI call from Sweep to Ollama/CodeLlama:
-```shell
-pip install litellm
-```
-```shell
-$ litellm --model ollama/codellama
-
-#INFO: Ollama running on http://0.0.0.0:8000
-```
-
-[Docs](https://docs.litellm.ai/docs/proxy_server)
-
-### Update Sweep
-
-Update your .env
-
-```shell
-os.environ["OPENAI_API_BASE"] = "http://0.0.0.0:8000"
-os.environ["OPENAI_API_KEY"] = "my-fake-key"
-```
-
-Note: All of Sweep's testing has been done on GPT-4. In our tests, even GPT-3.5 and Claude v2 break on most of our prompts.
````
Ran GitHub Actions for fd0640ea99307189481685bfc242f603887bc808:


Step 3: 🔁 Code Review

I have finished reviewing the code for completeness. I did not find errors in the `sweep/remove_the_litellm_section_in_httpsdocss` branch.


🎉 Latest improvements to Sweep:
  • New dashboard launched for real-time tracking of Sweep issues, covering all stages from search to coding.
  • Integration of OpenAI's latest Assistant API for more efficient and reliable code planning and editing, improving speed by 3x.
  • Use the GitHub issues extension for creating Sweep issues directly from your editor.

💡 To recreate the pull request edit the issue title or description. Something wrong? Let us know.

This is an automated message generated by Sweep AI.