MeDott29 / textgrad

TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.
http://textgrad.com/
MIT License

Sweep: Implementing Ollama configuration layer #1

Open · MeDott29 opened this issue 2 months ago

MeDott29 commented 2 months ago

Given the following example, implement a similarly functioning config layer for using Ollama:

from autogen import AssistantAgent, UserProxyAgent

# Point autogen at a local Ollama server via its OpenAI-compatible endpoint.
config_list = [
    {
        "model": "codellama",
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",  # placeholder; Ollama does not validate the key
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})

user_proxy = UserProxyAgent("user_proxy", code_execution_config={"work_dir": "coding", "use_docker": False})
user_proxy.initiate_chat(assistant, message="Plot a chart of NVDA and TESLA stock price change YTD.")
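
A parallel configuration for textgrad could mirror this shape. The sketch below relies only on Ollama's OpenAI-compatible endpoint and the standard openai Python client; the ollama_config_list name and build_ollama_client helper are illustrative assumptions, not existing textgrad API, and wiring the resulting client into textgrad's engine layer is the work this issue asks for.

from openai import OpenAI

# Sketch only: mirrors the autogen config_list above using Ollama's
# OpenAI-compatible endpoint. The names ollama_config_list and
# build_ollama_client are illustrative, not part of textgrad.
ollama_config_list = [
    {
        "model": "codellama",                     # any model already pulled into Ollama
        "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        "api_key": "ollama",                      # placeholder; Ollama ignores the key
    }
]

def build_ollama_client(config: dict) -> OpenAI:
    """Create an OpenAI-compatible client backed by a local Ollama server."""
    return OpenAI(base_url=config["base_url"], api_key=config["api_key"])

client = build_ollama_client(ollama_config_list[0])

# Quick smoke test against the local model.
response = client.chat.completions.create(
    model=ollama_config_list[0]["model"],
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)

From there, a textgrad engine that accepts such an OpenAI-compatible client (or a base_url/api_key pair) would let Ollama models serve as forward and backward engines, analogous to how autogen consumes config_list.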

References

Ollama Blog on Library Implementation

Continue.dev's method

Autogen's method

sweep-ai[bot] commented 2 months ago
Sweeping

[!TIP] To recreate the pull request, edit the issue title or description.