
English | 中文

AgentScope


Start building LLM-empowered multi-agent applications in an easier way.


News

Full News

- **[2024-05-24]** We are pleased to announce that features related to the **AgentScope Workstation** will soon be open-sourced! The online website service is temporarily offline while it is upgraded, and will be back online shortly. Stay tuned...
- **[2024-05-15]** A new **Parser Module** for **formatted response** is added in AgentScope! Refer to our [tutorial](https://modelscope.github.io/agentscope/en/tutorial/203-parser.html) for more details. The [`DictDialogAgent`](https://github.com/modelscope/agentscope/blob/main/src/agentscope/agents/dict_dialog_agent.py) and the [werewolf game](https://github.com/modelscope/agentscope/tree/main/examples/game_werewolf) example are updated accordingly.
- **[2024-05-14]** Dear AgentScope users, we are conducting a survey on the **AgentScope Workstation & Copilot** user experience. Your feedback will help us improve AgentScope's drag-and-drop multi-agent application development and Copilot. The survey takes about 3-5 minutes; please click this [URL](https://survey.aliyun.com/apps/zhiliao/vgpTppn22) to participate. Thank you very much for your support and contribution!
- **[2024-05-14]** AgentScope now supports **gpt-4o** as well as other OpenAI vision models! Try gpt-4o with its [model configuration](./examples/model_configs_template/openai_chat_template.json) and the new example [Conversation with gpt-4o](./examples/conversation_with_gpt-4o)!
- **[2024-04-30]** We release **AgentScope** v0.0.4 now!
- **[2024-04-27]** [AgentScope Workstation](https://agentscope.io/) is now online! You are welcome to try building your multi-agent application simply with our *drag-and-drop platform* and to ask our *copilot* questions about AgentScope!
- **[2024-04-19]** AgentScope supports Llama3 now! We provide [scripts](https://github.com/modelscope/agentscope/blob/main/examples/model_llama3) and an example [model configuration](https://github.com/modelscope/agentscope/blob/main/examples/model_llama3) for a quick set-up. Feel free to try Llama3 in our examples!
- **[2024-04-06]** We release **AgentScope** v0.0.3 now!
- **[2024-04-06]** New examples [Gomoku](https://github.com/modelscope/agentscope/blob/main/examples/game_gomoku), [Conversation with ReAct Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_react_agent), [Conversation with RAG Agent](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_RAG_agents) and [Distributed Parallel Optimization](https://github.com/modelscope/agentscope/blob/main/examples/distributed_parallel_optimization) are available now!
- **[2024-03-19]** We release **AgentScope** v0.0.2 now! In this new version, AgentScope supports the [ollama](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#supported-models) (a local CPU inference engine), [DashScope](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#supported-models) and Google [Gemini](https://modelscope.github.io/agentscope/en/tutorial/203-model.html#supported-models) APIs.
- **[2024-03-19]** New examples ["Autonomous Conversation with Mentions"](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_mentions) and ["Basic Conversation with LangChain library"](https://github.com/modelscope/agentscope/blob/main/examples/conversation_with_langchain) are available now!
- **[2024-03-19]** The [Chinese tutorial](https://modelscope.github.io/agentscope/zh_CN/index.html) of AgentScope is online now!
- **[2024-02-27]** We release **AgentScope v0.0.1** now, which is also available on [PyPI](https://pypi.org/project/agentscope/)!
- **[2024-02-14]** We release our paper "AgentScope: A Flexible yet Robust Multi-Agent Platform" on [arXiv](https://arxiv.org/abs/2402.14034)!

What's AgentScope?

AgentScope is an innovative multi-agent platform designed to empower developers to build multi-agent applications with large-scale models. It features three high-level capabilities: exceptional usability for developers, high robustness through built-in fault tolerance, and actor-based distribution for running agents in parallel across processes and machines.

Supported Model Libraries

AgentScope provides a list of ModelWrapper classes to support both local model services and third-party model APIs.

| API | Task | Model Wrapper | Configuration | Some Supported Models |
| --- | --- | --- | --- | --- |
| OpenAI API | Chat | `OpenAIChatWrapper` | guidance / template | gpt-4o, gpt-4, gpt-3.5-turbo, ... |
|  | Embedding | `OpenAIEmbeddingWrapper` | guidance / template | text-embedding-ada-002, ... |
|  | DALL·E | `OpenAIDALLEWrapper` | guidance / template | dall-e-2, dall-e-3 |
| DashScope API | Chat | `DashScopeChatWrapper` | guidance / template | qwen-plus, qwen-max, ... |
|  | Image Synthesis | `DashScopeImageSynthesisWrapper` | guidance / template | wanx-v1 |
|  | Text Embedding | `DashScopeTextEmbeddingWrapper` | guidance / template | text-embedding-v1, text-embedding-v2, ... |
|  | Multimodal | `DashScopeMultiModalWrapper` | guidance / template | qwen-vl-max, qwen-vl-chat-v1, qwen-audio-chat |
| Gemini API | Chat | `GeminiChatWrapper` | guidance / template | gemini-pro, ... |
|  | Embedding | `GeminiEmbeddingWrapper` | guidance / template | models/embedding-001, ... |
| ZhipuAI API | Chat | `ZhipuAIChatWrapper` | guidance / template | glm-4, ... |
|  | Embedding | `ZhipuAIEmbeddingWrapper` | guidance / template | embedding-2, ... |
| ollama | Chat | `OllamaChatWrapper` | guidance / template | llama3, llama2, Mistral, ... |
|  | Embedding | `OllamaEmbeddingWrapper` | guidance / template | llama2, Mistral, ... |
|  | Generation | `OllamaGenerationWrapper` | guidance / template | llama2, Mistral, ... |
| LiteLLM API | Chat | `LiteLLMChatWrapper` | guidance / template | models supported by litellm, ... |
| Yi API | Chat | `YiChatWrapper` | guidance / template | yi-large, yi-medium, ... |
| Post Request based API | - | `PostAPIModelWrapper` | guidance / template | - |
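
Each model wrapper is selected through the `model_type` field of a model configuration, so a single application can mix several backends. Below is a minimal sketch; `openai_chat` appears later in this README, while `dashscope_chat` is given here for illustration and should be checked against the DashScope guidance linked above.

# A single list of configurations can cover several backends; pass it to
# agentscope.init(model_configs=...) or save it as model_configs.json.
model_configs = [
    {
        "config_name": "gpt-4_config",
        "model_type": "openai_chat",       # served by OpenAIChatWrapper
        "model_name": "gpt-4",
    },
    {
        "config_name": "qwen_config",
        "model_type": "dashscope_chat",    # assumed type string, see the DashScope guidance
        "model_name": "qwen-max",
    },
]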

Supported Local Model Deployment

AgentScope enables developers to rapidly deploy local model services using the following libraries.
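
For example, after pulling a model with ollama (e.g. `ollama pull llama3`) and starting the local ollama server, a configuration along the following lines lets agents call the locally served model. This is a sketch: the `ollama_chat` type string corresponds to the OllamaChatWrapper row above, and host or generation options are omitted because they depend on your local setup.

# Sketch of a model configuration for a locally served ollama model
local_llama_config = {
    "config_name": "local_llama3",
    "model_type": "ollama_chat",   # corresponds to OllamaChatWrapper
    "model_name": "llama3",
}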

Supported Services

Example Applications

More models, services and examples are coming soon!

Installation

AgentScope requires Python 3.9 or higher.

Note: This project is under active development; we recommend installing AgentScope from source.

From source

# Pull the source code from GitHub
git clone https://github.com/modelscope/agentscope.git

# Install the package in editable mode
cd agentscope
pip install -e .

Using pip

pip install agentscope

Extra Dependencies

To support different deployment scenarios, AgentScope provides several groups of optional dependencies. The full list of optional dependencies can be found in the tutorial. Taking the distribution mode as an example, you can install its dependencies as follows:

On Windows

# From source
pip install -e .[distribute]
# From pypi
pip install agentscope[distribute]

On Mac & Linux

# From source
pip install -e .\[distribute\]
# From pypi
pip install agentscope\[distribute\]

Quick Start

Configuration

In AgentScope, model deployment and invocation are decoupled by the ModelWrapper abstraction.

To use these model wrappers, you need to prepare a model config file as follows.

model_config = {
    # The identifiers of your config and the model wrapper to use
    "config_name": "{your_config_name}",          # The name to identify the config
    "model_type": "{model_type}",                 # The type to identify the model wrapper

    # Detailed parameters used to initialize the model wrapper
    # ...
}

Taking OpenAI Chat API as an example, the model configuration is as follows:

openai_model_config = {
    "config_name": "my_openai_config",             # The name to identify the config
    "model_type": "openai_chat",                   # The type to identify the model wrapper

    # Detailed parameters used to initialize the model wrapper
    "model_name": "gpt-4",                         # The model used in the OpenAI API, e.g. gpt-4, gpt-3.5-turbo, etc.
    "api_key": "xxx",                              # The API key for OpenAI API. If not set, env
                                                   # variable OPENAI_API_KEY will be used.
    "organization": "xxx",                         # The organization for OpenAI API. If not set, env
                                                   # variable OPENAI_ORGANIZATION will be used.
}

More details about how to set up local model services and prepare model configurations are available in our tutorial.

Create Agents

Create built-in user and assistant agents as follows.

from agentscope.agents import DialogAgent, UserAgent
import agentscope

# Load model configs
agentscope.init(model_configs="./model_configs.json")

# Create a dialog agent and a user agent
dialog_agent = DialogAgent(name="assistant",
                           model_config_name="my_openai_config")
user_agent = UserAgent()
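
If you prefer not to keep a separate JSON file, the same configuration can also be passed in memory. Below is a minimal sketch, assuming `agentscope.init` accepts a list of config dicts in place of a file path, as described in the configuration tutorial.

from agentscope.agents import DialogAgent, UserAgent
import agentscope

# Pass the configuration directly instead of pointing to model_configs.json
agentscope.init(model_configs=[{
    "config_name": "my_openai_config",
    "model_type": "openai_chat",
    "model_name": "gpt-4",
    # "api_key" is omitted here; the OPENAI_API_KEY environment variable is used
}])

dialog_agent = DialogAgent(name="assistant",
                           model_config_name="my_openai_config")
user_agent = UserAgent()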

Construct Conversation

In AgentScope, messages are the bridge among agents. A message is a dict containing two required fields, `name` and `content`, and an optional field `url` pointing to local files (image, video or audio) or a website.

from agentscope.message import Msg

x = Msg(name="Alice", content="Hi!")
x = Msg("Bob", "What about this picture I took?", url="/path/to/picture.jpg")

Start a conversation between two agents (e.g. dialog_agent and user_agent) with the following code:

x = None
while True:
    x = dialog_agent(x)
    x = user_agent(x)
    if x.content == "exit":  # the user inputs "exit" to end the conversation
        break
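
The same two-party exchange can also be expressed with AgentScope's pipeline helpers. Below is a sketch, assuming `sequentialpipeline` is importable from `agentscope.pipelines` as in the tutorial.

from agentscope.pipelines import sequentialpipeline  # import path assumed from the tutorial

# Equivalent to the while-loop above: each iteration runs dialog_agent,
# then user_agent, passing the message along.
x = None
while x is None or x.content != "exit":
    x = sequentialpipeline([dialog_agent, user_agent], x)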

AgentScope Studio

AgentScope provides an easy-to-use runtime user interface capable of displaying multimodal output on the front end, including text, images, audio and video.

Refer to our tutorial for more details.


Tutorial

License

AgentScope is released under Apache License 2.0.

Contributing

Contributions are always welcome!

Compared with the official release, we provide a developer version with additional pre-commit hooks that perform checks before committing:

# For Windows
pip install -e .[dev]
# For Mac
pip install -e .\[dev\]

# Install pre-commit hooks
pre-commit install

Please refer to our Contribution Guide for more details.

Publications

If you find our work helpful for your research or application, please cite our papers.

  1. AgentScope: A Flexible yet Robust Multi-Agent Platform

    @article{agentscope,
        author  = {Dawei Gao and
                   Zitao Li and
                   Xuchen Pan and
                   Weirui Kuang and
                   Zhijian Ma and
                   Bingchen Qian and
                   Fei Wei and
                   Wenhao Zhang and
                   Yuexiang Xie and
                   Daoyuan Chen and
                   Liuyi Yao and
                   Hongyi Peng and
                   Ze Yu Zhang and
                   Lin Zhu and
                   Chen Cheng and
                   Hongzhu Shi and
                   Yaliang Li and
                   Bolin Ding and
                   Jingren Zhou},
        title   = {AgentScope: A Flexible yet Robust Multi-Agent Platform},
        journal = {CoRR},
        volume  = {abs/2402.14034},
        year    = {2024},
    }
  2. On the Design and Analysis of LLM-Based Algorithms

    @article{llm_based_algorithms,
        author  = {Yanxi Chen and
                   Yaliang Li and
                   Bolin Ding and
                   Jingren Zhou},
        title   = {On the Design and Analysis of LLM-Based Algorithms},
        journal = {CoRR},
        volume  = {abs/2407.14788},
        year    = {2024},
    }
  3. Very Large-Scale Multi-Agent Simulation in AgentScope

    @article{agentscope_simulation,
        author  = {Xuchen Pan and
                   Dawei Gao and
                   Yuexiang Xie and
                   Zhewei Wei and
                   Yaliang Li and
                   Bolin Ding and
                   Ji{-}Rong Wen and
                   Jingren Zhou},
        title   = {Very Large-Scale Multi-Agent Simulation in AgentScope},
        journal = {CoRR},
        volume  = {abs/2407.17789},
        year    = {2024},
    }