xusenlinzy / api-for-open-llm

OpenAI-style API for open large language models — use open LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. A unified backend interface for open-source large language models.
Apache License 2.0

Cannot run instruction.py #280

Open NCCurry30 opened 5 months ago

NCCurry30 commented 5 months ago

The following items must be checked before submission

Type of problem

Startup command

Operating system

Linux

Detailed description of the problem

I want to run codellama/instruction.py, but I ran into a problem. How should I solve it?

""" https://github.com/facebookresearch/codellama/blob/main/example_instructions.py """

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

llm = ChatOpenAI(
    model_name="code-llama",
    openai_api_base="http://0.0.0.0:10810/v1",
    openai_api_key="xxx",
)

def test():
    instructions = [
        [
            HumanMessage(content="In Bash, how do I list all text files in the current directory (excluding subdirectories) that have been modified in the last month?")
        ],
        [
            HumanMessage(content="What is the difference between inorder and preorder traversal? Give an example in Python.")
        ],
        [
            SystemMessage(content="Provide answers in JavaScript"),
            HumanMessage(content="Write a function that computes the set of sums of all contiguous sublists of a given list.")
        ],
    ]

    for instruction in instructions:
        result = llm(instruction)
        for msg in instruction:
            print(f"{msg.type.capitalize()}: {msg.content}\n")

        print(
            f"> AI: {result.content}"
        )
        print("\n==================================\n")

if __name__ == "__main__":
    test()

[screenshots attached] But I got an error: the server returned a 502.
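For reference, a minimal direct request against the endpoint (a sketch only, assuming the server exposes the standard OpenAI-compatible /v1/chat/completions route at the same address, model name, and key used in the snippet above) can help tell whether the 502 comes from the API server itself rather than from the LangChain client:

import requests

# Sketch: call the chat completions route directly, bypassing LangChain.
# Assumption: the server serves the standard OpenAI path /v1/chat/completions.
resp = requests.post(
    "http://0.0.0.0:10810/v1/chat/completions",
    headers={"Authorization": "Bearer xxx"},
    json={
        "model": "code-llama",
        "messages": [{"role": "user", "content": "Say hello."}],
    },
    timeout=60,
)
print(resp.status_code)  # a 502 here points at the server/proxy, not the client code
print(resp.text)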

Dependencies

# Please paste the dependencies here

Runtime logs or screenshots

# Please paste the run log here