Syan-Lin / CyberWaifu

LLM + TTS 的真实感聊天机器人 | QQ 机器人 | 支持表情包、QQ 表情、联网搜索
MIT License

能否增加新的AI接口 #50

Open ChenJunzhe opened 11 months ago

ChenJunzhe commented 11 months ago

Claude is no longer free, and ChatGPT often fails to connect.

ShanJianSoda commented 4 months ago

I wrote a Kimi version modeled on the existing code, but it throws an error:

……
[2024-07-03 11:30:29,567][run_on_private_msg_0/ERROR] PyCqBot: 'OpenAI' object has no attribute 'get_num_tokens_from_messages'

This was modeled after GPT.py:

from waifu.llm.Brain import Brain
from waifu.llm.VectorDB import VectorDB
from waifu.llm.SentenceTransformer import STEmbedding
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from typing import Any, List, Mapping, Optional
from langchain.schema import BaseMessage
from openai import OpenAI

class Kimi(Brain):
    def __init__(self,
                 api_key: str,
                 name: str
                 ):
        self.llm = OpenAI(
            api_key=api_key,
            base_url="https://api.moonshot.cn/v1",
        )
        # self.llm_nonstream = ChatOpenAI(openai_api_key=api_key, model_name=model)
        # self.embedding = OpenAIEmbeddings(openai_api_key=api_key)
        self.embedding = STEmbedding()
        self.vectordb = VectorDB(self.embedding, f'./memory/{name}.csv')

    def think(self, messages: List[BaseMessage]):
        # messages -> history
        history = []
        for message in messages:
            history.append(
                {'role': 'assistant' if message.type in ['ai', 'chat'] else 'user', "content": message.content})

        # Moonshot's endpoint is OpenAI-compatible; chat models go through chat.completions
        completion = self.llm.chat.completions.create(
            model="moonshot-v1-8k",
            messages=history,
            temperature=0.3,
        )
        result = completion.choices[0].message.content
        return result

    def think_nonstream(self, messages: List[BaseMessage]):
        history = []
        for message in messages:
            history.append(
                {'role': 'assistant' if message.type in ['ai', 'chat'] else 'user', "content": message.content})

        completion = self.llm.chat.completions.create(
            model="moonshot-v1-8k",
            messages=history,
            temperature=0.3,
        )
        result = completion.choices[0].message.content
        return result

    def store_memory(self, text: str | list):
        '''Store a memory embedding'''
        self.vectordb.store(text)

    def extract_memory(self, text: str, top_n: int = 10):
        '''Retrieve the top_n most relevant memories'''
        return self.vectordb.query(text, top_n)

def main():
    from langchain.schema import HumanMessage  # BaseMessage is abstract and cannot be instantiated
    msg = [HumanMessage(content='Hello')]

    llm = Kimi('sk-***', 'ARONA')  # replace with your own Moonshot API key
    print(llm.think(msg))

if __name__ == "__main__":
    main()

If the author has time, could you write a Kimi interface? (Orz)
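On the AttributeError in the log: it suggests the framework calls `get_num_tokens_from_messages` on `self.llm`, a method that LangChain's `ChatOpenAI` provides but the raw `openai.OpenAI` client does not. Assuming that is the cause, one possible workaround is a thin wrapper that delegates everything to the real client and adds a rough token estimate (the class name and the ~4-characters-per-token heuristic are my own assumptions, not part of the project):

```python
class TokenCountingClient:
    """Wrap an OpenAI-compatible client and add the
    get_num_tokens_from_messages method the framework appears to expect."""

    def __init__(self, client):
        self._client = client

    def __getattr__(self, name):
        # Only called when normal lookup fails, so chat.completions.create
        # and every other client attribute pass straight through.
        return getattr(self._client, name)

    def get_num_tokens_from_messages(self, messages) -> int:
        # Crude estimate: roughly 4 characters per token for mixed text.
        total_chars = sum(len(m.content) for m in messages)
        return max(1, total_chars // 4)
```

In `Kimi.__init__` this would become `self.llm = TokenCountingClient(OpenAI(api_key=api_key, base_url="https://api.moonshot.cn/v1"))`; an exact count would need Moonshot's own tokenizer or token-count endpoint.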