Closed: seshubonam closed this issue 1 year ago
Sure!
Take the !gpt command as an example. The logic is: match the !gpt command and extract the user's prompt:
https://github.com/hibobmaster/matrix_chatgpt_bot/blob/ea3a2cc98fa7807407f2438385618e2530ddb31c/bot.py#L219-L221
In short, you get the response from:
```python
output = query({
    "question": "Hey, how are you?",
})
```
Then extract the content you want from the JSON and send it back to the user by simply calling send_room_message.
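A minimal sketch of that flow, assuming the query() helper above and the bot's existing send_room_message helper (the extract_prompt name and the response field picked here are assumptions, not the repo's actual code):

```python
import re

def extract_prompt(message_body):
    # Match the "!gpt <prompt>" command and pull out the user's prompt
    match = re.match(r"^!gpt\s+(.+)$", message_body, re.DOTALL)
    return match.group(1) if match else None

# Hypothetical wiring inside the message handler:
# prompt = extract_prompt(event.body)
# if prompt is not None:
#     output = query({"question": prompt})
#     content = output.get("text", str(output))  # pick whichever field Flowise returns
#     await send_room_message(room_id, content)
```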
Can you tell me which API endpoints you use, so I can test it?
Thanks for the quick response. I'm using Flowise, which is a UI for LangChain. You can build almost anything that's possible with LangChainJS; even BabyAGI and AutoGPT can be built on Flowise, with a LangChain JS backend and a drag-and-drop UI. The output is an API in Python, JS, or cURL, or an embeddable chatbot. https://github.com/FlowiseAI/Flowise
I have tested it locally; can you verify it?
https://github.com/hibobmaster/matrix_chatgpt_bot/commit/600c5e161a1e7c5e681ab217ebbbefc74f3b2cfc
Pull the latest Docker image.
If you are using config.json, add flowise_api_url.
If you are using .env, add FLOWISE_API_URL.
https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/Langchain-(flowise)
Will update soon. Thanks for the awesome work!
@seshubonam The current Flowise API does not support context conversation. I will temporarily set it aside and plan to implement session isolation in the future once Flowise supports context conversation.
sure thing! thanks
Can this implementation be used as a template to build bots from other API endpoints, like ones I could generate from LangChain agents? For instance, I have an API endpoint like this; wondering how I can integrate it as a Matrix bot:
```python
import requests

API_URL = "http://localhost:3000/api/v1/prediction/905e59a8-9958-4d1b-b83c-1240269861c5"

def query(payload):
    response = requests.post(API_URL, json=payload)
    return response.json()

output = query({
    "question": "Hey, how are you?",
})
```
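Since query() returns whatever JSON the endpoint sends, a slightly hardened sketch could raise on HTTP errors and pull out the answer field (the "text" key is an assumption; adjust to the actual Flowise response shape):

```python
import requests

API_URL = "http://localhost:3000/api/v1/prediction/905e59a8-9958-4d1b-b83c-1240269861c5"

def query(payload, timeout=60):
    # POST the payload and raise on HTTP errors instead of returning an error body
    response = requests.post(API_URL, json=payload, timeout=timeout)
    response.raise_for_status()
    return response.json()

def extract_answer(output):
    # Flowise responses vary; fall back to the raw output if no "text" field exists
    if isinstance(output, dict) and "text" in output:
        return output["text"]
    return str(output)
```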