AIAnytime / Llama2-Medical-Chatbot

This is a medical bot built using Llama 2 and Sentence Transformers. The bot is powered by LangChain and Chainlit, and runs on a decent CPU machine with a minimum of 16 GB of RAM.

I get double output if I use this code, don't know how to fix this? #13

Open shivamehta opened 11 months ago

shivamehta commented 11 months ago

(screenshot showing the duplicated bot response)

This is the code copied from your repo:

```python
@cl.on_chat_start
async def start():
    chain = qa_bot()
    msg = cl.Message(content="Starting the bot...")
    await msg.send()
    msg.content = "Hi, Welcome to School Assist Bot. What is your query?"
    await msg.update()

    cl.user_session.set("chain", chain)

@cl.on_message
async def main(message):
    chain = cl.user_session.get("chain")
    cb = cl.AsyncLangchainCallbackHandler(
        stream_final_answer=True, answer_prefix_tokens=["FINAL", "ANSWER"]
    )
    cb.answer_reached = True
    res = await chain.acall(message, callbacks=[cb])
    answer = res["result"]
    print(answer)

    sources = res["source_documents"]

    # if sources:
    #     answer += f"\nSources:" + str(sources)
    #     #pass
    # else:
    #     answer += "\nNo sources found"

    await cl.Message(content=answer).send()
```
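
For what it's worth, the duplicate reply most likely comes from combining `stream_final_answer=True` / `cb.answer_reached = True` (which makes the callback handler stream the final answer as its own message) with the explicit `await cl.Message(content=answer).send()` at the end, which sends the same text a second time. Below is a minimal sketch of one way to guard against that, assuming an older Chainlit release where `AsyncLangchainCallbackHandler` exposes `has_streamed_final_answer` and `final_stream` (attribute names may differ in your version); it reuses the `chain` stored in the session by the original `on_chat_start`:

```python
import chainlit as cl

@cl.on_message
async def main(message):
    chain = cl.user_session.get("chain")
    cb = cl.AsyncLangchainCallbackHandler(
        stream_final_answer=True, answer_prefix_tokens=["FINAL", "ANSWER"]
    )
    cb.answer_reached = True
    res = await chain.acall(message, callbacks=[cb])
    answer = res["result"]

    # Assumption: the handler tracks whether it already streamed the final
    # answer. Only send an explicit message when it did not, so the answer
    # is not emitted twice.
    if getattr(cb, "has_streamed_final_answer", False):
        # The streamed message already contains the answer; just finalize it.
        await cb.final_stream.update()
    else:
        await cl.Message(content=answer).send()
```

Alternatively, dropping `stream_final_answer=True` and `cb.answer_reached = True` altogether leaves only the explicit `cl.Message(...).send()` as the single source of output, which also avoids the duplication.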
maksood5010 commented 10 months ago

I also got this issue. Did anyone find a solution?

In the terminal I am getting this error:

Error in LangchainTracer.on_llm_end callback: ValueError('too many values to unpack (expected 2)')

@AIAnytime