Open ZephryLiang opened 1 week ago
@ZephryLiang Refer to this documentation if you want to build your own custom chat model: https://github.com/langchain-ai/langchain/blob/master/docs/docs/how_to/custom_chat_model.ipynb
The how-to notebooks are also accessible via the Python docs site: https://python.langchain.com/v0.2/docs/how_to/custom_chat_model/
Feel free to leave feedback through it as well.
You are so nice! But it seems difficult for me to write the code. Could you give me some demos based on this notebook? I want to load a local model with:
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")
Thanks a lot, but it is difficult for me to write a complete demo from this notebook. Could you give me some demos? I want to use a pretrained model as a chat model.
URL
https://python.langchain.com/v0.2/docs/tutorials/rag/
Issue with current documentation:
https://python.langchain.com/v0.2/docs/tutorials/rag/#retrieval-and-generation-generate — the docs say any LangChain LLM or ChatModel could be substituted in. So where can I find models other than the ones mentioned in the doc? I want to use a local model, like:
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3", device_map="auto")
Idea or request for content:
As a beginner, I don't know the difference between a ChatModel and a model loaded with from_pretrained, but one produces correct output and the other raises an error.