Closed: Roman12322 closed this issue 2 weeks ago
Would you consider adding support for a user-supplied custom prompt for RAG requests, so it could be passed directly to LightRAG like below?
```python
from transformers import AutoModel, AutoTokenizer

rag = LightRAG(
    working_dir=WORKING_DIR,
    llm_model_func=llm_model_func,
    rag_prompt="SMTH_USER_WANT_HERE_TO_BE",  # proposed new parameter
    embedding_func=EmbeddingFunc(
        embedding_dim=384,
        max_token_size=5000,
        func=lambda texts: hf_embedding(
            texts,
            tokenizer=AutoTokenizer.from_pretrained("sentence-transformers/all-MiniLM-L6-v2"),
            embed_model=AutoModel.from_pretrained("sentence-transformers/all-MiniLM-L6-v2"),
        ),
    ),
)
```
I would love to see this, because otherwise I have no option but to bring the whole LightRAG repo into my project. =)
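In the meantime, a possible workaround (just a sketch, assuming LightRAG keeps its prompt templates in a `PROMPTS` dict in `lightrag/prompt.py` with a `"rag_response"` key; both the module path and the key name are my assumptions, not a confirmed API) would be to patch the template before querying:

```python
# Workaround sketch, not a confirmed LightRAG API: override the answer-generation
# template in place before running queries. The import path and key are assumptions.
from lightrag.prompt import PROMPTS

PROMPTS["rag_response"] = """SMTH_USER_WANT_HERE_TO_BE

---Knowledge Base---
{context_data}
"""
# Keep any format placeholders the original template expects (e.g. {context_data}),
# so the retrieved context is still injected when the prompt is formatted.
```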
Hi, this is supported in this repo here. Just use the domain to customize the prompt.