ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

Use with personal context #95

Open Victordeleusse opened 3 months ago

Victordeleusse commented 3 months ago

Hello, I am trying to implement a way to question PDFs locally and get answers based only on data from the docs. I have already found a way to embed the data into a vector DB (using Chroma) and then retrieve the most relevant data for my query with a "similarity_search". I would now like to find a way to give this context to my model so it generates an answer from it, maybe by passing it in the prompt of the generate call?


```python
query = "What is the date of the start of the battle?"
docs = db.similarity_search(query)
print(docs[0].page_content)

llm = Ollama(
    model=llm_model_name,
    callbacks=[StreamingStdOutCallbackHandler()],
)
my_retriever = db.as_retriever(search_kwargs={"k": 8})

# Pass the retrieved chunks to the model as context in the prompt
context = "\n\n".join(doc.page_content for doc in docs)
response = ollama.generate(
    model=llm_model_name,
    prompt=f"Using this context:\n{context}\n\nAnswer this question: {query}",
)
```
Thank you for your help!
chandansp27 commented 2 months ago

Refer to this blog; it has a basic code example that should help with your issue. Feel free to reach out if you need further help.

https://ollama.com/blog/embedding-models
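For reference, the retrieval-then-generate flow from that blog post can be sketched roughly as below. This is a minimal sketch, not a drop-in solution: it assumes a running Ollama server, the `ollama` and `chromadb` packages, and an already-populated Chroma collection; the model names (`mxbai-embed-large`, `llama3`) and the `build_prompt`/`answer` helpers are illustrative choices, not part of the original question.

```python
def build_prompt(context_chunks, question):
    """Join retrieved document chunks into a single grounded prompt."""
    context = "\n\n".join(context_chunks)
    return (
        f"Using only this context:\n{context}\n\n"
        f"Answer this question: {question}"
    )

def answer(question, collection, embed_model="mxbai-embed-large", llm_model="llama3"):
    # Imported here so the prompt helper above stays dependency-free;
    # this part requires the `ollama` package and a running Ollama server.
    import ollama

    # Embed the question with the same model used to embed the documents.
    q_emb = ollama.embeddings(model=embed_model, prompt=question)["embedding"]

    # Retrieve the most similar stored chunks from the Chroma collection.
    results = collection.query(query_embeddings=[q_emb], n_results=4)
    chunks = results["documents"][0]

    # Generate an answer grounded in the retrieved context.
    resp = ollama.generate(model=llm_model, prompt=build_prompt(chunks, question))
    return resp["response"]
```

The key point for the original question is the `build_prompt` step: the retrieved text is simply concatenated into the `prompt` argument of `generate`, so the model answers from that context rather than from its own training data alone.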