FahaxikiHoney opened 1 month ago
How can I make the specific change? Is there a tutorial available?
The documentation provides examples and guidelines for switching to a different large language model. Here is an example using a locally hosted LLM:
```python
from paperqa import Settings, ask

local_llm_config = dict(
    model_list=[
        dict(
            model_name="llama3",
            litellm_params=dict(
                model="llama3",
                api_base="http://localhost:8080/v1",
                api_key="sk-no-key-required",
                temperature=0.1,
                frequency_penalty=1.5,
                max_tokens=512,
            ),
        )
    ]
)

answer = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=Settings(
        llm="llama3",
        llm_config=local_llm_config,
        summary_llm="llama3",
        summary_llm_config=local_llm_config,
    ),
)
```
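The `api_base` in the example above assumes an OpenAI-compatible server (for example a llama.cpp or vLLM instance) is already running at `http://localhost:8080/v1` and serving the model locally. If the model is hosted through Ollama instead, LiteLLM's provider prefix can point PaperQA2 at it; a minimal sketch, assuming a local Ollama daemon with llama3 already pulled via `ollama pull llama3`:

```python
from paperqa import Settings, ask

# Assumes a local Ollama daemon is running with the llama3 model pulled;
# LiteLLM resolves the "ollama/" prefix and talks to Ollama's default port.
local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3",
            "litellm_params": {
                "model": "ollama/llama3",
                "api_base": "http://localhost:11434",
            },
        }
    ]
}

answer = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=Settings(
        llm="ollama/llama3",
        llm_config=local_llm_config,
        summary_llm="ollama/llama3",
        summary_llm_config=local_llm_config,
    ),
)
```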
Additionally, the documentation provides guidelines for changing the embedding model:
```python
from paperqa import Settings, ask

answer = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=Settings(embedding="text-embedding-3-large"),
)
```
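The `embedding` setting also accepts LiteLLM-style model names, so a locally served embedding model works the same way; a minimal sketch, again assuming a local Ollama instance with the `mxbai-embed-large` embedding model already pulled:

```python
from paperqa import Settings, ask

# Assumes a local Ollama instance with mxbai-embed-large already pulled.
answer = ask(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=Settings(embedding="ollama/mxbai-embed-large"),
)
```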
And for adding documents manually with a custom embedding model:
```python
from paperqa import Docs, Settings

docs = Docs()
for doc in ("myfile.pdf", "myotherfile.pdf"):
    docs.add(doc, settings=Settings(embedding="text-embedding-3-large"))
```
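When documents are embedded with a non-default model, the same embedding setting should be used at query time so the question is embedded into the same vector space as the index; a minimal sketch reusing one `Settings` object for both steps, assuming `Docs.query` accepts `settings` the same way `Docs.add` does:

```python
from paperqa import Docs, Settings

settings = Settings(embedding="text-embedding-3-large")

docs = Docs()
for doc in ("myfile.pdf", "myotherfile.pdf"):
    docs.add(doc, settings=settings)

# Query with the same settings so the query vector matches the index.
answer = docs.query(
    "What manufacturing challenges are unique to bispecific antibodies?",
    settings=settings,
)
```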
These examples illustrate how to configure and switch between different LLMs and embedding models in PaperQA2 [1].
Do I need to download the large model locally?
Does this project support changing the large model to llama3?