When doing this:

    qa = RetrievalQA.from_chain_type(llm = model, chain_type = 'stuff', retriever = retriever)

I get this error:

    Traceback (most recent call last):
      File "ask-doc.py", line 76, in <module>
        qa = RetrievalQA.from_chain_type(llm = model, chain_type = 'stuff', retriever = retriever)
      File "/home/ia/.local/lib/python3.8/site-packages/langchain/chains/retrieval_qa/base.py", line 100, in from_chain_type
        combine_documents_chain = load_qa_chain(
      File "/home/ia/.local/lib/python3.8/site-packages/langchain/chains/question_answering/__init__.py", line 249, in load_qa_chain
        return loader_mapping[chain_type](
      File "/home/ia/.local/lib/python3.8/site-packages/langchain/chains/question_answering/__init__.py", line 73, in _load_stuff_chain
        llm_chain = LLMChain(
      File "/home/ia/.local/lib/python3.8/site-packages/langchain/load/serializable.py", line 74, in __init__
        super().__init__(**kwargs)
      File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
    pydantic.error_wrappers.ValidationError: 1 validation error for LLMChain
    llm
      value is not a valid dict (type=type_error.dict)

When trying with a pipeline:

    pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, max_length=256, temperature=0.7, top_p=0.95, repetition_penalty=1.15)

I get this:

    Traceback (most recent call last):
      File "ask-doc.py", line 77, in <module>
        pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, max_length=256, temperature=0.7, top_p=0.95, repetition_penalty=1.15)
      File "/home/ia/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
        framework, model = infer_framework_load_model(
      File "/home/ia/.local/lib/python3.8/site-packages/transformers/pipelines/base.py", line 281, in infer_framework_load_model
        framework = infer_framework(model.__class__)
      File "/home/ia/.local/lib/python3.8/site-packages/transformers/utils/generic.py", line 583, in infer_framework
        raise TypeError(f"Could not infer framework from class {model_class}.")
    TypeError: Could not infer framework from class <class 'hf_hub_ctranslate2.translate.GeneratorCT2fromHfHub'>.