benman1 / generative_ai_with_langchain

Build large language model (LLM) apps with Python, ChatGPT and other models. This is the companion repository for the book on generative AI with LangChain.
https://amzn.to/43PuIkQ
MIT License
552 stars 219 forks

LangChain: 1 validation error for LLMChain - value is not a valid dict (type=type_error.dict) #26

Closed raniat123 closed 4 months ago

raniat123 commented 4 months ago

Hello, I'm truly enjoying this book. Thank you for such a well-written book! I was executing the code in Chapter 3, page 86 (local models), but I get this error: LangChain: 1 validation error for LLMChain - value is not a valid dict (type=type_error.dict). It's related to the following line: llm_chain = LLMChain(prompt=prompt, llm=generate_text). I found a related post here. That post explains that the error was caused by a mistake in the import. Do you think it's the same problem in this situation? Here's what I executed:

from transformers import pipeline
import torch

# Load dlite-v1-355m as a local text-generation pipeline
generate_text = pipeline(
    model="aisquared/dlite-v1-355m",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
    framework="pt",
)

Then this:

from langchain import PromptTemplate, LLMChain

template = """Question: {question}
Answer: Let's think step by step"""
prompt = PromptTemplate(template=template, input_variables=["question"])

# This is the line that raises the validation error
llm_chain = LLMChain(prompt=prompt, llm=generate_text)

question = "what is electroencephalography?"
print(llm_chain.run(question))
benman1 commented 4 months ago

Hey @raniat123! First of all, I am delighted you like the book. About this issue, I am a bit confused about why you closed it. Have you found a solution? If so, do you mind sharing it here?

raniat123 commented 4 months ago

Hi Ben,

I closed the issue because I noticed someone else already posted about it. Sorry for the inconvenience!

No, I haven't found a solution yet!

Very nice book! I'm one of those people who likes to learn through books; I can't grasp information from video tutorials :) and I found your book to be very helpful!

benman1 commented 4 months ago

Hi @raniat123, I think we've already talked about this on the Discord server. The link looks very relevant; however, that doesn't seem to be the problem, as I can run the code without any issue. Let me take the discussion here, since it will be available to more people:

This could be a problem with the pydantic version that you have installed. Could you please check that it's version 1.10.13?
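
If it helps, here's a quick way to check from the same environment/kernel you run the notebook in (just a sketch):

import pydantic
print(pydantic.VERSION)  # the book's code expects a 1.10.x release such as 1.10.13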

When I just reran the code, I came across one issue as well, but it's unrelated: I had a problem with the LLM output. After changing the llm like this:

llm = ChatOpenAI(model_name="gpt-3.5-turbo")

I got it to work. I've changed this in the information extraction module.
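
For reference, the full chain then looks roughly like this (a minimal sketch; it assumes an OPENAI_API_KEY is set in your environment):

from langchain import PromptTemplate, LLMChain
from langchain.chat_models import ChatOpenAI

template = """Question: {question}
Answer: Let's think step by step"""
prompt = PromptTemplate(template=template, input_variables=["question"])

# Swap the local transformers pipeline for the hosted chat model
llm = ChatOpenAI(model_name="gpt-3.5-turbo")
llm_chain = LLMChain(prompt=prompt, llm=llm)

print(llm_chain.run("what is electroencephalography?"))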