Closed: ArslanKAS closed this issue 7 months ago
🤖
Hi @ArslanKAS! Good to see you again. I hope you're doing well. Let's dive into this deprecation warning issue you've encountered.
Based on the information you've provided, it seems like you've already replaced all instances of run with invoke in your code, which is the correct step to take in response to the deprecation warning. However, the warning is still appearing, which suggests that the deprecated __call__ method might be used somewhere else in your code or in the libraries you're using.
Unfortunately, without the exact location or the function that is causing the LangChainDeprecationWarning, it's hard to provide a precise solution. The warning message usually includes the file and line number where the deprecated function is called. If you could provide that information, it would be easier to pinpoint the issue.
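If the warning itself doesn't show where it's coming from, one way to surface the exact call site is to escalate the warning into an error so Python prints a full traceback pointing at the offending line. This is a general Python technique rather than anything LangChain-specific; a minimal sketch, assuming LangChainDeprecationWarning can be imported from langchain_core._api:
import warnings
from langchain_core._api import LangChainDeprecationWarning  # assumption: the warning class is exported from this module

# Turn the deprecation warning into an exception so the first deprecated call
# raises, and its traceback shows the file and line still using __call__ / run.
warnings.filterwarnings("error", category=LangChainDeprecationWarning)

# ... then run your normal code (your chain, summarizer, etc.) here.
Remember to remove the filter again once you've located and fixed the call.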
In the meantime, you can try to search for __call__ in your code and in the LangChain libraries you're using. If you find any instance of __call__, try to replace it with invoke or the appropriate method suggested in the deprecation warning.
Here's a general example of how you can do it:
# This is your original code
# obj is an instance of a class in LangChain
result = obj.__call__(arg1, arg2)
# This is how you can modify it
# alternative_call is the alternative function for __call__
result = obj.alternative_call(arg1, arg2)
Please replace alternative_call with the actual alternative function for __call__.
If you're still having trouble, please provide more information about the warning, such as the exact file and line number where the deprecated function is called. This will help in providing a more accurate solution.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
I think you need to use invoke here as well:
summary_refine = refine_chain({"input_documents": docs}, return_only_outputs=True)
e.g.
summary_refine = refine_chain.invoke({"input_documents": docs})
Can you try that and let me know if that fixes it?
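One detail to keep in mind when dropping return_only_outputs=True: invoke returns a dictionary keyed by the chain's input and output variables, so you read the generated text from the chain's output key. A minimal sketch, assuming the summarize chain's output key is "output_text" (the same key the document-QA code later in this thread reads):
summary_refine = refine_chain.invoke({"input_documents": docs})
# The result is a dict of inputs plus outputs; the summary itself lives
# under the chain's output key.
summary_text = summary_refine["output_text"]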
I was seeing a similar issue and that fix worked for me. @ArbazBamboowala, try the change Harrison suggested above.
Thank you so much Sir Harrison. That worked. Just wanted to ask if these changes would be happening very often because we're using LangChain a lot with OpenAI for our products and if LangChain is going to change the method calls etc every now and then, then please can you make it simpler so we may change it easily in the entire project? Thanks once again.
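One way to insulate a larger codebase from this kind of API churn (this is just a pattern, not something LangChain provides) is to route every chain call through one small helper, so a future rename only has to be changed in one place. A minimal sketch, using a hypothetical helper named run_chain:
# Hypothetical helper: centralize chain invocation so future API changes
# (e.g. run/__call__ -> invoke) only need to be updated in this one spot.
def run_chain(chain, inputs: dict) -> dict:
    return chain.invoke(inputs)

# Everywhere else in the project:
# response = run_chain(chain, {"input_documents": docs, "question": user_question})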
As Sir Harrison pointed out, you may need to change this:
response = chain( {"input_documents":docs, "question": user_question} , return_only_outputs=True)
To this:
response = chain.invoke( {"input_documents":docs, "question": user_question} )
This is what I did as well, and it worked for me. @ArbazBamboowala.
Thank you so much, it worked for me :) Please can y'all star my repositories? I am still looking for a job and it would really help me with each one. I have been working hard on creating projects. Thanks a lot, appreciate your help!!
Worked like a charm, thanks!
Well, I got stuck on another issue: I added some login functionality and now I get a blank page after running my Streamlit application. It works without the login code:
import streamlit as st
from PyPDF2 import PdfReader
from langchain.text_splitter import RecursiveCharacterTextSplitter
import os
from langchain_google_genai import GoogleGenerativeAIEmbeddings
import google.generativeai as genai
from langchain_community.vectorstores import FAISS
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.chains.question_answering import load_qa_chain
from langchain.prompts import PromptTemplate
from dotenv import load_dotenv
from googletrans import Translator
import pickle
from pathlib import Path
import streamlit_authenticator as stauth
# import pandas as pd

load_dotenv()
os.getenv("GOOGLE_API_KEY")
genai.configure(api_key=os.getenv("GOOGLE_API_KEY"))


def get_pdf_text(pdf_docs):
    text = ""
    for pdf in pdf_docs:
        pdf_reader = PdfReader(pdf)
        for page in pdf_reader.pages:
            text += page.extract_text()
    return text


def get_text_chunks(text):
    text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
    chunks = text_splitter.split_text(text)
    return chunks


def get_vector_store(text_chunks):
    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    vector_store = FAISS.from_texts(text_chunks, embedding=embeddings)
    vector_store.save_local("faiss_index")


def get_conversational_chain():
    prompt_template = """
    Answer the question as detailed as possible from the provided context, make sure to provide all the details, if the answer is not in
    provided context just say, "answer is not available in the context", don't provide the wrong answer\n\n
    Context:\n {context}?\n
    Question: \n{question}\n

    Answer:
    """
    model = ChatGoogleGenerativeAI(model="gemini-pro", temperature=0.3)
    prompt = PromptTemplate(template=prompt_template, input_variables=["context", "question"])
    chain = load_qa_chain(model, chain_type="stuff", prompt=prompt)
    return chain


def user_input(user_question):
    embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")
    translator = Translator()
    new_db = FAISS.load_local("faiss_index", embeddings)
    docs = new_db.similarity_search(user_question)
    chain = get_conversational_chain()
    response = chain.invoke({"input_documents": docs, "question": user_question})
    # print(response)
    # st.write("Reply: ", response["output_text"])
    response_text_english = response["output_text"]
    response_text_original_language = translator.translate(
        response_text_english,
        dest=translator.detect(response_text_english).lang).text
    print(response_text_original_language)
    st.write("Reply: ", response_text_original_language)


def main():
    st.write("Debug Point 1")
    st.set_page_config(page_title="Chat PDF", page_icon=':male-technologist:', initial_sidebar_state='collapsed')

    # --- USER AUTHENTICATION ---
    names = ["Peter Parker", "Rebecca Miller"]
    usernames = ["pparker", "rmiller"]

    # load hashed passwords
    file_path = Path(__file__).parent / "hashed_dw.pkl"
    with file_path.open("rb") as file:
        hashed_passwords = pickle.load(file)

    authenticator = stauth.Authenticate(names, usernames, hashed_passwords, "CHATBOT", "abcdef", cookie_expiry_days=30)
    name, authentication_status, usernames = authenticator.login("Login", "main")

    if authentication_status == False:
        st.error("Username/password is incorrect")

    if authentication_status == None:
        st.warning("Please enter your username and password")

    if authentication_status:
        st.write("Debug Point 4")
        st.header("Chat with PDF in Any Language you want")
        user_question = st.text_input("Ask a Question from the PDF Files")
        st.write("Debug Point 5")

        if user_question:
            user_input(user_question)

        with st.sidebar:
            authenticator.logout("Logout Now", "sidebar")
            st.sidebar.title(f"Welcome {name} to my chatbot")
            pdf_docs = st.file_uploader("Upload your PDF Files to chat with me and Click on the Submit & Process Button", accept_multiple_files=True)
            if st.button("Submit & Process, Thinking..."):
                with st.spinner("Taking time for thinking..."):
                    raw_text = get_pdf_text(pdf_docs)
                    text_chunks = get_text_chunks(raw_text)
                    get_vector_store(text_chunks)
                    st.success("Done")


if __name__ == "__main__":
    main()
I apologize if this shouldn't be posted here, but I was having the same issue with a PDF reader application that didn't have the response = chain.invoke section; the same fix worked for mine with:
result = qa_chain.invoke({"query": query})
In case anyone stumbles on this thread (like myself!)
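For anyone landing here with a RetrievalQA-style chain rather than load_qa_chain, the same pattern applies. A minimal sketch, assuming an llm and a FAISS vector store already exist, and that the chain's default output key is "result":
from langchain.chains import RetrievalQA

# Assumes `llm` and `vector_store` (e.g. a FAISS index) are already set up.
qa_chain = RetrievalQA.from_chain_type(llm=llm, retriever=vector_store.as_retriever())

query = "What does the document say about pricing?"  # example question
result = qa_chain.invoke({"query": query})   # replaces qa_chain({"query": query})
answer = result["result"]                    # RetrievalQA's default output key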
Issue you'd like to raise.
Everything was working fine, but now suddenly I'm receiving all sorts of LangChain deprecation issues. I installed the langchain_openai package and the langchain_community package and replaced all the imports with the ones suggested in the error. That went well, but now I'm stuck on this issue.
The error is:
/home/ec2-user/anaconda3/envs/JupyterSystemEnv/lib/python3.10/site-packages/langchain_core/_api/deprecation.py:115: LangChainDeprecationWarning: The function __call__ was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead. warn_deprecated(
In my code I've replaced every "run" with "invoke", but I don't know why this warning is still coming up. I'm also using a LangChain summarizer, and I checked the documentation; my code matches it exactly. I don't know how to get rid of that deprecation warning now. I don't want to suppress the warning, I want to resolve it so it won't cause any issue in the future.
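For the summarizer specifically, the change that ends up resolving this (per the discussion above) is the same run/__call__-to-invoke switch. A minimal sketch of that pattern, not the original code from this issue, assuming load_summarize_chain with the refine chain type and an existing llm and list of docs:
from langchain.chains.summarize import load_summarize_chain

# Assumes `llm` and `docs` (a list of Document objects) already exist.
refine_chain = load_summarize_chain(llm, chain_type="refine")

# Old style, which triggers the __call__ deprecation warning:
# summary = refine_chain({"input_documents": docs}, return_only_outputs=True)

# New style:
summary = refine_chain.invoke({"input_documents": docs})
print(summary["output_text"])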
This is the only code I have that's related to LangChain: