jina-ai / langchain-serve

⚡ Langchain apps in production using Jina & FastAPI
https://cloud.jina.ai
Apache License 2.0
1.61k stars 138 forks

[Bug] Calling a langchain function in a separate util file causes "Error: maximum recursion depth exceeded" #94

Closed jpzhangvincent closed 1 year ago

jpzhangvincent commented 1 year ago

I'm using the LLMSummarizationCheckerChain and put it into a separate util function. When I use this pattern of code and run `lc-serve deploy local test_api`, I get "Error: maximum recursion depth exceeded" when testing the endpoint in the Swagger docs with the example text "Mammals can lay eggs, birds can lay eggs, therefore birds are mammals".

# test_api.py
import os
from loguru import logger
from lcserve import serving
from test_util import get_fact_check

@serving
def get_fact_check(query: str, **kwargs) -> str:
    #logger.info("Query:", query)
    check_output = get_fact_check(query)
    return check_output

# test_util.py
import os
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMSummarizationCheckerChain

def get_fact_check(query):
    openai_api_key = os.environ['OPENAI_API_KEY']
    llm = ChatOpenAI(openai_api_key=openai_api_key, temperature=0)
    checker_chain = LLMSummarizationCheckerChain.from_llm(llm, max_checks=2, verbose=True)
    check_output = checker_chain.run(query)
    return check_output

Note that if I put the logic of the get_fact_check function directly into test_api.py:

# test_api.py
import os
from loguru import logger
from lcserve import serving
from langchain.chat_models import ChatOpenAI
from langchain.chains import LLMSummarizationCheckerChain

@serving
def get_fact_check(query: str, **kwargs) -> str:
    logger.info("Query: {}", query)
    openai_api_key = os.environ['OPENAI_API_KEY']
    llm = ChatOpenAI(openai_api_key=openai_api_key, temperature=0)
    checker_chain = LLMSummarizationCheckerChain.from_llm(llm, max_checks=2, verbose=True)
    check_output = checker_chain.run(query)
    return check_output

somehow it works, which is even more confusing...

I'm wondering whether the langchain-serve has some internal scoping conflict with LLMSummarizationCheckerChain during the iteration.
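This isn't specific to langchain-serve; it's plain Python name shadowing. A minimal sketch of the mechanism, with illustrative stand-in names instead of the langchain imports:

```python
# Stand-in for the function defined in test_util.py
def imported_get_fact_check(query):
    return f"checked: {query}"

# `from test_util import get_fact_check` binds the name at module scope:
get_fact_check = imported_get_fact_check

# The endpoint definition then rebinds the *same* name:
def get_fact_check(query):
    # Looked up at call time, `get_fact_check` now refers to this very
    # function, so the call recurses until Python gives up.
    return get_fact_check(query)

error = None
try:
    get_fact_check("Mammals can lay eggs")
except RecursionError as exc:
    error = str(exc)

print(error)
```

Because Python resolves the name inside the function body at call time, not at definition time, the endpoint ends up calling itself instead of the imported util function.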

Error:

(screenshot: traceback ending in "Error: maximum recursion depth exceeded")

deepankarm commented 1 year ago

Hey @jpzhangvincent, in the following code `get_fact_check` has a scope conflict: the name imported from test_util and the name of the function under the `@serving` decorator collide, so the call inside the function resolves to the function itself, hence the "maximum recursion depth exceeded" error.

# test_api.py
import os
from loguru import logger
from lcserve import serving
from test_util import get_fact_check

@serving
def get_fact_check(query: str, **kwargs) -> str:
    #logger.info("Query:", query)
    check_output = get_fact_check(query)
    return check_output

Please retry after changing it to

# test_api.py
import os
from loguru import logger
from lcserve import serving
from test_util import get_fact_check as _get_fact_check

@serving
def get_fact_check(query: str, **kwargs) -> str:
    #logger.info("Query:", query)
    check_output = _get_fact_check(query)
    return check_output
jpzhangvincent commented 1 year ago

Ah got it! Thanks!

jpzhangvincent commented 1 year ago

You can close the issue!