langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Question about OpenAPI chain API auth #3190

Closed adamjq closed 9 months ago

adamjq commented 1 year ago

I'm trying to add a tool with the OpenAPI chain, and I'm struggling to get API auth working.

A bit about my use case:

I've tried the load_from_spec option, but it reads the base URL from the OpenAPI spec. All the examples in the docs are for public, unauthenticated API calls as well.

I'd be happy to make a PR to update the docs if this functionality is supported but undocumented, or even try updating the OpenAPI tool if you can point me in the right direction.
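
For context, the kind of call I'm trying to get working looks roughly like this (the spec URL, endpoint path, and token are placeholders, not a real setup):

from langchain.chains import OpenAPIEndpointChain
from langchain.llms import OpenAI
from langchain.requests import Requests
from langchain.tools import APIOperation, OpenAPISpec

# Load the spec and pick a single operation; the base URL for the eventual
# HTTP call is taken from the servers entry in the spec.
spec = OpenAPISpec.from_url("https://example.com/openapi.json")
operation = APIOperation.from_openapi_spec(spec, "/some/endpoint", "get")

# A Requests wrapper can carry auth headers, but I can't find docs confirming
# this is the supported way to authenticate the calls the chain makes.
authed_requests = Requests(headers={"Authorization": "Bearer <my-api-token>"})

chain = OpenAPIEndpointChain.from_api_operation(
    operation,
    OpenAI(temperature=0),
    requests=authed_requests,
    verbose=True,
)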

ad3sai commented 1 year ago

@adamjq I have the same use case. Did you find any solution?

adamjq commented 1 year ago

@ad3sai I haven't tried again since raising this issue. I'm not sure if there's been a more recent update to address this

mbenachour commented 1 year ago

@adamjq I'm facing the same issue. Were you able to locate where the call is made in the library?

ian-k commented 1 year ago

This is supposed to work: APIOperation.from_openapi_spec has a 'requests' parameter, so all you need to do is pass in a customized Requests object:

custom_requests = Requests()
custom_requests.headers = {"Custom-Header": "Custom-Value"}

APIOperation.from_openapi_spec(..., requests=custom_requests)
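
(If your version of APIOperation doesn't expose that parameter, the same Requests object can also be handed to the chain itself; OpenAPIEndpointChain.from_api_operation accepts a requests argument. Untested sketch, assuming operation and llm already exist:)

from langchain.chains import OpenAPIEndpointChain
from langchain.requests import Requests

# operation is an existing APIOperation and llm an existing language model.
custom_requests = Requests(headers={"Custom-Header": "Custom-Value"})
chain = OpenAPIEndpointChain.from_api_operation(operation, llm, requests=custom_requests)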

hiradha commented 1 year ago

@adamjq I faced the same issue, so I created my own class methods for OpenAPIEndpointChain. See how the base URL is changed and how the request template is customized for Llama 2 in the code below.


# Imports below assume the pre-0.1 "langchain" package layout that was current at the time.
from typing import Any, Optional

from langchain.base_language import BaseLanguageModel
from langchain.callbacks.manager import Callbacks
from langchain.chains.api.openapi.chain import OpenAPIEndpointChain, _ParamMapping
from langchain.chains.api.openapi.prompts import REQUEST_TEMPLATE
from langchain.chains.api.openapi.requests_chain import APIRequesterOutputParser
from langchain.chains.api.openapi.response_chain import APIResponderChain
from langchain.chains.llm import LLMChain
from langchain.prompts import PromptTemplate
from langchain.requests import Requests
from langchain.tools.openapi.utils.api_models import APIOperation

# LLMConfig and LLAMA2_STANDARD_PROMPT_GENERATION_FORMAT are my own settings object and
# Llama 2 prompt wrapper; they are not part of langchain.


class LLama2OpenAPIEndpointChain(OpenAPIEndpointChain):
    @classmethod
    def from_url_and_method(
            cls,
            spec_url: str,
            path: str,
            method: str,
            llm: BaseLanguageModel,
            requests: Optional[Requests] = None,
            return_intermediate_steps: bool = False,
            **kwargs: Any
            # TODO: Handle async
    ) -> "OpenAPIEndpointChain":
        """Create an OpenAPIEndpoint from a spec at the specified url."""
        operation = APIOperation.from_openapi_url(spec_url, path, method)
        # Override the base URL that was read from the spec with a value from my own config.
        operation.base_url = LLMConfig.OPENAI_API_BASE
        return cls.from_api_operation(
            operation,
            requests=requests,
            llm=llm,
            return_intermediate_steps=return_intermediate_steps,
            **kwargs,
        )

    @classmethod
    def from_api_operation(
            cls,
            operation: APIOperation,
            llm: BaseLanguageModel,
            requests: Optional[Requests] = None,
            verbose: bool = False,
            return_intermediate_steps: bool = False,
            raw_response: bool = False,
            callbacks: Callbacks = None,
            **kwargs: Any
            # TODO: Handle async
    ) -> "OpenAPIEndpointChain":
        """Create an OpenAPIEndpointChain from an operation and a spec."""
        param_mapping = _ParamMapping(
            query_params=operation.query_params,
            body_params=operation.body_params,
            path_params=operation.path_params,
        )
        requests_chain = LLama2APIRequesterChain.from_llm_and_typescript(
            llm,
            typescript_definition=operation.to_typescript(),
            verbose=verbose,
            callbacks=callbacks,
        )
        if raw_response:
            response_chain = None
        else:
            response_chain = APIResponderChain.from_llm(
                llm, verbose=verbose, callbacks=callbacks
            )
        _requests = requests or Requests()
        return cls(
            api_request_chain=requests_chain,
            api_response_chain=response_chain,
            api_operation=operation,
            requests=_requests,
            param_mapping=param_mapping,
            verbose=verbose,
            return_intermediate_steps=return_intermediate_steps,
            callbacks=callbacks,
            **kwargs,
        )

class LLama2APIRequesterChain(LLMChain):
    """Get the request parser."""

    @classmethod
    def from_llm_and_typescript(
            cls,
            llm: BaseLanguageModel,
            typescript_definition: str,
            verbose: bool = True,
            **kwargs: Any,
    ) -> LLMChain:
        """Get the request parser."""
        output_parser = APIRequesterOutputParser()
        # Wrap LangChain's standard request prompt in my Llama 2 chat format
        # (LLAMA2_STANDARD_PROMPT_GENERATION_FORMAT contains a single {prompt} placeholder).
        prompt = PromptTemplate(
            template=LLAMA2_STANDARD_PROMPT_GENERATION_FORMAT.format(prompt=REQUEST_TEMPLATE),
            output_parser=output_parser,
            partial_variables={"schema": typescript_definition},
            input_variables=["instructions"],
        )
        return cls(prompt=prompt, llm=llm, verbose=verbose, **kwargs)
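
Usage then looks roughly like this (untested sketch; the spec URL, path, and token are placeholders, and ChatOpenAI stands in for whatever model you actually use):

from langchain.chat_models import ChatOpenAI
from langchain.requests import Requests

llm = ChatOpenAI(temperature=0)  # placeholder; any BaseLanguageModel works here
chain = LLama2OpenAPIEndpointChain.from_url_and_method(
    spec_url="https://example.com/openapi.json",  # placeholder spec URL
    path="/some/endpoint",                        # placeholder path
    method="get",
    llm=llm,
    requests=Requests(headers={"Authorization": "Bearer <token>"}),  # auth header goes here
)
result = chain.run("Call the endpoint and summarize the response")
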
dosubot[bot] commented 9 months ago

Hi, @adamjq,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. The issue you raised involves difficulties in implementing API authentication using the OpenAPI chain tool. Several users have shared their challenges and approaches, and there have been suggestions and a detailed example provided by another user. However, the issue remains unresolved.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to LangChain!

adamjq commented 9 months ago

I haven't used LangChain recently, so I'm not sure if the issue has been fixed in a more recent version. I'm closing the issue for now.

dosubot[bot] commented 9 months ago

Thank you, @adamjq, for closing this issue! Your contribution is much appreciated.