run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

feat[confluence]: Permit passing params to Confluence client #16961

Open rehevkor5 opened 1 day ago


Description

The LlamaIndex Confluence reader currently does not let developers use many of the parameters that the underlying Atlassian Confluence Python client supports, such as `timeout`, `backoff_and_retry`, and `max_backoff_seconds`. This change adds a `client_args` parameter whose contents are forwarded to the client, enabling those options.
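A minimal sketch of how the proposed `client_args` parameter might be used. The URL, token, and space key below are placeholders, and the exact `ConfluenceReader` constructor signature may differ by version; the forwarded keys (`timeout`, `backoff_and_retry`, `max_backoff_seconds`) are options of the `atlassian-python-api` Confluence client.

```python
# Extra keyword arguments forwarded verbatim to the underlying
# atlassian.Confluence client constructor via the new `client_args`
# parameter proposed in this PR.
client_args = {
    "timeout": 30,               # per-request timeout in seconds
    "backoff_and_retry": True,   # retry on rate-limit / transient errors
    "max_backoff_seconds": 120,  # cap on exponential backoff
}

# Hypothetical usage (placeholder URL/token; requires
# llama-index-readers-confluence to be installed):
#
# from llama_index.readers.confluence import ConfluenceReader
#
# reader = ConfluenceReader(
#     base_url="https://example.atlassian.net/wiki",
#     client_args=client_args,
# )
# documents = reader.load_data(space_key="DEMO")
```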

New Package?

Did I fill in the `tool.llamahub` section in the `pyproject.toml` and provide a detailed `README.md` for my new integration or package?

Version Bump?

Did I bump the version in the `pyproject.toml` file of the package I am updating? (Except for the `llama-index-core` package)

Type of Change

Please delete options that are not relevant.

How Has This Been Tested?

Your pull request will likely not be merged unless it is covered by some form of impactful unit testing.

Suggested Checklist: