openai-php / laravel

⚡️ OpenAI PHP for Laravel is a supercharged PHP API client that allows you to interact with the OpenAI API

Feature Request: Support for Chunk Ranking in File Search #121

Open aledc7 opened 1 month ago

aledc7 commented 1 month ago

I'm requesting support for chunk ranking in the file search tool when using openai-php/laravel. Currently, the file search returns all results it deems relevant, but this can lead to lower-quality responses if the model uses content with low relevance. It would be useful to adjust this behavior by enabling chunk ranking configuration in the file_search tool to ensure only highly relevant chunks are used.

The expected functionality would allow:

- Inspecting file search chunks: using parameters like include to retrieve the specific file chunks used during a response generation run.

- Configurable chunk ranking: adjusting settings such as:
  - ranker: which ranker to use, e.g. auto or default_2024_08_21.
  - score_threshold: a value between 0.0 and 1.0 used to filter file chunks based on their relevance score, improving the quality of responses.

For example, in the OpenAI API you can inspect the file chunks used during a run as follows (rough sketches of how both points might look in the PHP client follow the example):

from openai import OpenAI

client = OpenAI()

# Retrieve a single run step and ask the API to include the file search
# result contents, so the chunks the model actually used can be inspected.
run_step = client.beta.threads.runs.steps.retrieve(
    thread_id="thread_abc123",
    run_id="run_abc123",
    step_id="step_abc123",
    include=["step_details.tool_calls[*].file_search.results[*].content"]
)

This feature would significantly enhance the precision of responses generated from file searches. It would be great if this could be incorporated into future releases.

Thank you!