Open bgrove-s7 opened 2 months ago
What is the current limit?
It appears to be 40,000 characters, per the following error:
CreateTunedModelRequest.tuned_model.tuning_task.training_data.examples.examples[9].text_input: text_input is too long. The maximum character count accepted is 40000.
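Until the limit is raised, a client-side pre-flight check can catch oversized examples before the CreateTunedModel request is rejected. A minimal sketch, assuming the 40,000-character limit reported in the error above; the helper name and the example dict shape are illustrative, not part of the Gemini SDK:

```python
# Assumed limit, taken from the API error message above.
MAX_TEXT_INPUT_CHARS = 40_000

def oversized_example_indices(examples):
    """Return indices of training examples whose text_input exceeds the limit.

    `examples` is a list of dicts with a "text_input" key, mirroring the
    field named in the CreateTunedModelRequest error (shape is assumed).
    """
    return [
        i for i, ex in enumerate(examples)
        if len(ex["text_input"]) > MAX_TEXT_INPUT_CHARS
    ]

examples = [
    {"text_input": "short prompt", "output": "label"},
    {"text_input": "x" * 50_000, "output": "label"},  # exceeds 40,000 chars
]
print(oversized_example_indices(examples))  # → [1]
```

Checking lengths locally surfaces the offending example index up front, instead of discovering it one failed tuning request at a time.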
Description of the feature request:
Increase the character limit on tuning-job training examples to take advantage of Gemini 1.5 Flash's 1,000,000-token context window.
What problem are you trying to solve with this feature?
Gemini 1.5 Flash has a very large context window, which potentially makes it ideal for extracting needles from haystacks of text. We would like to fine-tune Gemini 1.5 Flash to perform this task for us. Tuning is appropriate because the content we typically examine runs to hundreds of thousands of tokens, leaving no room for multi-shot prompting techniques.
Any other information you'd like to share?
No response