Closed MushroomFleet closed 3 days ago
Mochi uses max_length 256, so that's expected. I'm not sure how best to handle it; auto-truncating is an option, but personally I feel raising an error is best, so the user knows their prompt would be truncated. For automated solutions, the prompt generation should be limited and/or the generators' outputs truncated.
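For an automated pipeline, the check described above could be sketched roughly like this. This is only an illustration: `tokenize` here is a whitespace-split stand-in, not the actual tokenizer Mochi's text encoder uses, and `check_or_truncate` is a hypothetical helper, not part of any existing node.

```python
# Sketch: guard a generated prompt against the 256-token limit before
# handing it to the model. `tokenize` is a placeholder -- swap in the
# real text-encoder tokenizer; a whitespace split is used for illustration.

MAX_TOKENS = 256  # the max_length reported for Mochi's text encoder


def tokenize(prompt: str) -> list[str]:
    # Placeholder tokenizer: real token counts will differ.
    return prompt.split()


def check_or_truncate(prompt: str, truncate: bool = False) -> str:
    """Raise on over-long prompts, or cut them down when truncate=True."""
    tokens = tokenize(prompt)
    if len(tokens) <= MAX_TOKENS:
        return prompt
    if truncate:
        # Silent truncation: the tail of the prompt is simply dropped.
        return " ".join(tokens[:MAX_TOKENS])
    # Erroring instead tells the user their prompt would be cut.
    raise ValueError(
        f"Prompt is {len(tokens)} tokens; the limit is {MAX_TOKENS}. "
        "Shorten it or enable truncation."
    )
```

Raising by default matches the behaviour argued for above, while `truncate=True` offers the automated-pipeline escape hatch.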
Big Thanks !!
I'm generating prompts with Vision and found that there is a 256-token limit. I've adjusted my Vision prompt for use with i2v, but I thought it worth reporting:
I know there are ways to split and concatenate long prompts, but that is above my pay grade :) I will try prompt scheduling next, to see if we can alter the prompt as we go through the step count.
Again, I'm not sure if this is Mochi itself, but it seemed worth reporting :)
Thanks again !