Describe the bug If I initialize a preprocessor from a preset, it does not respect the specified sequence length.
To Reproduce In keras-nlp==0.11.1, the preprocessor defaults to a sequence length of 512 regardless of the specified length:
Expected behavior In keras-nlp==0.8.2, the preprocessor would respect the specified sequence length.
Additional context In my case, this showed up as a large performance hit when migrating code to latest version. The performance penalty may be more subtle depending on the desired sequence length relative to the default value.
It seems the workaround is to override the sequence length after initialization.