This PR (if it's been done right) should not break any existing functionality. It adds a lot of new aliases for existing symbols, which going forward will be the main way we document them.
`Classifier` -> `TextClassifier` and `XXClassifier` -> `XXTextClassifier`.
Add new base classes `keras_nlp.models.TextClassifierPreprocessor`, `keras_nlp.models.CausalLMPreprocessor`, and `keras_nlp.models.MaskedLMPreprocessor`.
There's no need to update if you don't want to; we will continue to support the existing usages indefinitely. This is primarily to reduce confusion in our own docs; we want `ImageClassifier` and `TextClassifier` to have obviously distinct usage.
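For illustration, the aliasing itself can be as simple as binding the new name to the existing class, so both import paths resolve to the same object (a toy sketch, not the actual KerasNLP source; the real library routes this through its export decorators):

```python
class Classifier:
    """Stand-in for an existing task class; unchanged, so old code keeps working."""

    def __init__(self, num_classes):
        self.num_classes = num_classes


# New, more specific name documented going forward; the old name
# remains a fully supported alias.
TextClassifier = Classifier

old = Classifier(num_classes=2)
new = TextClassifier(num_classes=2)
assert type(old) is type(new)  # same class, two names
```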
These new base classes also enable a cross-model way of writing task code with split preprocessing:
```python
preprocessor = keras_nlp.models.TextClassifierPreprocessor.from_preset(
    "bert_base_en",
)
classifier = keras_nlp.models.TextClassifier.from_preset(
    "bert_base_en",
    num_classes=2,
)
# ... run preprocessing and training separately
```
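As a toy illustration of why the split matters (stand-in functions only, not the KerasNLP API): preprocessing can run once up front, e.g. inside a data pipeline, while the task model only ever sees already-preprocessed inputs.

```python
# Hypothetical stand-ins for a preprocessor and a task model.
def preprocessor(text):
    # Map each word to a fake token id, mimicking tokenization.
    return [hash(w) % 1000 for w in text.split()]


class ToyTextClassifier:
    def predict(self, token_ids):
        # Trivial rule standing in for a real forward pass.
        return len(token_ids) % 2


texts = ["hello world", "keras nlp is fun"]
# Preprocessing happens separately (in practice, mapped over a dataset)...
batch = [preprocessor(t) for t in texts]
# ...and the model consumes only the preprocessed inputs.
preds = [ToyTextClassifier().predict(ids) for ids in batch]
```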
As a follow-up, I think we can remove a lot of code by pushing common code onto base classes. But I will do that in a later PR to keep this one from getting too big.