msusol opened this issue 3 months ago (Open)
The variable name in the following assignment collides with the `datasets` library import, so later code that expects the module fails with:

> "TFDistilBertForTokenClassification requires the 🤗 Datasets library"

```python
datasets = load_dataset("conll2003")
...
tokenized_datasets = datasets.map(tokenize_and_align_labels, batched=True)
...
train_set = model.prepare_tf_dataset(
    tokenized_datasets["train"],
    shuffle=True,
    batch_size=batch_size,
    collate_fn=data_collator,
)
```
Fixed simply by renaming the variable so it no longer shadows the module:

```python
conll = load_dataset("conll2003")
...
tokenized_datasets = conll.map(tokenize_and_align_labels, batched=True)
```
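For illustration, a minimal sketch of the same collision using only the standard library (the `json` module here is just a stand-in for the `datasets` package): once a variable reuses an imported module's name, the module is no longer reachable by that name.

```python
# Reusing an imported module's name for a variable hides the module,
# just as `datasets = load_dataset(...)` hides the `datasets` package.
import json

assert str(type(json)) == "<class 'module'>"  # the imported module

json = {"split": "train"}  # assignment reuses the module's name

assert isinstance(json, dict)  # the module is no longer reachable as `json`
```

Library code that later checks whether the `datasets` module (or one of its classes) is available can therefore misfire even though the package is installed.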