allenai / unifiedqa

UnifiedQA: Crossing Format Boundaries With a Single QA System
https://arxiv.org/abs/2005.00700
Apache License 2.0
426 stars 43 forks

Using UnifiedQA T5-large: Getting an Error ("make_layer_stack" Param Not Found) #32

Closed ayush714 closed 2 years ago

ayush714 commented 2 years ago

I am getting this error when running the code below:

```
ValueError: Configurable 'make_layer_stack' doesn't have a parameter named 'use_universal_transformer'.
  In file "gs://unifiedqa/models/large/operative_config.gin", line 83
    decoder/make_layer_stack.use_universal_transformer = False
```

```python
import os
from time import time

import t5
import tensorflow as tf

MODEL_SIZE = "large"
BASE_PRETRAINED_DIR = "gs://unifiedqa/models/large"
PRETRAINED_DIR = BASE_PRETRAINED_DIR
# MODEL_DIR (the output directory for finetuned checkpoints) is defined
# earlier in my script.
MODEL_DIR = os.path.join(MODEL_DIR, MODEL_SIZE)

model_parallelism, train_batch_size, keep_checkpoint_max = {
    "small": (1, 256, 16),
    "base": (2, 128, 8),
    "large": (8, 64, 4),
    "3B": (8, 16, 1),
    "11B": (8, 16, 1)}[MODEL_SIZE]
tf.io.gfile.makedirs(MODEL_DIR)
ON_CLOUD = False
model = t5.models.MtfModel(
    model_dir=MODEL_DIR,
    tpu=None,
    model_parallelism=model_parallelism,
    batch_size=train_batch_size,
    sequence_length={"inputs": 128, "targets": 32},
    learning_rate_schedule=0.003,
    save_checkpoints_steps=5000,
    keep_checkpoint_max=keep_checkpoint_max if ON_CLOUD else None,
    iterations_per_loop=100,
)
FINETUNE_STEPS = 9

logInfo("Started Training the model")  # logInfo is my own logging helper
start = time()
model.finetune(
    mixture_or_task_name="qa_t5_meshs",
    pretrained_model_dir=PRETRAINED_DIR,
    finetune_steps=FINETUNE_STEPS
)
logInfo("Completed model training.", time_taken=time() - start)
```

How can I fix this?

I have seen one answer in the issues, but I don't understand what it is suggesting.
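One possible workaround (my suggestion, not confirmed in this thread): the error is raised because the checkpoint's `operative_config.gin` contains a binding that the installed mesh-tensorflow no longer recognizes, so the offending lines can be stripped from the config text before gin parses it. `UNSUPPORTED` and `filter_gin_config` below are hypothetical names, a minimal sketch rather than a tested fix:

```python
# Hypothetical workaround: drop gin binding lines that mention parameters
# the installed mesh-tensorflow no longer supports.
UNSUPPORTED = ("use_universal_transformer",)

def filter_gin_config(text):
    """Return the gin config text with unsupported binding lines removed."""
    kept = [line for line in text.splitlines()
            if not any(name in line for name in UNSUPPORTED)]
    return "\n".join(kept) + "\n"

# A two-line excerpt resembling the offending operative_config.gin
excerpt = (
    "decoder/make_layer_stack.use_universal_transformer = False\n"
    'MODEL_SIZE = "large"\n'
)
print(filter_gin_config(excerpt))
```

You would then write the filtered text back (e.g. to a local copy of the config) before loading the model.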
danyaljj commented 2 years ago

https://github.com/google-research/text-to-text-transfer-transformer/issues/180
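The linked issue attributes this gin error to a version mismatch between the installed mesh-tensorflow and the one that wrote the checkpoint's `operative_config.gin`. As a minimal sketch (assuming a pip-installed environment), you can first confirm which version is actually installed:

```python
# Check the installed mesh-tensorflow distribution version (Python 3.8+).
from importlib import metadata

def installed_version(package):
    """Return the installed distribution's version string, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

print(installed_version("mesh-tensorflow"))
```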

ayush714 commented 2 years ago

@danyaljj I have seen this, but when I use 0.1.12 it gives me other errors, and it does not resolve the issue!

ayush714 commented 2 years ago

I have tried most of the versions but I'm still getting the same error!