Closed ccyjava closed 2 years ago
Thanks for the feedback! We are currently investigating and will update you shortly.
@ccyjava Thanks for the feedback!
@lgayhardt - Can you confirm the behavior with the pipelines PM and update the document as needed?
Sorry for the inconvenience. Currently, v1 change-based schedules only support AzureBlobStorage. We will call this out in the documentation.
The doc has been updated with a note. Thanks!
I tried to create a change-based schedule, but it fails when I use ADLS as the datastore.
```python
datastore = Datastore(workspace=ws, name="adls_segrel11")
reactive_schedule = Schedule.create(
    ws,
    name="MyReactiveSchedule",
    description="Based on input file change.",
    pipeline_id=pipeline_id,
    experiment_name=experiment_name,
    datastore=datastore,
    data_path_parameter_name="local/AML_Temp/TTLShort/Trigger/data2")
```
```
Schedule.create(workspace, name, pipeline_id, experiment_name, recurrence, description,
                pipeline_parameters, wait_for_provisioning, wait_timeout, datastore,
                polling_interval, data_path_parameter_name, continue_on_step_failure,
                path_on_datastore, _workflow_provider, _service_endpoint)
    374 if datastore is not None:
    375     if datastore.datastore_type != 'AzureBlob':
--> 376         raise ValueError('Datastore must be of type AzureBlobDatastore')
    377     datastore_name = datastore.name
    379 from azureml.pipeline.core._graph_context import _GraphContext

ValueError: Datastore must be of type AzureBlobDatastore
```
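For context, the validation shown in the traceback can be reproduced in plain Python. The `Datastore` stand-in class and the `'AzureDataLakeGen2'` type string below are illustrative assumptions, not the SDK's actual definitions; only the `datastore_type != 'AzureBlob'` check comes from the traceback:

```python
class Datastore:
    """Minimal stand-in for azureml.core.Datastore (illustration only)."""
    def __init__(self, name, datastore_type):
        self.name = name
        self.datastore_type = datastore_type


def validate_schedule_datastore(datastore):
    # Mirrors the check in Schedule.create shown in the traceback:
    # v1 change-based schedules only accept Azure Blob datastores.
    if datastore is not None:
        if datastore.datastore_type != 'AzureBlob':
            raise ValueError('Datastore must be of type AzureBlobDatastore')
        return datastore.name


# A Blob-backed datastore passes the check; any other type (e.g. ADLS) raises.
blob_store = Datastore("my_blob_store", "AzureBlob")
adls_store = Datastore("adls_segrel11", "AzureDataLakeGen2")  # assumed type name
```

So the error is raised by design before the schedule is provisioned; switching the schedule's `datastore` to one registered against AzureBlobStorage avoids it.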