Closed farbodtaymouri closed 12 months ago
It's definitely possible to assign compute for different components in the pipeline. You can assign compute to a component in the corresponding `jobs` section of the pipeline YAML. Here is an example: https://github.com/Azure/azureml-examples/blob/main/cli/jobs/pipelines-with-components/image_classification_with_densenet/pipeline.yml In this example, the train job will use gpu-cluster, while the other jobs in the pipeline will use cpu-cluster, as defined in `default_compute` under `settings`.
Note that `default_compute` is the default compute for the whole pipeline: if a step does not set a compute, the default compute is used, but if a step does set one, the step-level compute takes precedence.
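To illustrate, here is a minimal sketch of a pipeline YAML using that pattern. The cluster names (`cpu-cluster`, `gpu-cluster`) and component file paths (`prep_data.yml`, `train.yml`) are placeholders and assumed, not taken from the linked example:

```yaml
$schema: https://azuremlschemas.azureedge.net/latest/pipelineJob.schema.json
type: pipeline
settings:
  default_compute: azureml:cpu-cluster   # used by any step that does not set its own compute
jobs:
  prep_data:
    type: command
    component: ./prep_data.yml
    # no compute set here, so the pipeline's default_compute (cpu-cluster) applies
  train:
    type: command
    component: ./train.yml
    compute: azureml:gpu-cluster         # step-level compute overrides default_compute
```

Submitting this with `az ml job create -f pipeline.yml` should run `prep_data` on the CPU cluster and `train` on the GPU cluster.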
@farbodtaymouri
Thanks for your feedback! We will investigate and update as appropriate.
Hi @xiaoharper Thank you for your response.
@farbodtaymouri We are going to close this thread, if there are any further questions regarding the documentation, please tag me in your reply and we will be happy to continue the conversation.
I would like to ask about assigning different components of a pipeline to separate clusters in Azure Machine Learning. Specifically, I have a two-component pipeline consisting of a data preparation script, prep_data.py, and a training script, train.py. My intention is to run prep_data.py on a less expensive cluster and train.py on a GPU cluster for the more intensive computation.
However, I couldn't determine from the documentation whether it's possible to designate these cluster assignments in the YAML files for each component. Could you clarify this for me?
Cheers,
Document Details
⚠ Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.