ansible / awx

AWX provides a web-based user interface, REST API, and task engine built on top of Ansible. It is one of the upstream projects for Red Hat Ansible Automation Platform.

[AWX 19.3.0] Playbook2 can not find the fetched file by Playbook1 #11262

Open zosmf-Robyn opened 2 years ago

zosmf-Robyn commented 2 years ago

I want to have a workflow job template that links 2 job templates. Each job template launches a playbook: playbook1 fetches a file from a remote host and saves it to a local path, such as "/tmp/remote_file.txt"; playbook2 reads the fetched file and does something else.
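For context, the two playbooks as described would look roughly like this; the host group, the remote source path, and what playbook2 actually does with the file are placeholders for illustration:

```yaml
# playbook1.yml -- copy a file from the managed host to the local side
- hosts: remote_host_group
  tasks:
    - name: Fetch the remote file to a local path
      ansible.builtin.fetch:
        src: /etc/remote_file.txt
        dest: /tmp/remote_file.txt
        flat: true

# playbook2.yml -- read the previously fetched file and process it
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Read the fetched file
      ansible.builtin.debug:
        msg: "{{ lookup('ansible.builtin.file', '/tmp/remote_file.txt') }}"
```

With this flow, playbook2 depends on playbook1's local filesystem still being visible when it runs, which is the assumption that breaks below.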

I previously installed AWX v14 on Docker and deployed such a workflow job template, which always worked well. The fetched file was saved on the awx_task container.

Now I have installed AWX v19 on OCP via awx-operator and deployed the same workflow job template. Since AWX v19 runs each job in a separate container, playbook1 succeeds (the fetched file is saved to "/tmp" in its container), but playbook2 fails with the error: "FileNotFoundError: [Errno 2] No such file or directory: '/tmp/remote_file.txt'"

Using "set_stats" to pass result between playbooks is a workaround, but it is not applicable if the data to be passed to playbook2 is huge. For example, playbook2 is required to process thousands of files which is fetched by playbook1. Is it possible to have a shared volume for all containers of each AWX template job, so that these containers can access and share some specific file path?

chrismeyersfsu commented 2 years ago

Hey @zosmf-Robyn, this exact flow isn't going to work anymore. Those 2 jobs run in two different containers that get cleaned up after each run. You have options, though: you can use some sort of shared persistent storage that you map into the container at runtime, or you can push the file to some 3rd-party site (e.g. Artifactory or even a GitHub gist).
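One way to map shared persistent storage into the job pods at runtime is a container group with a customized pod specification; a rough sketch, where the claim name, mount path, and image are placeholders and a ReadWriteMany-capable storage class is assumed:

```yaml
apiVersion: v1
kind: Pod
metadata:
  namespace: awx
spec:
  containers:
    - name: worker
      image: quay.io/ansible/awx-ee:latest
      args:
        - ansible-runner
        - worker
        - --private-data-dir=/runner
      volumeMounts:
        - name: shared
          mountPath: /shared
  volumes:
    - name: shared
      persistentVolumeClaim:
        claimName: shared-projects
```

Every job template that runs in that container group would then see the same /shared path.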

zosmf-Robyn commented 2 years ago

@chrismeyersfsu Thanks for the reply! Could you please explain more about how to set up/use shared persistent storage and map it into the container? I'm new to awx-operator installation and not sure if I have the right setup; below is my YAML file:

apiVersion: awx.ansible.com/v1beta1
kind: AWX
metadata:
  name: awx-instance1
spec:
  service_type: nodeport
  ingress_type: Route
  postgres_storage_class: nfs-client
  extra_volumes: |
    - name: shared-volume
      persistentVolumeClaim:
        claimName: awx-instance1-projects-claim
  init_container_extra_volume_mounts: |
    - name: shared-volume
      mountPath: /shared
  init_container_extra_commands: |
    chmod 775 /shared
    chgrp 1000 /shared
  ee_extra_volume_mounts: |
    - name: shared-volume
      mountPath: /shared
  task_extra_volume_mounts: |
    - name: shared-volume
      mountPath: /shared
  web_extra_volume_mounts: |
    - name: shared-volume
      mountPath: /shared
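
If the extra volume above does end up mounted at /shared inside the pods that run jobs, a quick way to check the setup from a job template is something like this (purely illustrative):

```yaml
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Check that the shared volume is visible in the job pod
      ansible.builtin.stat:
        path: /shared
      register: shared_dir

    - name: Report whether /shared exists
      ansible.builtin.debug:
        msg: "/shared present: {{ shared_dir.stat.exists }}"
```
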
chrismeyersfsu commented 2 years ago

@zosmf-Robyn try asking on the mailing list. They have more experience than I do in actually running Tower, especially in Kubernetes. :)

acelinkio commented 2 years ago

If the data being passed between plays is large, consider leveraging a data store. Mounting files really is not a scalable solution.

damluji commented 2 years ago

Also interested in how this is solved, especially if the volumes are in different availability zones than the pods.

chrismeyersfsu commented 2 years ago

Personally, I would push any artifacts from one playbook up to S3, an FTP server, a gist, or some cloud data store API. Then I would use set_stats to pass a URL or identifier to the next job template in the workflow.
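A rough sketch of that pattern using the amazon.aws.s3_object module; the bucket name, object key, and the AWS credentials/collection being available in the execution environment are all assumptions:

```yaml
# playbook1: upload the fetched file and pass its identifier forward
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Upload the fetched file to S3
      amazon.aws.s3_object:
        bucket: my-awx-artifacts
        object: remote_file.txt
        src: /tmp/remote_file.txt
        mode: put

    - name: Pass the object key to the next job template in the workflow
      ansible.builtin.set_stats:
        data:
          artifact_key: remote_file.txt

# playbook2: download the file using the key received as an extra variable
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Download the artifact from S3
      amazon.aws.s3_object:
        bucket: my-awx-artifacts
        object: "{{ artifact_key }}"
        dest: /tmp/remote_file.txt
        mode: get
```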

doyoungim999 commented 11 months ago

Anybody find a solution?