dbt-labs / dbt-spark

dbt-spark contains all of the code enabling dbt to work with Apache Spark and Databricks
https://getdbt.com
Apache License 2.0

Initial Conversion to GitHub Actions #719

Closed emmyoop closed 1 month ago

emmyoop commented 1 year ago

High Level Task

Following the pattern of the other adapters, create an integration.yml that contains the logic to replace the current CircleCI pattern.

Acceptance Criteria

  1. integration.yml exists on main in dbt-spark.
  2. It follows the patterns of the integration.yml files in other adapter repositories.
  3. It contains at least the stub of the functionality from CircleCI.
  4. Replaces fishtownanalytics/test-container:latest with setup steps within the workflow (see the sketch after this list).
  5. It is not expected to be fully functional. Commenting out anything broken is fine. The workflow must exist on main to be able to fully test it. This will also allow the team to spread testing of each connection among its members.
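As a rough sketch of criterion 4 (assumptions marked in comments), the toolchain could be installed directly on a hosted runner instead of inside fishtownanalytics/test-container:latest. The action versions, Python version, dependency file, and tox environment name below are illustrative guesses, not taken from this issue or the final workflow:

```yaml
# integration.yml (partial sketch) -- runs tests on a hosted runner instead of
# the fishtownanalytics/test-container:latest Docker image.
name: Adapter Integration Tests

on:
  workflow_dispatch:  # manual trigger only, while the workflow is still a stub

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the repository
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.8"  # assumed version

      - name: Install test dependencies
        run: |
          python -m pip install --upgrade pip
          python -m pip install tox
          python -m pip install -r dev-requirements.txt  # assumed requirements file

      - name: Run integration tests
        run: tox -e integration-spark-session  # tox environment name is illustrative
```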

Details

Leverage the GitHub Actions Importer migration tool to get the initial migration done. Expect this tool only to get you about 80% of the way there.

You can do a dry run with the following command to keep it from automatically opening a PR that tells you to set up secrets:

gh actions-importer dry-run circle-ci --output-dir actions-importer/circle-ci-dry-run --circle-ci-project dbt-spark-release-test

The integration.yml in dbt-snowflake is a good reference point for what will need to be set up (a rough sketch follows the list below):

  1. triggers
  2. permissions
  3. concurrency
  4. metadata generation
  5. more?
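The top of the workflow could look something like the following. The trigger list, permission level, concurrency group, and the stub metadata job are assumptions loosely modeled on the dbt-snowflake integration.yml pattern rather than copied from it, and should be checked against that file:

```yaml
# Top-of-file blocks an integration.yml typically needs (sketch, to be
# verified against the dbt-snowflake workflow).
name: Adapter Integration Tests

on:
  push:
    branches: ["main", "*.latest", "releases/*"]  # assumed branch list
  pull_request_target:
  workflow_dispatch:

permissions: read-all

concurrency:
  group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  test-metadata:
    # placeholder for the metadata / test-matrix generation logic
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.generate-matrix.outputs.result }}
    steps:
      - name: Check out the repository
        uses: actions/checkout@v3

      - name: Generate a test matrix (stub)
        id: generate-matrix
        run: echo 'result={"include":[]}' >> "$GITHUB_OUTPUT"
```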

Unknowns

Additional Notes

The repo settings should be configured so that CI does not run on fork PRs without explicit approval. require-label-comment should no longer be necessary; confirm this is true for dbt-spark with someone who has admin access.

main.yml already runs unit tests, so they can be removed from this workflow.

dbeatty10 commented 1 month ago

Resolved by https://github.com/dbt-labs/dbt-spark/pull/923