nightpool opened this issue 9 months ago
You should be able to set a relative working directory using the environment:
ENV: TF_DIRECTORY: "./yourdirectory"
@srlynch1 won't that only upload the ./yourdirectory folder then? So the ./modules folder will be excluded. That's what I'm already doing, and it has the exact same problem I'm trying to solve.
Make your modules a subfolder if they are local modules; subfolders are automatically uploaded with the configuration version.
Think of a configuration version as a zipped package. If you want to work around this and your modules live in a parent directory, you can check out the repo folder that contains the modules into a subfolder of the current directory.
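As a minimal sketch of that workaround (all paths here are hypothetical), the shared modules in the parent directory get vendored into a subfolder of the working directory before the configuration version is packaged:

```shell
#!/usr/bin/env sh
# Hypothetical layout: repo/modules holds the shared modules,
# repo/prod is the Terraform working directory that gets uploaded.
set -eu

mkdir -p repo/modules/base repo/prod
echo 'variable "zone" {}' > repo/modules/base/variables.tf
echo 'module "base" { source = "./modules/base" }' > repo/prod/main.tf

# The workaround: copy the parent-level modules into a subfolder
# of the working directory so they land inside the zipped
# configuration version.
cp -R repo/modules repo/prod/modules

ls repo/prod/modules/base
```

The module source in the working directory then points at the vendored copy rather than climbing out with `../`.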
@srlynch1 it's impossible to make your modules a subfolder if they're shared between two environments.
I don't quite understand the linkage to environments, and I don't have visibility into what constitutes an environment in your case. Given you're using TFE/TFC, you could also use the private module registry. Another option is to use local modules via the subdirectory but keep separate var files for each environment; in this scenario your pipeline would select the relevant var files to upload.
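A rough sketch of that "one config tree, one var file per environment" option (the `ENVIRONMENT` variable and file names are assumptions, not from the original comment):

```shell
#!/usr/bin/env sh
# Sketch: the pipeline picks the tfvars file for the target
# environment instead of duplicating the config per environment.
set -eu

mkdir -p vars
echo 'env = "dev"'  > vars/dev.tfvars
echo 'env = "prod"' > vars/prod.tfvars

ENVIRONMENT="${ENVIRONMENT:-dev}"   # set by the pipeline
VAR_FILE="vars/${ENVIRONMENT}.tfvars"

# The pipeline would then run something like:
#   terraform plan -var-file="$VAR_FILE"
echo "selected var file: $VAR_FILE"
```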
@srlynch1 I think the use case is with a structure like the following:
❯ tree
.
├── dev
│   ├── certificates.tf
│   ├── cloudflare.tf
│   ├── dns.tf
│   ├── ruleset.tf
│   └── zone.tf
├── modules
│   └── base
│       ├── main.tf
│       ├── variables.tf
│       ├── worker-routes.tf
│       └── zone.tf
└── prod
    ├── cloudflare.tf
    ├── dns.tf
    ├── ruleset.tf
    ├── worker-route.tf
    └── zone.tf
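With that layout, both environments reference the shared module by a relative path that climbs out of the environment folder, which is exactly what an environment-only upload breaks. A minimal sketch (file contents are hypothetical):

```shell
#!/usr/bin/env sh
# Both dev and prod point at the same module via "../modules/base",
# so the parent directory must be part of what gets uploaded.
set -eu

mkdir -p dev prod modules/base
echo 'variable "zone" {}' > modules/base/variables.tf

for env in dev prod; do
  cat > "$env/main.tf" <<'EOF'
module "base" {
  # Relative source climbs out of the environment folder; it only
  # resolves if the upload root is the repo root, not dev/ or prod/.
  source = "../modules/base"
}
EOF
done

grep 'source' dev/main.tf
```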
From what I've seen modules are often used to have consistent config applied to multiple envs, so as @nightpool points out in this case the module cannot live in a subfolder.
Edit: never mind; in my case the solution was to simply not specify directory, and everything "just" works. Figured I'd leave my question below in case it's useful to others.
I have a similar but different issue: my Terraform configuration files are in a subdirectory infra of my project. However, I want to upload the entire project directory, since it contains artifacts that Terraform will upload to S3. Something like:
.
|-- infra
| |-- main.tf
|-- dist
| |-- server-binary
where main.tf includes a rule that uploads ../dist/server-binary.
In case it's relevant, I also have infra set as the Terraform Working Directory in Terraform Cloud.
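In that setup, the archive for the configuration version has to be built from the project root so that both infra/ and dist/ are included, while the Working Directory setting tells Terraform Cloud to run inside infra. A rough sketch, with the upload URL (normally obtained from Terraform Cloud's configuration-versions API) left as a placeholder:

```shell
#!/usr/bin/env sh
# Hypothetical project layout: infra/ holds the config,
# dist/ holds an artifact referenced as ../dist/server-binary.
set -eu

mkdir -p project/infra project/dist
echo 'resource' > project/infra/main.tf
echo 'binary'   > project/dist/server-binary

# Archive the whole project root, not just infra/.
tar -czf config.tar.gz -C project .
tar -tzf config.tar.gz | grep server-binary

# Upload only when an upload URL is provided (placeholder; in a real
# pipeline it comes from the TFC configuration-versions API response).
if [ -n "${UPLOAD_URL:-}" ]; then
  curl --header "Content-Type: application/octet-stream" \
       --request PUT --data-binary @config.tar.gz "$UPLOAD_URL"
fi
```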
@srlynch1 I've arrived here looking for a solution to the same structural design @gerbyzation mentioned.
Mine looks like this...
.github
├── workflows
│   ├── compute-proxy-apply.yml
│   └── compute-proxy-plan.yml

./infrastructure/fastly/compute/api
├── env
│   ├── dev
│   │   ├── inputs.tfvars
│   │   ├── main.tf
│   │   ├── provider.tf
│   │   └── variables.tf
│   ├── prd
│   │   ├── inputs.tfvars
│   │   ├── main.tf
│   │   ├── provider.tf
│   │   └── variables.tf
│   └── stg
│       ├── inputs.tfvars
│       ├── main.tf
│       ├── provider.tf
│       └── variables.tf
├── fastly.toml
├── main.go
├── modules
│   └── service-compute
│       ├── main.tf
│       ├── provider.tf
│       └── variables.tf
└── pkg
    └── api.tar.gz
To run the Terraform manually I would have to (using 'dev' as an example):
cd ./infrastructure/fastly/compute/api/env/dev/
terraform init && terraform plan -var-file="inputs.tfvars"
So it's that flow I'm trying to understand how to mimic using GitHub Actions.
I had been following along with https://developer.hashicorp.com/terraform/tutorials/automation/github-actions but it only provides a basic structural example.
Has anyone (@gerbyzation maybe?) figured out a solution to this yet?
Thanks!
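Not an authoritative answer, but one way to mimic that manual flow in a CI step is to compute the environment directory and use Terraform's -chdir flag instead of cd. A sketch, where the ENVIRONMENT variable is an assumption (e.g. set per workflow file):

```shell
#!/usr/bin/env sh
# Sketch: derive the working directory from the target environment
# and run the same init/plan flow without changing directories.
set -eu

ENVIRONMENT="${ENVIRONMENT:-dev}"   # dev / stg / prd, set by the workflow
TF_DIR="./infrastructure/fastly/compute/api/env/${ENVIRONMENT}"

# Equivalent of: cd "$TF_DIR" && terraform init && terraform plan ...
# (echoed here rather than executed, as a sketch)
echo "terraform -chdir=${TF_DIR} init"
echo "terraform -chdir=${TF_DIR} plan -var-file=inputs.tfvars"
```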
I'm trying to mimic the following behavior in GitHub Actions to be able to use our local modules:
I do not see any way to accomplish this with the GitHub Actions workflow. Am I missing something? How would I upload a configuration version with a working directory of "staging" and a current directory of the repo root to TFC using this GitHub Action? Should I just set directory: "."?