efiShtain opened 1 month ago
Hi @efiShtain, this is by far the most requested feature in the community :) We are currently figuring out how to solve this without losing the capability of staying in a native TF/OpenTofu environment. Stay tuned, we have this as a top priority on our roadmap!
I'd love to have something like an import statement for a stack. The import would expose everything the stack has, like id, name, description, and tags, plus an outputs list.

The outputs list should be declarative, with a way to define each output's value as either static or dynamic: a static value can be any number, string, or whatever, while a dynamic value points at a Terraform output, so resolving it runs `tofu output <name>`.

An actual example:
```hcl
// Imported stack definition
stack {
  name        = "staging ecs cluster"
  description = "staging ecs cluster"
  tags        = ["staging", "cluster"]
  id          = "imported_stack_id"
  outputs = [
    "static: this is a string",
    "dynamic: arn" // this will run `tofu output arn`
  ]
}

// Importing stack definition
stack {
  name = "ecs service"
  tags = ["staging", "service"]
  id   = "..."
}

// I call this a named import
import "ecs_cluster" {
  source = "imported_stack_id"
}

// More concise with the current import syntax; not sure I like it
import {
  source = "imported_stack_id"
  as     = "ecs_cluster"
}

generate_hcl "_service.tf" {
  content {
    resource "aws_ecs_service" "service" {
      // .... lots of stuff
      cluster_id = import.ecs_cluster.outputs.id
    }
  }
}
```
**Is your feature request related to a problem? Please describe.**
I would like to use outputs between different states to manage my infrastructure.
**Describe the solution you'd like**
For example, assume I have two stacks: google storage and google function. The google storage stack creates a bucket with a random name and sets it as a Terraform output. The google function stack requires the bucket name, so I would like to use the google storage stack's output as an input for the function stack.
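For reference, this is roughly how that wiring looks today with plain Terraform/OpenTofu's `terraform_remote_state` data source; the backend details (state bucket and prefix) are made-up placeholders:

```hcl
# google storage stack: expose the randomly named bucket
output "bucket_name" {
  value = google_storage_bucket.main.name
}

# google function stack: read the storage stack's state directly
data "terraform_remote_state" "storage" {
  backend = "gcs"
  config = {
    bucket = "my-tf-state-bucket" # hypothetical state bucket
    prefix = "stacks/google-storage"
  }
}

resource "google_cloudfunctions_function" "fn" {
  # ...
  source_archive_bucket = data.terraform_remote_state.storage.outputs.bucket_name
}
```

This works, but it hard-codes backend details into every consumer, which is exactly the kind of coupling a stack-level import could hide.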
**Describe alternatives you've considered**
Sometimes it is possible to set a global at a higher hierarchy level to share the data, but it doesn't always work, for example when the data is generated, like a random value or an attribute the cloud resource itself produces on creation (only known after creation).
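A sketch of that globals workaround in Terramate syntax (names are illustrative), which only helps when the value is known ahead of time:

```hcl
# config at a higher hierarchy level, shared by both stacks
globals {
  bucket_name = "preallocated-bucket-name" # must be static, known before apply
}

# inside the consuming stack
generate_hcl "_inputs.tf" {
  content {
    locals {
      bucket_name = global.bucket_name
    }
  }
}
```

As soon as the bucket name is random or assigned by the cloud on creation, there is no static value to put in the global, which is where this approach breaks down.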
I thought about running a script before/after the stack runs to get the outputs using native Terraform. I haven't tried it yet; it seems fragile.