hashicorp / hcl2

Former temporary home for experimental new version of HCL
https://github.com/hashicorp/hcl
Mozilla Public License 2.0

Is dynamic walk of a file.Body possible without schema? #7

Closed: fatih closed this issue 6 years ago

fatih commented 6 years ago

Hi,

I have an HCL file with custom blocks and user functions. I would like to walk the file to collect all variable references in the form of var.foo. These references can start with any kind of prefix, such as foo.bar, resource.bar, server.ip, etc. This is needed so I can build a DAG, compute the references first, and later pass them via an EvalContext to each block (so decoding works and the expressions are evaluated). It seems that without knowing the structure of the file this is not possible? Here is an example of what I'm trying to do (diags errors are neglected for the sake of simplicity):

var src = `
variable "url" {
 default = "https://httpbin.org/ip"
}

foo "bar" {
  method = "get"
  url    = var.url
}

bar "example" {
  result = "${foo.result}"
}
`

parser := hclparse.NewParser()
file, _ := parser.ParseHCL([]byte(src), "demo.hcl")

var raw map[string]interface{}
_ = gohcl.DecodeBody(file.Body, nil, &raw)

// doesn't work unfortunately
fmt.Printf("raw = %+v\n", raw)

Is there a way to walk the tree without knowing the structure or using Go structs to define it? Or maybe a helper function that would allow me to collect all references?

Thanks

fatih commented 6 years ago

It seems this is possible by type-asserting the body of the parsed file to *hclsyntax.Body. The docs point to something else, so I didn't know that. A basic example:

parser := hclparse.NewParser()
file, diags := parser.ParseHCLFile("demo.hcl")
if diags.HasErrors() {
    return diags
}

body, ok := file.Body.(*hclsyntax.Body)
if !ok {
    return errors.New("could not retrieve *hclsyntax.Body")
}

// iterate over all top-level blocks
for _, block := range body.Blocks { }

// iterate over all top-level attributes
for _, attr := range body.Attributes { }

Opened a PR that updates the docs: https://github.com/hashicorp/hcl2/pull/8
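
For the original goal of collecting the references themselves, here is one minimal sketch (assuming the hcl and hclsyntax packages from this repo, and continuing from the body variable above; collectReferences is just an illustrative helper, not part of the API) that recurses through the blocks and asks each attribute expression for its variables:

// collectReferences recursively walks a *hclsyntax.Body and gathers the
// variable traversals (e.g. var.url, foo.result) used by every attribute
// expression. Expression.Variables is part of the hcl.Expression interface.
func collectReferences(body *hclsyntax.Body) []hcl.Traversal {
    var refs []hcl.Traversal
    for _, attr := range body.Attributes {
        refs = append(refs, attr.Expr.Variables()...)
    }
    for _, block := range body.Blocks {
        refs = append(refs, collectReferences(block.Body)...)
    }
    return refs
}

// Example use: print the root name of each reference (e.g. "var", "foo"),
// which is enough to start building a dependency graph.
for _, ref := range collectReferences(body) {
    fmt.Println(ref.RootName())
}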

apparentlymart commented 6 years ago

Hi @fatih! Nice to see you over here. 😀

Sorry things are still rather ill-documented in here. It's all still evolving and so I've not yet managed to write down a good overview of how it all fits together. The design philosophy here is quite a bit different than the prior HCL, and specifically more tailored to configuration with a well-defined structure, rather than arbitrary data structures.

As I think you've concluded, the main API in the hcl package requires schema-driven decoding. It's built this way so that we can robustly handle ambiguous syntaxes like JSON, where the schema allows us to distinguish between maps, lists of maps, and nested blocks. When parsing the "native syntax" (as implemented in hclsyntax), this ambiguity doesn't exist, but we still use the schema to automatically report an error to the user if they use a block where an attribute is expected, or vice-versa.
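
For reference, a minimal sketch of that schema-driven path against the example file from the first comment (the block types variable, foo and bar are simply taken from that example):

schema := &hcl.BodySchema{
    Blocks: []hcl.BlockHeaderSchema{
        {Type: "variable", LabelNames: []string{"name"}},
        {Type: "foo", LabelNames: []string{"name"}},
        {Type: "bar", LabelNames: []string{"name"}},
    },
}

// Content applies the schema to the whole body; for the JSON syntax this is
// what distinguishes nested blocks from attribute values, and for the native
// syntax it reports anything the schema doesn't allow.
content, diags := file.Body.Content(schema)
if diags.HasErrors() {
    return diags
}

for _, block := range content.Blocks {
    fmt.Println(block.Type, block.Labels)
}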

Type-asserting to the hclsyntax concrete types is a reasonable thing to do if you need to do things that the hcl package abstractions won't allow, but note that in this case you're coupling yourself to the native syntax and what you did here won't work for the JSON syntax, and likely not for any other syntaxes that might be defined in future (depending on how they are implemented).
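
To stay within those abstractions, the same reference-collecting can be done against the schema-driven content, since Variables is part of the abstract hcl.Expression interface; a sketch for the foo block from the example (the attribute names are illustrative):

fooSchema := &hcl.BodySchema{
    Attributes: []hcl.AttributeSchema{
        {Name: "method", Required: true},
        {Name: "url", Required: true},
    },
}

for _, block := range content.Blocks {
    if block.Type != "foo" {
        continue
    }
    fooContent, diags := block.Body.Content(fooSchema)
    if diags.HasErrors() {
        return diags
    }
    // Expression.Variables works on any hcl.Expression, so this path also
    // handles bodies parsed from JSON.
    for _, attr := range fooContent.Attributes {
        for _, traversal := range attr.Expr.Variables() {
            fmt.Println(traversal.RootName()) // e.g. "var"
        }
    }
}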

Totally dynamic parsing was not a goal I had in mind when defining the primary API here. The API design assumes applications will behave more like Terraform, having a generally-rigid structure but allowing arbitrary attributes in some specific blocks. The schema-driven parsing is key to the better-defined handling of JSON in HCL2, but indeed it does make some use-cases that were possible with HCL harder to achieve.
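
For that last case, where one specific block is allowed to carry arbitrary attributes, hcl.Body also offers JustAttributes; a small sketch, assuming block is one of the hcl.Block values obtained from Content above:

// JustAttributes reads every attribute of a body without a schema; it fails
// if the body contains nested blocks, so it only suits blocks that are meant
// to be open-ended sets of name = expression pairs.
attrs, diags := block.Body.JustAttributes()
if diags.HasErrors() {
    return diags
}
for name, attr := range attrs {
    // With a nil EvalContext only constant expressions evaluate; pass a
    // populated *hcl.EvalContext to resolve references like var.url.
    val, _ := attr.Expr.Value(nil)
    fmt.Printf("%s = %#v\n", name, val)
}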

Sorry for the remaining references to "zcl" left in some spots. That's the name of my side-project that was forked and renamed to HCL2 to give a sense of continuity from HCL/HIL, and although all of the code is updated (I think!) there are still numerous leftover references in comments and docs which I'm cleaning up as I find them. Thanks for pointing out and fixing several of them!