hashicorp / packer

Packer is a tool for creating identical machine images for multiple platforms from a single source configuration.
http://www.packer.io

Pulling plugins from domains not github.com #11164

Open johnypony3 opened 3 years ago

johnypony3 commented 3 years ago

Community Note

Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request. Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request. If you are interested in working on this issue or have submitted a pull request, please leave a comment.

Description

Allow pulling plugins from other vcs sites, in my case gitlab.

Use Case(s)

GitHub is external to my build server and is therefore against company standards to use as an external dependency location.

Potential configuration

```hcl
packer {
  required_plugins {
    windows-update = {
      version = "0.12.0"
      source  = "gitlab.company.com/mercedes/windows-update"
    }
  }
}
```

Potential References

This is documented as an eventual feature here: plugins source-addresses

SwampDragons commented 3 years ago

Hi, thanks for reaching out. This is definitely on our roadmap for something we want to do in the future, hence the "eventual" comment.

As a workaround, you can manually download plugins and place them into the plugins directory on your build server in order to make them discoverable by Packer without being downloaded. Full details can be found here: https://www.packer.io/docs/plugins (click the "manual (multi-component plugin)" tab)
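As a sketch of what that manual layout looks like, the snippet below builds the install path Packer's discovery expects for a multi-component plugin. The `x5.0` protocol-version suffix, the `_SHA256SUM` companion file, and the exact directory layout are assumptions taken from the linked docs and may differ between Packer versions:

```go
package main

import (
	"fmt"
	"path/filepath"
	"runtime"
)

// pluginInstallPath builds the file name Packer's discovery is documented
// to expect for a manually installed multi-component plugin, e.g.
//   <pluginDir>/github.com/hashicorp/happycloud/packer-plugin-happycloud_v1.0.0_x5.0_linux_amd64
// The "x5.0" API-version suffix and the layout are assumptions based on
// the docs linked above; verify against your Packer release.
func pluginInstallPath(pluginDir, host, namespace, name, version, apiVersion string) string {
	bin := fmt.Sprintf("packer-plugin-%s_v%s_%s_%s_%s",
		name, version, apiVersion, runtime.GOOS, runtime.GOARCH)
	if runtime.GOOS == "windows" {
		bin += ".exe"
	}
	return filepath.Join(pluginDir, host, namespace, name, bin)
}

func main() {
	p := pluginInstallPath("/root/.config/packer/plugins",
		"github.com", "hashicorp", "happycloud", "1.0.0", "x5.0")
	fmt.Println(p)
	// Packer also expects a checksum file next to the binary:
	fmt.Println(p + "_SHA256SUM")
}
```

Dropping the pre-built binary and its checksum file at that path on the build server lets `packer init` discover the plugin without reaching out to github.com.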

tomsiewert commented 3 years ago

For us, we sometimes hit the GitHub rate limit on our CI runners, which forced us to create mirrors of the plugins we use. Those mirrors live in sub-groups of our internal GitLab, but Packer validates the source against a strict hostname/namespace/name format:

```
Error: Invalid plugin source string

  on debian/config.pkr.hcl line 7, in packer:
   7:     ansible = {
   8:       version = ">= 1.0.0"
   9:       source  = "git.company.com/mirror/github.com/hashicorp/ansible"
  10:     }

The "source" attribute must be in the format "hostname/namespace/name"
```

azr commented 3 years ago

Hello there, thanks for opening this. Packer is meant to support plugin getters other than GitHub! We just have not had the time to do it due to other priorities. If you would like to work on this, I can give some code pointers; we would be more than happy to review PRs on it!

First, you will have to implement a new plugin getter; the GitHub one is used here: https://github.com/hashicorp/packer/blob/0e3fcb589b98776a011fcfec9719fb66e9aba763/command/init.go#L85-L86 and implemented here:

https://github.com/hashicorp/packer/blob/f48583c57edf673ad1f141caaf1c5375e022da99/packer/plugin-getter/github/getter.go#L29-L32

The getter interface is defined here, I'd recommend reading it first :) :

https://github.com/hashicorp/packer/blob/54a4f59fc7e1ed480c54e9e1c70e33980e5986c5/packer/plugin-getter/plugins.go#L274-L323
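To make those pointers concrete, here is a minimal sketch of what a GitLab-backed getter could look like. The `Getter` and `GetOptions` types below are simplified stand-ins defined locally (the real interface in `plugin-getter/plugins.go` returns an `io.ReadCloser` and carries more options), and the routes shown are GitLab's v4 REST API used illustratively, not a tested Packer integration:

```go
package main

import (
	"fmt"
	"net/url"
)

// GetOptions is a simplified stand-in for Packer's plugin-getter options
// (the real struct carries the plugin requirement, checksums, and
// binary-installation details).
type GetOptions struct {
	Hostname  string // e.g. "gitlab.company.com"
	Namespace string // e.g. "mercedes"
	Name      string // e.g. "windows-update"
	Version   string // e.g. "0.12.0"; empty when listing releases
}

// Getter mimics the interface linked above; "what" is the kind of
// artifact Packer asks for ("releases", "zip", ...). We return a URL
// string here instead of an io.ReadCloser to keep the sketch offline.
type Getter interface {
	Get(what string, opts GetOptions) (string, error)
}

// GitLabGetter maps requests onto GitLab's v4 REST API. A real getter
// would fetch these URLs and translate the responses into the release
// metadata Packer expects, as the GitHub getter does.
type GitLabGetter struct{}

func (g GitLabGetter) Get(what string, opts GetOptions) (string, error) {
	// GitLab addresses projects by URL-encoded "namespace/project" path.
	project := url.PathEscape(opts.Namespace + "/packer-plugin-" + opts.Name)
	base := fmt.Sprintf("https://%s/api/v4/projects/%s", opts.Hostname, project)
	switch what {
	case "releases":
		return base + "/releases", nil
	case "zip":
		// Permanent link to a release asset uploaded under the tag.
		return fmt.Sprintf("%s/releases/v%s/downloads/packer-plugin-%s_v%s.zip",
			base, opts.Version, opts.Name, opts.Version), nil
	}
	return "", fmt.Errorf("unknown artifact kind %q", what)
}

func main() {
	var g Getter = GitLabGetter{}
	u, _ := g.Get("releases", GetOptions{
		Hostname: "gitlab.company.com", Namespace: "mercedes", Name: "windows-update",
	})
	fmt.Println(u)
}
```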

ianblackshere commented 1 year ago

Hello, I am interested in working this issue. My team has a use-case for this where we would like to build on our private networks without having to copy the plugins to each host as we are spread across multiple sites.

johnypony3 commented 1 year ago

@ianblackshere we got around this by pulling the packages in ahead of time and installing them from a local path.

cliffchapmanrbx commented 2 days ago

👋🏻 Checking in mid-2024: it sounds like the current workaround is still to use the plugins install command. Our CI systems are network-isolated from the internet and can't download things from github.com, so we mirror through either our internal GitHub Enterprise Server or through our Artifactory server. Being able to put artifactory.example.com/hashi-tools/hashicorp/happycloud in our packer templates would be nice!

To support the command packer plugins install --path ./packer-plugin-happycloud "artifactory.example.com/hashi-tools/hashicorp/happycloud" we must:

  1. Parse the packer template to extract the list of required plugins.
  2. Query our server for the available versions that we have cached.
  3. Re-implement the version constraint solver to determine appropriate versions.
  4. Download each version and extract it locally.
  5. Run the packer plugins install command for each one of those.
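Steps 2 and 3 above amount to re-implementing version selection outside Packer. A self-contained illustration of what that looks like, using plain semantic-version comparison rather than the constraint solver of a library like hashicorp/go-version (which Packer relies on internally):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parse turns "1.2.3" (or "v1.2.3") into comparable integer fields.
func parse(v string) ([3]int, bool) {
	var out [3]int
	parts := strings.SplitN(strings.TrimPrefix(v, "v"), ".", 3)
	if len(parts) != 3 {
		return out, false
	}
	for i, p := range parts {
		n, err := strconv.Atoi(p)
		if err != nil {
			return out, false
		}
		out[i] = n
	}
	return out, true
}

// less compares versions field by field (major, minor, patch).
func less(a, b [3]int) bool {
	for i := 0; i < 3; i++ {
		if a[i] != b[i] {
			return a[i] < b[i]
		}
	}
	return false
}

// newestSatisfying picks the highest mirrored version that is >= min,
// mimicking a ">= x.y.z" required_plugins constraint.
func newestSatisfying(available []string, min string) (string, bool) {
	lo, ok := parse(min)
	if !ok {
		return "", false
	}
	var best string
	var bestV [3]int
	for _, s := range available {
		v, ok := parse(s)
		if !ok || less(v, lo) {
			continue
		}
		if best == "" || less(bestV, v) {
			best, bestV = s, v
		}
	}
	return best, best != ""
}

func main() {
	mirror := []string{"0.9.0", "1.0.0", "1.2.1", "1.10.0"}
	v, _ := newestSatisfying(mirror, "1.0.0")
	fmt.Println(v) // 1.10.0 (numeric, not lexicographic, comparison)
}
```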

This feels like a lot of external work when the packer CLI has most of this context available internally. The updated docs linked in the closed PR mention that custom source addresses work with version pinning; however, the packer plugins install command requires a path to an executable, not a directory containing multiple versions of a plugin from which the version constraint solver could pick an appropriate option.

Our team tries to provide generic solutions for other teams at our company to use, for example a docker container pre-loaded with a packer build environment and the plugins we've mirrored preinstalled. I don't understand what the intended workflow is in this scenario other than to either hard-code multiple plugins install commands on a per-template basis, or to build a pre-init script that reimplements packer's internal logic. This feels like a tall ask when the paid HCP Packer product doesn't even offer an alternative.

Back in 2021, azr commented above with some hints for implementing alternative source providers. Terraform uses go-getter for modules, which is very flexible, and the less flexible filesystem mirror and network mirror protocols for providers, since those involve more metadata. Either path would help bridge this gap.

Can HashiCorp folks provide any updated notes on submitting a PR to alleviate this gap in functionality? Are there plans to implement a pattern similar to Terraform's mirroring, or something else? Should I stop waiting and implement the pre-init system?