woodpecker-ci / woodpecker

Woodpecker is a simple, yet powerful CI/CD engine with great extensibility.
https://woodpecker-ci.org
Apache License 2.0

Conversion extension or pipeline modification "on the fly"? #4341

Open nmapx opened 1 week ago

nmapx commented 1 week ago

Clear and concise description of the problem

I'm currently moving from Drone to Woodpecker, and while discovering the differences I've noticed there is no such thing as a conversion extension here. I guess it was introduced into Drone when Woodpecker already existed, but I'm not sure. Maybe you could clarify, or if it was removed, explain why? In my opinion it's a very useful and specific feature that allows more than just static pipelines. I'm talking here about the "paved roads" / "golden paths" approach, which allows creating complex yet very easy-to-maintain pipelines.

Suggested solution

Well, maybe it doesn't need to be a conversion extension exactly like in Drone, but something that would allow modifying the pipeline syntax on the fly. This way we could easily introduce support for other languages (Starlark & Jsonnet); they would eventually be converted into YAML using this approach.

Alternative

No response

Additional context

No response

zc-devs commented 1 week ago

https://woodpecker-ci.org/docs/administration/advanced/external-configuration-api
https://github.com/woodpecker-ci/woodpecker/pull/1396
https://github.com/woodpecker-ci/woodpecker/discussions/3277
https://github.com/woodpecker-ci/woodpecker/discussions/2254

nmapx commented 1 week ago

@zc-devs Thanks for the links, they clarified a lot! The external configuration API seems to fit the problem, but only partially. If I get it correctly, it does some pre-processing but still sends YAML syntax to the external API. There is no way to initialize the repository with a pipeline in a syntax other than YAML. I get the point that Woodpecker works on YAML files in the end, but for the pre-processing (in order to make sense of it) the source workflow should be in a different language like Jsonnet or Starlark (the second in my case, but it doesn't matter). There is a Jsonnet addon mentioned in one of the issues, but I wouldn't consider it stable: it seems it doesn't work for private repositories (which is usually the case, I think) and its UX is just... "not good enough"?

zc-devs commented 1 week ago

Before the run or restart of any pipeline Woodpecker will make a POST request to an external HTTP API sending the current repository, build information and all current config files retrieved from the repository

still sends YAML syntax to the external API. There is no way to initialize the repository with a pipeline in a syntax other than YAML

I'm not a ~real welder~ user of the external API :) If I remember correctly, WP indeed only looks for yaml and yml config files in a repository. But I don't know whether it reads/parses them or not. You could try saving Jsonnet content in a file with a yaml extension πŸ˜† I hope WP will send it as-is to the external service. There you could process it, convert it into WP YAML syntax and send it back.
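
For what it's worth, a minimal sketch of such an external config service in Go, based on my reading of the docs and the archived example-config-service repo. The JSON field names, the endpoint path and the convert step are assumptions, not verified against the current code:

    package main

    import (
        "encoding/json"
        "log"
        "net/http"
    )

    // file is one config entry as it is (assumed to be) exchanged with Woodpecker.
    type file struct {
        Name string `json:"name"`
        Data string `json:"data"`
    }

    type payload struct {
        Configs []file `json:"configs"` // assumed field name
    }

    // convert is a stub; a real service would run a Jsonnet/Starlark
    // evaluator here and emit Woodpecker YAML.
    func convert(src string) string {
        return src
    }

    func main() {
        // The path is whatever WOODPECKER_CONFIG_SERVICE_ENDPOINT points at.
        http.HandleFunc("/ciconfig", func(w http.ResponseWriter, r *http.Request) {
            var in payload
            if err := json.NewDecoder(r.Body).Decode(&in); err != nil {
                http.Error(w, err.Error(), http.StatusBadRequest)
                return
            }
            out := payload{}
            for _, f := range in.Configs {
                out.Configs = append(out.Configs, file{Name: f.Name, Data: convert(f.Data)})
            }
            w.Header().Set("Content-Type", "application/json")
            json.NewEncoder(w).Encode(out)
        })
        log.Fatal(http.ListenAndServe(":8000", nil))
    }

If I read the docs right, answering with HTTP 204 instead should tell Woodpecker to keep the original config.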

nmapx commented 1 week ago

Besides the fact that it sucks and screws up IDEs, it doesn't work (luckily) πŸ˜„ I've tested that quickly and Woodpecker tries to parse the file(s) before sending them further.

zc-devs commented 1 week ago

WP indeed only looks for yaml and yml config files

server/services/config/forge.go:116

tries to parse the file(s) before sending them further

server/services/setup.go:83
server/services/config/combined.go:36
server/pipeline/create.go:81
server/pipeline/create.go:94

will make a POST request to an external HTTP API sending ... all current config files retrieved from the repository

Didn't find that. It seems it only fetches configs from the external service in addition to configs from a forge.

Edit: OK, technically, there are config files in the request

    Fetch(ctx context.Context, forge forge.Forge, user *model.User, repo *model.Repo, pipeline *model.Pipeline, oldConfigData []*types.FileMeta, restart bool) (configData []*types.FileMeta, err error)

but they (oldConfigData) are nil at pipeline creation

    forgeYamlConfigs, configFetchErr := configService.Fetch(ctx, _forge, repoUser, repo, pipeline, nil, false)

zc-devs commented 1 week ago

https://github.com/woodpecker-ci/example-config-service

Archived. Please check out the new extensions sample which includes this config extension here

@woodpecker-ci/maintainers

  1. Is it supposed to be working at all?
  2. Shouldn't https://woodpecker-ci.org/docs/administration/advanced/external-configuration-api be removed or reworked?

@nmapx, you can try to make an addon. There is a config extension in the example. But I don't promise you anything πŸ˜„

qwerty287 commented 1 week ago

Is it supposed to be working at all?

It should, with the archived repo. However, that feature is currently being reworked in https://github.com/woodpecker-ci/woodpecker/pull/3349. Thus we have the new repo, but this is not merged/ready yet. Docs will be updated together with the PR.

nmapx commented 1 week ago

@zc-devs Thanks for all the effort you put into understanding my issue, really appreciate it! I think I will create a plugin/addon of some sort in the future, but since I have no time now I was looking for some other approach in the meantime. And... I found one. Maybe it's not exactly the same thing, but I can achieve similar things this way and the UX won't suck as much.

Let me describe it briefly. As a reminder, there are basically two main requirements:

  1. the ability to reuse parts of pipelines (each partial stored in a separate file in a complex directory structure) within the organization, but with the possibility to parameterize each one
  2. the ability to do some simple programming (I mean ifs/loops etc.; simple stuff, but more than YAML's built-in string functions have to offer and less than a full-scale application requires)

The first one I will be able to achieve using Python's Jinja, or Mustache + mustache-cli, or even pure yq commands or kustomize from Kubernetes; all of them provide YAML templating one way or another. The downside is that instead of "on the fly" conversion, it will need to be done in a git hook (pre-commit or pre-push, whatever) by the developer. This way the repo will always contain ready-to-use pipeline(s).
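
Just to illustrate the idea of rendering a parameterized partial into a ready-to-use pipeline before committing, here is a tiny sketch using Go's text/template instead of the tools above; the partial, the parameter names and the plugins/slack image are made up for the example:

    package main

    import (
        "os"
        "text/template"
    )

    // A hypothetical partial, normally stored as its own file in the shared
    // directory structure; the placeholders are the per-repo parameters.
    const notifyPartial = `steps:
      notify-{{ .Name }}:
        image: {{ .Image }}
        settings:
          channel: {{ .Channel }}
    `

    func main() {
        t := template.Must(template.New("partial").Parse(notifyPartial))
        // Parameters a repository would supply when rendering its pipeline,
        // e.g. from a pre-commit hook.
        params := map[string]string{
            "Name":    "slack",
            "Image":   "plugins/slack",
            "Channel": "#builds",
        }
        if err := t.Execute(os.Stdout, params); err != nil {
            panic(err)
        }
    }

Jinja, Mustache or yq would play exactly the same role; the point is only that the rendered YAML ends up committed to the repo.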

The second one is trickier, since I need real programming within the pipeline runtime. Luckily I have the ability to run a binary from the shell that will work like a proxy for what I want to do. Example? A Slack notification whose message I want to build "on the fly" depending on the runtime context will be built and sent by e.g. a Go application run from bash instead of a workflow plugin. Of course it's like adding another layer, which I'm not a big fan of, but in this situation there are some pros too: I'm doing real programming there, not writing some sketchy YAML syntax (Starlark or Jsonnet are better, but still, compared to Go, Rust or any other serious player... they all suck eventually πŸ˜„).
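
A minimal sketch of that notifier, assuming Woodpecker's usual CI_* environment variables are available in the step and a SLACK_WEBHOOK secret is injected; the variable names and the message format are just illustrative:

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "net/http"
        "os"
    )

    func main() {
        // Build the message from the runtime context exposed by Woodpecker.
        msg := fmt.Sprintf("pipeline for %s finished with status %q (commit %s)",
            os.Getenv("CI_REPO"),
            os.Getenv("CI_PIPELINE_STATUS"),
            os.Getenv("CI_COMMIT_SHA"))

        // Slack incoming webhooks accept a simple {"text": "..."} payload.
        body, err := json.Marshal(map[string]string{"text": msg})
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }

        resp, err := http.Post(os.Getenv("SLACK_WEBHOOK"), "application/json", bytes.NewReader(body))
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        defer resp.Body.Close()
    }

Compiled once and called from a plain shell step, this keeps the "real programming" out of the YAML entirely.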

@qwerty287 Well, this is very useful if you indeed have a need for an extension API. But I don't think it will solve this particular case. As I mentioned earlier, Woodpecker would need to allow workflows in other languages as well to make it a powerful feature.

anbraten commented 1 week ago

I've tested that quickly and Woodpecker tries to parse the file(s) before sending them further.

I actually thought that was working. At least when I tested it months ago, I set the config path to sth like compile.lua and the external config HTTP service was able to use that file to return a generated yaml file.

zc-devs commented 1 week ago

workflow plugin

Oh, I see now. They ~suck~ are useless. Also, you can't transfer state from step to step.

a Go application run from bash instead of a workflow plugin

So, you run a general step with a shell. Me too. I also thought about Python, but stuck to POSIX shell only; it is available in most containers.

zc-devs commented 1 week ago

I actually thought that was working but they (oldConfigData) are nil at pipeline creation

It was currentFileMeta

    FetchConfig(ctx context.Context, repo *model.Repo, pipeline *model.Pipeline, currentFileMeta []*forge_types.FileMeta, netrc *model.Netrc) (configData []*forge_types.FileMeta, useOld bool, err error)

before #915, if it helps.

nmapx commented 3 days ago

If I understood correctly, it was supposed to work some time ago, right? And now it doesn't. If so, is it possible to make it work as it used to? Then I would create an API extension that does all the magic :)