Open cynicaljoy opened 9 months ago
Hi @cynicaljoy. I am unable to reproduce the bug. Your script was very helpful for getting everything set up. I was able to import and then run `pulumi pre --diff` and see no changes. Which version of Nomad are you running against? I am on v1.6.2.
P.S. The run script has a typo:
-export NOAD_ADDRESS="http://127.0.0.1:4646/"
+export NOMAD_ADDRESS="http://127.0.0.1:4646/"
Ah, sorry about that! I'm using Nomad 1.5.3.
I just went through the same steps with Nomad 1.6.2 and repro'd 😕
I recreated it with Docker too:
Pretty much the same steps, but with a few extras, since I didn't take the time to automate everything and the nomad image doesn't have `jq` in it.
Same result as before:
Thanks again @cynicaljoy. docker-compose is handy; we should probably set this up for running examples locally. It looks like something nefarious is going on with `id`, which is a special property, not a regular property. My team will get back to you on debugging this as time permits.
`PULUMI_DEBUG_GRPC="$PWD/here.json"` logs from the import could be useful, as well as checking `pulumi stack export` to see how this resource writes itself into the state file.
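For the state-file check, a short stdlib-only script can pull the AclPolicy entry out of an exported checkpoint and put the engine-level `id` next to the declared name (a sketch: it assumes the `deployment.resources` layout that `pulumi stack export` produces, and the field names shown below):

```python
import json

def acl_policy_entries(state_path):
    """Return Nomad AclPolicy entries from a `pulumi stack export` dump,
    pairing the engine-level id with the declared/observed name."""
    with open(state_path) as f:
        state = json.load(f)
    entries = []
    # Exported checkpoints nest resources under deployment.resources.
    for res in state.get("deployment", {}).get("resources", []):
        if res.get("type") == "nomad:index/aclPolicy:AclPolicy":
            entries.append({
                "urn": res.get("urn"),
                "id": res.get("id"),
                "input_name": res.get("inputs", {}).get("name"),
                "output_name": res.get("outputs", {}).get("name"),
            })
    return entries
```

If the stored `id` doesn't line up with the `name` input after an import, that would be consistent with the replacement diff reported in this issue.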
Unfortunately, it looks like this issue hasn't seen any updates in a while. If you're still encountering this problem, could you leave a quick comment to let us know so we can prioritize it? (Commenting will bump it back into our triage queue.)
@mjeffryes we're still waiting on a fix/workaround for the issue.
Thank you for verifying. We will take a look as soon as we can.
What happened?
After importing my Nomad AclPolicy into my stack and running a `pulumi pre --diff`, I noticed it was ending up in a state where it was going to be replaced. I sandboxed this in a local environment and ran the `pulumi up`, and it didn't just replace the ACL policy, it actually destroyed it. Luckily, running a `pulumi refresh` and `pulumi up` recreates it, and the associated tokens are still intact. But it shouldn't be necessary to jump through those hoops; I'd expect an imported AclPolicy to have no diffs.

I tried adding `ignore_changes=["id"]` to the resource options; that didn't help.

Example
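For reference, the resource options attempt looked roughly like this (a sketch, not the exact program: the resource name, policy name, and description are stand-ins, it assumes the `pulumi_nomad` Python SDK, and it needs a Pulumi engine and provider to actually run):

```python
import pulumi
import pulumi_nomad as nomad

policy = nomad.AclPolicy(
    "full-access-policy",                      # Pulumi resource name (stand-in)
    name="full-access",                        # ACL policy name in Nomad (stand-in)
    rules_hcl=open("full-access.hcl").read(),  # the HCL rules file shown below
    opts=pulumi.ResourceOptions(ignore_changes=["id"]),  # the attempt that didn't help
)
```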
full-access.hcl
```hcl
namespace "*" {
  policy       = "write"
  capabilities = ["alloc-node-exec"]
}

agent {
  policy = "write"
}

operator {
  policy = "write"
}

quota {
  policy = "write"
}

node {
  policy = "write"
}

host_volume "*" {
  policy = "write"
}
```

Output of `pulumi about`:
CLI
Version      3.103.1
Go Version   go1.21.6
Go Compiler  gc

Plugins
NAME    VERSION
aws     6.19.0
nomad   2.1.0
python  unknown

Host
OS       darwin
Version  14.2.1
Arch     x86_64
This project is written in python: executable='/usr/local/share/mise/installs/python/3.9/bin/python3' version='3.9.18'
Current Stack: nomad-acls-sandbox/local
TYPE                              URN
pulumi:pulumi:Stack               urn:pulumi:local::nomad-acls-sandbox::pulumi:pulumi:Stack::nomad-acls-sandbox-local
pulumi:providers:nomad            urn:pulumi:local::nomad-acls-sandbox::pulumi:providers:nomad::default_2_1_0
nomad:index/aclPolicy:AclPolicy   urn:pulumi:local::nomad-acls-sandbox::nomad:index/aclPolicy:AclPolicy::full-access-policy
Found no pending operations associated with local
Backend
Dependencies:
NAME           VERSION
pip            23.3.2
pulumi_aws     6.19.0
pulumi_nomad   2.1.0
setuptools     69.0.3
wheel          0.42.0
Additional context
No response
Contributing
Vote on this issue by adding a 👍 reaction. To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).