Closed · yishait-tailor closed this 1 week ago
Hey @yishait-tailor. Thanks for reporting this bug. The provider definitely shouldn't use that much memory. Just to clarify, what do you mean by the "first run"? Are you saying that when you create a new stack with datadog, the first `pulumi up` uses 10GB, or that the initial run of `pulumi up` in each container takes 10GB?

Can you check how much memory `go build` uses to build your Pulumi program?
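On Linux, one quick way to see the peak is GNU time's verbose mode (a sketch; `/usr/bin/time` here is the GNU time binary, not the shell builtin):

```sh
# Prints "Maximum resident set size" (in kB) among other stats.
/usr/bin/time -v go build ./...
```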
Hey @iwahbe! Thanks for the quick response. By the looks of it, running `go build` is what causes this memory consumption: first run = if `go build` has not run yet on the container, it will cause the issue.

> Are you saying that when you create a new stack with datadog, the first pulumi up uses 10gb

When pulumi runs `go build` in the background, it causes the issue.

> Are you saying that the initial run of pulumi up in each container takes 10gb?

Yes. The workaround is to create the docker image with the go binary file produced by running `go build` and to change Pulumi.yaml to use that binary (a sketch of such an image follows the config below), but we would prefer to avoid doing that...
```yaml
runtime:
  name: go
  options:
    binary: go-binary-file
```
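For illustration, the image would look roughly like this (base images, tags, and paths are just examples):

```dockerfile
# Stage 1: compile the Pulumi program once, at image build time.
FROM golang:1.21 AS build
WORKDIR /app
COPY go.mod go.sum ./
RUN go mod download
COPY . .
RUN go build -o go-binary-file .

# Stage 2: ship the pulumi CLI plus the prebuilt binary, so
# `pulumi up` never has to run `go build` inside the container.
FROM pulumi/pulumi-base:latest
WORKDIR /app
COPY Pulumi.yaml ./
COPY --from=build /app/go-binary-file ./
```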
Thanks!
Hey @yishait-tailor. That clarifies things nicely. The problem is that the Go SDK for pulumi-datadog is simply too big, which makes compiling it expensive.
I'll take a look at reducing the number of types we export.
This is related to https://github.com/pulumi/pulumi-terraform-bridge/issues/1468.
Describe what happened
Running `pulumi preview` on any resource in the datadog SDK:

- with v4 (github.com/pulumi/pulumi-datadog/sdk/v4/go/datadog) consumes more than 10GB of memory on the first run
- with v3 (github.com/pulumi/pulumi-datadog/sdk/v3/go/datadog) consumes more than 4GB of memory

This is using the examples only, with no changes, tested on several platforms with the same behavior, using s3 as the backend without a pulumi account.
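For reference, a minimal program in the spirit of the upstream examples is enough to trigger it (the monitor fields here are illustrative):

```go
package main

import (
	"github.com/pulumi/pulumi-datadog/sdk/v4/go/datadog"
	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		// A single trivial resource is enough; the memory is consumed
		// while `go build` compiles the SDK, before any API calls.
		_, err := datadog.NewMonitor(ctx, "example", &datadog.MonitorArgs{
			Name:    pulumi.String("example monitor"),
			Type:    pulumi.String("metric alert"),
			Message: pulumi.String("CPU usage is high"),
			Query:   pulumi.String("avg(last_5m):avg:system.cpu.user{*} > 90"),
		})
		return err
	})
}
```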
Is there any way to avoid this?
Sample program
Log output
Affected Resource(s)
every initial `pulumi preview` run

Output of `pulumi about`
Additional context
I'm trying to use this SDK in CI on ephemeral containers, and I don't want to allocate such a high amount of memory just for the first run. This happens with all Datadog SDK resources, and not with other pulumi SDKs.
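A possible stopgap, assuming the memory is mostly the Go compiler's own heap: lowering GOGC makes the toolchain collect garbage more often, trading build time for a smaller peak. A sketch:

```sh
# GOGC also tunes the garbage collector of the Go toolchain processes;
# a low value means more frequent collections and lower peak memory,
# at the cost of a slower build.
GOGC=25 go build ./...
```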
Contributing
Vote on this issue by adding a 👍 reaction. To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).