pulumi / pulumi-command


[aws instance] When I set the delete attribute I receive an error: Unable to set 'PULUMI_COMMAND_STDOUT' #430

Closed: gawsoftpl closed this issue 1 month ago

gawsoftpl commented 1 month ago

What happened?

Pulumi TypeScript

I am trying to install a backup script via SSH on my AWS EC2 instance running Ubuntu.

@pulumi/command: 0.10.0

  1. I set the delete attribute on a remote command.
  2. I run pulumi up and everything works.
  3. When I run pulumi destroy, I get the following error:

error: Unable to set 'PULUMI_COMMAND_STDOUT'. This only works if your SSH server is configured to accept these variables via AcceptEnv. Alternatively, if a Bash-like shell runs the command on the remote host, you could prefix the command itself with the variables in the form 'VAR=value command'
error: Unable to set 'PULUMI_COMMAND_STDERR'. This only works if your SSH server is configured to accept these variables via AcceptEnv. Alternatively, if a Bash-like shell runs the command on the remote host, you could prefix the command itself with the variables in the form 'VAR=value command'
error: EOF: running "/bin/sh ./backup-delete.sh":

Example

const result = new command.remote.Command(
    `init-backup-ssh`,
    {
        connection: ...,
        create: '/bin/sh ./backup-init.sh',
        "delete": `/bin/sh ./backup-delete.sh`,
    },
);

Output of pulumi about

CLI
Version      3.104.2
Go Version   go1.21.6
Go Compiler  gc

Plugins
NAME        VERSION
command     0.10.0
command     0.9.2
kubernetes  4.8.0
nodejs      unknown

Host
OS       ubuntu
Version  23.10
Arch     x86_64

This project is written in nodejs: executable='/bin/node' version='v20.12.2'

Additional context

No response

Contributing

Vote on this issue by adding a 👍 reaction. To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).

gawsoftpl commented 1 month ago

I resolved it. The issue was with apt update and apt install. I moved the command below into an additional SSH command and everything works fine:

sudo apt update && sudo apt install -y
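
For reference, a minimal TypeScript sketch of that split, assuming the apt step was previously part of backup-init.sh; the resource names and the config-based connection shown here are hypothetical, so substitute whatever the stack already uses:

import * as pulumi from "@pulumi/pulumi";
import * as command from "@pulumi/command";

// Hypothetical connection details; the original report elides them ("connection: ...").
const cfg = new pulumi.Config();
const connection = {
    host: cfg.require("host"),
    user: "ubuntu",
    privateKey: cfg.requireSecret("privateKey"),
};

// Run the package installation as its own resource instead of inside backup-init.sh.
const installBackupDeps = new command.remote.Command("install-backup-deps", {
    connection,
    create: "sudo apt update && sudo apt install -y", // package list elided, as in the comment above
});

// The original create/delete pair, now without the apt step.
const initBackup = new command.remote.Command("init-backup-ssh", {
    connection,
    create: "/bin/sh ./backup-init.sh",
    delete: "/bin/sh ./backup-delete.sh",
}, { dependsOn: [installBackupDeps] });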

mjeffryes commented 1 month ago

Glad you were able to figure it out, @gawsoftpl!

gawsoftpl commented 1 month ago

But is there a solution for setting the create and delete commands without setting up AcceptEnv in /etc/ssh/sshd_config?

t0yv0 commented 1 month ago

I was running into this today. Unfortunately it looks like workarounds are fairly specific to the AMI and the Linux version you are using on your EC2 instance. For Amazon Linux, I was able to work around it by adjusting the SSH server configuration at startup like so:


name: imds-v2
runtime: yaml
description: Test the ability of pulumi-aws to authenticate on an EC2 instance with IMDSv2 enabled
config:
  pulumi:tags:
    value:
      pulumi:template: aws-yaml

variables:
  ec2ami:
    fn::invoke:
      function: aws:ec2:getAmi
      arguments:
        filters:
          - name: name
            values: ["amzn2-ami-hvm-*-x86_64-*"]
        owners:
          - amazon
        mostRecent: true
      return: id

resources:

  segroup:
    type: aws:ec2:SecurityGroup
    properties:
      ingress:
        - protocol: tcp
          fromPort: 80
          toPort: 80
          cidrBlocks: ["0.0.0.0/0"]
        - protocol: tcp
          fromPort: 22
          toPort: 22
          cidrBlocks: ["0.0.0.0/0"]

  priv-key:
    type: tls:PrivateKey
    properties:
      algorithm: RSA
      rsaBits: 2048

  key-pair:
    type: aws:ec2/keyPair:KeyPair
    properties:
      publicKey: ${priv-key.publicKeyOpenssh}

  inst:
    type: aws:ec2/instance:Instance
    properties:
      ami: ${ec2ami}
      instanceType: t2.micro
      keyName: ${key-pair.keyName}
      metadataOptions:
        httpTokens: required
        httpEndpoint: enabled
        httpPutResponseHopLimit: 1
      vpcSecurityGroupIds:
        - ${segroup}
      userData: |
        #!/bin/bash
        cat /etc/ssh/sshd_config >/tmp/sshd_config
        echo "AcceptEnv PULUMI_COMMAND_STDOUT" >> /tmp/sshd_config
        echo "AcceptEnv PULUMI_COMMAND_STDERR" >> /tmp/sshd_config
        sudo cp /tmp/sshd_config /etc/ssh/sshd_config || echo "FAILED to set sshd_config"
        rm /tmp/sshd_config
        sudo systemctl restart sshd.service

  init-cmd:
    type: command:remote:Command
    properties:
      create: pwd
      # SSH connection details to the remote machine
      connection:
        host: ${inst.publicIp}
        user: ec2-user # The default user for Amazon Linux AMI
        privateKey: ${priv-key.privateKeyOpenssh}

outputs:
  instanceId: ${inst.id}
  publicIp: ${inst.publicIp}
  commandOut: ${init-cmd.stdout}
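
For a TypeScript program like the one in the original report, the same AcceptEnv adjustment could be sketched roughly like this (key pair and security group wiring are omitted, and the resource names are hypothetical):

import * as aws from "@pulumi/aws";

// Look up the same Amazon Linux 2 AMI as the YAML program above.
const ami = aws.ec2.getAmiOutput({
    filters: [{ name: "name", values: ["amzn2-ami-hvm-*-x86_64-*"] }],
    owners: ["amazon"],
    mostRecent: true,
});

// User data runs as root, so sshd_config can be appended to directly before restarting sshd.
const acceptEnvUserData = `#!/bin/bash
echo "AcceptEnv PULUMI_COMMAND_STDOUT" >> /etc/ssh/sshd_config
echo "AcceptEnv PULUMI_COMMAND_STDERR" >> /etc/ssh/sshd_config
systemctl restart sshd.service
`;

const inst = new aws.ec2.Instance("inst", {
    ami: ami.id,
    instanceType: "t2.micro",
    userData: acceptEnvUserData,
});

export const publicIp = inst.publicIp;

A command.remote.Command pointed at that instance's publicIp should then be able to set the PULUMI_COMMAND_* variables during both create and delete.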