StackStorm-Exchange / stackstorm-aws

st2 content pack containing Amazon Web Services integrations.
https://exchange.stackstorm.org/
Apache License 2.0

Discussion on the future of the AWS pack #55

Open warrenvw opened 7 years ago

warrenvw commented 7 years ago

Overview

This issue exists to provide a forum for discussion. We need to come to a consensus on how to move forward with the AWS pack. Make your opinion known below, or forever hold your peace. :)

NOTE: Please add new comments instead of editing this one. Let me know in a comment if you'd like me to make any changes here.

Current Problems

Today, the AWS pack has a few pain points:

Reference:

Solutions

As a community, we need to decide which of the following options provides the best possible resolution to the above pain points. Are there any other options?

There is a desire to have only one long-lived branch within any pack.

Option 1: One pack with all actions

This is what we have today on the master branch.

Pros: There's one AWS pack. Cons: No one uses even close to all the 3581 available actions. Lots of bloat.

StackStorm currently has performance issues related to the installation/registration of packs with a large number of actions.

Option 2: One pack per AWS service

Pros: Each pack contains all actions for its service. Users only install the packs they need. Each pack could have its own icon. Cons: There are currently 111 AWS services, while there are ~130 existing packs in StackStorm-Exchange. This means almost half of all packs would be related to AWS.
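
As a rough sanity check on that "111 services" figure, botocore ships a model for every service it supports, so counting them takes only a couple of lines of Python. A hedged sketch, assuming a reasonably recent botocore (the count grows with every release):

    # Count the AWS services botocore knows about, i.e. roughly how many
    # per-service packs option 2 would create (~111 at the time of writing).
    import botocore.session

    services = botocore.session.get_session().get_available_services()
    print(len(services))         # grows with each botocore release
    print(sorted(services)[:4])  # e.g. ['acm', 'apigateway', ...]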

Perhaps introduce the concept of a "sub pack", if it doesn't exist already. Even for smaller packs, it could be really nice for organization and even for granting permissions.

The action generator st2packgen.py and the lib/ folder will need to be shared across all AWS packs (or sub packs). Make stackstorm-aws available on PyPI and include it in all AWS packs.

We can ensure the documentation describes how to auto-generate new actions when they're made available by boto. This way we can ship the common actions in the pack, and if people need any missing action, they can generate it on their own using the provided tooling (st2packgen.py), as sketched below.
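
For illustration, a hedged sketch of the enumeration step such tooling relies on, using only boto3's public service model (st2packgen.py is the pack's real generator; the already_packaged set below is a hypothetical stand-in for the actions a pack already ships):

    # List every operation boto3 exposes for a service, so a generator can
    # emit StackStorm action metadata for whatever a pack is missing.
    import boto3

    def list_operations(service, region="us-east-1"):
        """Return all API operation names boto3 knows for one AWS service."""
        client = boto3.client(service, region_name=region)
        return client.meta.service_model.operation_names

    already_packaged = set()  # hypothetical: parsed from the pack's actions/ dir
    missing = [op for op in list_operations("ec2") if op not in already_packaged]
    print(missing[:3])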

Option 3: One pack with a small number of generic actions

Pros: All boto3 actions are available as soon as they're published by AWS; we don't need to manually generate new actions when they're added. Provides the ability to assume roles and switch regions (see https://github.com/StackStorm-Exchange/stackstorm-aws/pull/44). Cons: Cannot filter based on action, and can't see pack metadata/expected input.
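
The core of option 3 fits in a few lines. A hedged sketch, assuming the action takes service/action_name/region/params inputs like the aws.boto3action from PR #44 (an illustration, not the PR's actual implementation):

    # Generic boto3 dispatch: any client method can be called by name, so new
    # AWS APIs work the moment boto3 ships them.
    import boto3

    def boto3action(service, action_name, region, params, credentials=None):
        """Run an arbitrary boto3 client method, e.g. ec2's create_vpc."""
        client = boto3.client(service, region_name=region, **(credentials or {}))
        return getattr(client, action_name)(**params)

    # Usage: equivalent to boto3.client("ec2", ...).describe_regions()
    regions = boto3action("ec2", "describe_regions", "us-east-1", {})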

warrenvw commented 7 years ago

We may decide that only the more common actions should be provided by each pack (or sub pack). If we say that each pack contains all actions, then it could easily be assumed that we actively ensure all actions provided by boto3 exist in the pack.

AndyMoore commented 7 years ago

I think option 2 is the best one here. In terms of AWS services, I'd expect the majority to not use more than 3-4, and so they'd only have to install the packs that correspond to their needs. As long as the pack management interface is well defined, it should be trivial to find the pack you require, so the number of packs shouldn't be an issue.

In my opinion option 3 is a non-starter. The way we (at Pearson) work is to build up a library of packs/actions, with the aim of allowing engineers to build workflows to suit their own requirements. This means them being aware of what's available in a single place (the st2 web/cli interface).

One of the main plus points of StackStorm is being able to combine discrete actions (that work independently) from several different packs (aws, kubernetes, consul, vault) into a coherent workflow. If we're going to have 10 aws.run_action actions inside a workflow, we might as well go back to a monolithic boto3 python script. An action should have a proper, discrete and defined purpose (i.e. it doesn't just enable another action, and isn't generic) and be well documented so that anyone else can pick it up and understand it.

As an aside, as well as the st2 performance issues, the current situation is also an issue on GitHub when searching through actions. It's just a bit unwieldy as it is currently ("Sorry, we had to truncate this directory to 1,000 files. 2,586 entries were omitted from the list.").

mickmcgrath13 commented 7 years ago

This could accommodate option 2: https://github.com/StackStorm/st2/issues/3698#issuecomment-324449525

...because you could essentially create your own pack and name your dependencies.

Kami commented 7 years ago

I'm also for option 2) - it makes the most sense.

If needed, we can also split common functionality used across services and packs into a common Python library which we publish on PyPi so we can re-use code.

lakshmi-kannan commented 7 years ago

I am also for option (2). We just have to make sure we name the packs appropriately. As long as we agree on the pack names and each pack is well documented, we should be good. The problem I see is when we have something like boto4. We might have to edit these packs individually to upgrade them.

Wrt pack names, I think something like aws_s3, aws_ec2 can work but is kinda hard to type and an eyesore.

AndyMoore commented 7 years ago

the packs will be generated, whether it's boto2, boto3 or boto35 :)

my thinking for pack names was aws_<service> - I agree it's not pretty, but is there any automation that uses hyphens to pull out pack names?

as per @Kami's point, I think somewhere like PyPI would work, or an extra library hosted within git. As long as it can be included in requirements.txt it should be sufficient.

Susanthab commented 6 years ago

I just want to share my thoughts on this, as I've been using this approach extensively and am experiencing the benefits.

I like option #3 due to the reasons mentioned below:

- It is so simple to use. This pack needs only two actions: 1) assume role, 2) boto3action to interact with boto3 (see the sketch below).
- Developers can make use of the detailed boto3 documentation for each action. I do not think actions' metadata could provide that much detailed documentation. Even when I used the regular aws pack, I had to consult the boto3 documentation very often.
- There is no requirement to regenerate the pack whenever boto3 adds/modifies features, so pack maintenance needs minimal or zero effort.
- It avoids bugs injected by the pack generation.
- Since there are only two actions, st2 needs to load only those two actions, so it is very lightweight for the st2 server.
- Any software piece designed with an eye toward the future is very important. In that aspect, option #3 wins in my opinion, because the pack is protected from changes in boto3.
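
For concreteness, a hedged sketch of the assume-role half of that two-action pair, assuming it returns credentials shaped for boto3.client(**creds) (the pack's real action may differ):

    # Assume an IAM role via STS and return temporary credentials in the
    # keyword-argument form boto3.client() accepts.
    import boto3

    def assume_role(role_arn, session_name="st2-aws"):
        sts = boto3.client("sts")
        c = sts.assume_role(RoleArn=role_arn, RoleSessionName=session_name)["Credentials"]
        return {
            "aws_access_key_id": c["AccessKeyId"],
            "aws_secret_access_key": c["SecretAccessKey"],
            "aws_session_token": c["SessionToken"],
        }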

I can understand this is a very radical approach when compared to the other st2 packs. But ultimately option #3 delivers an extremely powerful way to interact with AWS via boto3.

Currently this pack is available in the boto3 branch and you can install it like below:

    st2 pack install aws=boto3

So, I'd like to suggest continuing this pack as an option, so that users have the choice to select which pack they like.

xydinesh commented 6 years ago

Disclaimer: I came up with option 3, due to limitations in the current (generated) aws pack.

I personally like option 3. When I was using the aws pack I didn't rely on filtering or action metadata from StackStorm. I was using boto3 prior to st2 and am comfortable with the boto3 documentation. Also, I like how option 3 lets users depend on boto3 configurations without having a separate st2 configuration for the aws pack. As a Python developer, I felt option 3 is the most Pythonic way to write an aws pack.

IMO, option 2 will be a maintenance nightmare. It will likely cause confusion on the user's end when figuring out which pack to use.

We don't use StackStorm at my current day job (MapAnything); however, if I were using the aws pack I would be using option 3, as it doesn't require a lot of changes to access boto3 actions from st2.

SinisterMinister commented 6 years ago

@AndyMoore

One of the main plus points of stackstorm is being able to combine discrete actions (that work independently) from several different packs (aws, kubernetes, consul, vault) into a coherent workflow.

Option 3 does not preclude this ability. For example:

      # The generic aws.boto3action here invokes EC2's create_vpc directly;
      # "params" becomes the keyword arguments for the underlying boto3 call.
      create_vpc:
        action: aws.boto3action
        input:
          service: ec2
          action_name: create_vpc
          region: <% $.vpc_config.region %>
          params: <% dict(CidrBlock => $.vpc_config.cidr, InstanceTenancy => $.vpc_config.instance_tenancy) %>
          credentials: <% $.credentials %>

        publish:
          vpc_id: <% task(create_vpc).result.result.Vpc.VpcId %>
          subnets: <% $.vpc_config.subnets.values() %>
          eips: <% $.vpc_config.eips.values() %>
          vpc_tags: <% $.vpc_config.tags %>
          region: <% $.vpc_config.region %>
          status_message: "create_vpc"
        on-success:
          - create_vpc_tags
          - create_subnets
          - create_igw
          - save_vpc_metadata
          - save_vpc_region_metadata
          - save_vpc_name_metadata

      # The tasks below persist the new VPC's metadata in Consul.
      save_vpc_metadata:
        action: consul.put
        input:
          key: <% $.consul_endpoint %>state/vpc/id
          value: <% $.vpc_id %>

      save_vpc_name_metadata:
        action: consul.put
        input:
          key: <% $.consul_endpoint %>state/vpc/name
          value: <% $.vpc_config.name %>

      save_vpc_region_metadata:
        action: consul.put
        input:
          key: <% $.consul_endpoint %>state/vpc/region
          value: <% $.region %>

If we're going to have 10 aws.run_action actions inside a workflow, we might as well go back to a monolithic boto3 python script.

That snippet above comes from a workflow where 11 of its 31 calls go to boto3action, and it would be silly to write it as a Python script.

An action should have a proper, discrete and defined purpose

The boto3action does have a "proper, discrete and defined purpose": executing boto3 commands directly.

...and be well documented so that anyone else can pick it up and understand it

Considering that all of us using it had never touched StackStorm before this project, I feel that the pack satisfies this goal.

@Susanthab

Developer can make use of detailed boto3 documentation about each action. I do not think actions’ metadata could provide that much detailed documentation. Even when I use the aws regular pack, I had to use boto3 documentation very often.

I think this is the main issue I had with the approach of options 1 and 2. If your knowledge of AWS is limited, you're gonna live in the AWS API docs while implementing your feature, and I'm not sure it should be ST2's job to emulate that with action documentation. Translating the API calls into the boto3action is straightforward, but the packs were unwieldy when it came to finding the commands I needed, and figuring out whether something was a boto2 vs boto3 action didn't help.

Since there are only two actions, st2 needs to load only those two actions thus it is very lightweight for the st2 server.

This was also quite frustrating for me, as it would take our action page 20s to load when using the AWS pack. It made my dev process quite painful.

@xydinesh

Also, I like how option 3 let user to depend on boto3 configurations without having separate st2 configuration for aws pack

This was another key feature of the boto3action: it works with the default AWS authN workflows, allowing us to use EC2 roles so that StackStorm could directly interface with the AWS metadata service when using the pack. Furthermore, it allowed for more complex role models where multiple assumption calls are made to get the correct AWS API access.
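
A hedged sketch of what such chained assumptions look like in plain boto3, where the first hop falls back to the default credential chain (e.g. an EC2 instance role) and each later hop uses the credentials the previous one returned; the role ARNs would be supplied by the caller:

    # Chain multiple sts:AssumeRole calls, starting from the default
    # credential chain (EC2 instance role, env vars, ~/.aws/credentials).
    import boto3

    def chain_roles(role_arns, session_name="st2-aws"):
        creds = {}  # empty kwargs -> boto3 uses the default credential chain
        for arn in role_arns:
            sts = boto3.client("sts", **creds)
            c = sts.assume_role(RoleArn=arn, RoleSessionName=session_name)["Credentials"]
            creds = {
                "aws_access_key_id": c["AccessKeyId"],
                "aws_secret_access_key": c["SecretAccessKey"],
                "aws_session_token": c["SessionToken"],
            }
        return creds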

IMO, option 2 will be a maintenance nightmare.

I do agree with the maintenance nightmare part. One of the issues we found with option 1 was that some of the commands weren't generated. That said, I'm not gonna be the guy maintaining it, so if someone doesn't mind supporting it, have at it.

Will likely to cause confusion on the users end to figure out what pack to use.

I could see this potentially being an issue, but the upside is that we're not pulling in the APIs for Mechanical Turk or some of the business tooling when all we're using is the EC2 API. Yes, option 3 covers this quite well, but if ST2 wants to keep things idiomatic to ST2, it's a much better approach than option 1 IMO.

Personally, I see no reason why we can't have both as they each have their merits. Perhaps option 2 allows for a small pack that lets us make raw boto3 calls? Then we can have the best of both worlds!

jjm commented 6 years ago

I've started looking into automating AWS via ST2, and the lack of a switch-role action in the master branch makes the aws pack not really useful for my use case.

I think option 3 seems the best; the number of actions currently in the AWS pack makes loading packs really slow.

How about we have a separate aws_boto3 pack which only has the required actions, along with some simple workflows, as a step forward? This branch (and this blog post) were a little hard to find, and this may allow the idea to progress and prevent any further confusion.

LindsayHill commented 6 years ago

@jjm yeah, I think you're right - have a separate aws_boto3 pack. It is a bit hidden right now. I can create the repo; can you then create a PR with the content you want in it?

jjm commented 6 years ago

@LindsayHill Sure, create the repo and I'll get the needed files out of this pack and into the new pack.

LindsayHill commented 6 years ago

Cool - repo created

jjm commented 6 years ago

@LindsayHill Thanks, I've started on the first PR for the new pack. I expect I'll get it finished mid-week, as I'm rather busy tomorrow.

PLPRASU commented 2 years ago

Hi, this is Prasanna. Whenever an instance is stopped, I want to automatically start the instance again; for this, I want to set a CloudWatch alarm and send a notification via SNS whenever the instance is stopped. How do I achieve this using StackStorm and the aws pack?