Outbound AWS calls will favor AVAILABLE origin locations with the best-suited grid intensity levels, based on information consumed via the GSF Carbon Aware SDK and/or API. Once the ideal target location is discovered, the request to AWS will be rewritten automatically in the background. The rewrite can be implemented in several ways, but in summary the approach is similar to load balancing, except that the target is determined by carbon intensity rather than traffic. This will happen not at the server level but at the browser level, using one or more of the following strategies.
HTTP Interceptor (Angular only)
Web Worker
Intercept and rewrite the request in a background web worker.
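To make the rewrite step concrete, here is a minimal framework-agnostic sketch in TypeScript. The region names, the intensity snapshot, and the function names are all hypothetical; in the Angular strategy this logic would live inside an HttpInterceptor, and in the worker strategy inside the worker's fetch handling.

```typescript
// Hypothetical carbon-intensity readings (gCO2eq/kWh) per AWS region,
// as they might be returned by the Carbon Aware SDK/API.
type IntensityMap = Record<string, number>;

// Pick the available region with the lowest reported grid intensity.
function lowestIntensityRegion(intensities: IntensityMap): string {
  return Object.entries(intensities).sort(([, a], [, b]) => a - b)[0][0];
}

// Rewrite an outbound AWS endpoint to target the chosen region.
// Assumes the common <service>.<region>.amazonaws.com host layout.
function rewriteEndpoint(url: string, intensities: IntensityMap): string {
  const target = lowestIntensityRegion(intensities);
  const u = new URL(url);
  u.hostname = u.hostname.replace(
    /^([^.]+)\.[a-z0-9-]+\.amazonaws\.com$/,
    `$1.${target}.amazonaws.com`
  );
  return u.toString();
}

// Fake snapshot: us-east-1 is dirtier than eu-west-1 at this moment.
const snapshot: IntensityMap = { 'us-east-1': 450, 'eu-west-1': 120 };
const rewritten = rewriteEndpoint(
  'https://s3.us-east-1.amazonaws.com/my-bucket/key',
  snapshot
);
```

In a real interceptor the snapshot would be fetched (and cached) from the Carbon Aware API rather than hardcoded, and the rewrite would only apply to hosts on an allow-list of carbon-aware services.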
This will also need to work with separate infrastructures.
AWS native REST API
Custom APIs
AWS serverless functions / API Gateway
Services running on EC2 instances
We don't use any, but it's something to consider.
Services running on Kubernetes
We don't use any, but it's something to consider.
Current Services:
These are the current services already integrated without carbon awareness. Probably best to start with what we have.
S3
OpenSearch
Vertigo
Cognito
Keyspaces (Cassandra)
GitHub
S3
Operates globally already.
Open Search
Running separate nodes in each region × AZ would be super expensive, unless the entire cluster was moved around…
Vertigo
I think Vertigo needs to be deployed to multiple regions to facilitate this. The idea would be, prior to making requests to Vertigo, to detect the carbon intensity levels of the covered regions/zones and choose the one with the best balance of clean energy and proximity to the user. This concept can also be replicated for the native AWS REST API.
This sort of fits hooks. Hooks could be used to augment the outbound request; an Angular interceptor could also be used.
It's really not a plugin but an alter hook: it alters the HTTP request before sending. Very similar in concept to what is being done in the Vulcan entity save scoutcard interceptor.
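To illustrate the alter-hook concept, here is a sketch of a generic pipeline where each registered hook gets a chance to mutate the outbound request before it is sent. The names and the hardcoded region swap are illustrative, not the actual druid hook API.

```typescript
// Shape of an outbound request as seen by the hook pipeline.
interface OutboundRequest {
  url: string;
  headers: Record<string, string>;
}

// An alter hook receives the request and returns a (possibly) modified copy.
type AlterHook = (req: OutboundRequest) => OutboundRequest;

const hooks: AlterHook[] = [];

function registerAlterHook(hook: AlterHook): void {
  hooks.push(hook);
}

// Run every hook in registration order before the request goes out.
function applyAlterHooks(req: OutboundRequest): OutboundRequest {
  return hooks.reduce((r, hook) => hook(r), req);
}

// A carbon-aware hook would swap in the lowest-intensity region; here the
// "cleanest" region is hardcoded purely for illustration.
registerAlterHook(req => ({
  ...req,
  url: req.url.replace('us-east-1', 'eu-west-1'),
}));

const altered = applyAlterHooks({
  url: 'https://s3.us-east-1.amazonaws.com/my-bucket',
  headers: {},
});
```

The same pipeline shape works whether the hooks run inside an Angular interceptor or a background web worker, which keeps the two strategies interchangeable.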
Planning
Prerequisites:
Flesh out the hook system
A hook is not necessarily needed; this could be done with Angular interceptors instead. It could even be built in isolation from druid to start, as a separate site/druid. That is worth considering to simplify things a bit.
Lowest carbon intensity algorithm
Might need to consider proximity to the user.
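One possible shape for that algorithm: normalize carbon intensity and estimated user latency to a common scale, then blend them with a carbon-favoring weight. The weight and the candidate numbers below are made up for illustration.

```typescript
// A candidate region with its current grid intensity (gCO2eq/kWh, from the
// Carbon Aware API) and an estimated round-trip time from the user.
interface Candidate {
  region: string;
  intensity: number;
  latencyMs: number;
}

// Normalize each metric to 0..1, then blend. carbonWeight = 1 ignores
// proximity entirely; carbonWeight = 0 ignores carbon. Lowest score wins.
function pickRegion(candidates: Candidate[], carbonWeight = 0.7): string {
  const maxIntensity = Math.max(...candidates.map(c => c.intensity));
  const maxLatency = Math.max(...candidates.map(c => c.latencyMs));
  const scored = candidates.map(c => ({
    region: c.region,
    score:
      carbonWeight * (c.intensity / maxIntensity) +
      (1 - carbonWeight) * (c.latencyMs / maxLatency),
  }));
  return scored.sort((a, b) => a.score - b.score)[0].region;
}

// eu-west-1 is farther from this user but much cleaner, so it wins at 0.7.
const best = pickRegion([
  { region: 'us-east-1', intensity: 450, latencyMs: 40 },
  { region: 'eu-west-1', intensity: 120, latencyMs: 110 },
]);
```

Tuning the weight is the real design question: too carbon-heavy and distant regions degrade the user experience; too latency-heavy and the whole exercise is pointless.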
Replicate Vertigo across regions and possibly multiple AZs
What is the impact on our bill?
Implement an extra AWS service that is region/AZ specific
This will be used as the second part of the proof of concept.
OpenSearch is too expensive to do this with.
Extract the AWS signing function out of OpenSearch and S3
Reduce duplication and increase sharing, now that we are going to be adding new service integrations that need to share the signing algorithm.
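A sketch of what the extracted shared module's interface might look like, with a stub in place of the real AWS Signature V4 computation. All names here are hypothetical; the point is that each service client takes the signer as a dependency instead of embedding its own copy.

```typescript
// Minimal request shape the shared signer operates on.
interface AwsRequest {
  method: string;
  host: string;
  path: string;
  headers: Record<string, string>;
  body: string;
}

// The shared signing contract every service integration consumes.
type Signer = (req: AwsRequest, service: string, region: string) => AwsRequest;

// Placeholder signer: a real implementation would compute a SigV4
// Authorization header; here we just stamp the credential scope so call
// sites can be written against the shared interface.
const stubSigner: Signer = (req, service, region) => ({
  ...req,
  headers: {
    ...req.headers,
    'x-signing-scope': `${region}/${service}/aws4_request`,
  },
});

// Example consumer: an S3 GET built on the injected signer.
function makeS3Get(sign: Signer, region: string) {
  return (bucket: string, key: string): AwsRequest =>
    sign(
      {
        method: 'GET',
        host: `${bucket}.s3.${region}.amazonaws.com`,
        path: `/${key}`,
        headers: {},
        body: '',
      },
      's3',
      region
    );
}

const getObject = makeS3Get(stubSigner, 'eu-west-1');
const req = getObject('my-bucket', 'file.txt');
```

Because the signer takes the region as a parameter, the carbon-aware rewrite and the signing step compose naturally: pick the region first, then sign for it.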
We could make the hook integration a spell instead of adding it to core. Think on it.
The highest priorities are replicating Vertigo across regions/AZs and setting up a service or two for direct communication that sits in a region/AZ. We also need to start fleshing out the implementation details of the algorithm for lowest intensity discovery.
API: https://grnsft.org/hack22/api
CarbonHack22: https://taikai.network/en/gsf/hackathons/carbonhack22
Hooks: https://github.com/rollthecloudinc/druid/issues/12
Was leaning toward https://www.serverless.com/plugins/serverless-multi-region-plugin but