cloudydeno / deno-aws_api

From-scratch TypeScript client for accessing AWS APIs
https://deno.land/x/aws_api

AWS Transfer / Future of this module? #48

Closed: metarama closed this issue 2 months ago

metarama commented 3 months ago

I would like to add AWS Transfer Family as a service. How do I go about doing this? How does 'codegen' work?

My immediate goal is to use an SFTP Connector to bring CSV files into a Supabase storage bucket using the S3 protocol.

Also, what is the general state of this module? Is it kept in sync with the latest aws-sdk? Or has it been abandoned, now that Deno supports more and more of Node and we could use aws-sdk v3 directly?

danopia commented 3 months ago

Hello,

The codegen is hosted at https://aws-api.deno.dev/

The latest codegen version, v0.4, uses aws-sdk-js definitions from v2.1323.0. If you would like to use a newer version, it can be specified in the URL, though I haven't verified codegen compatibility with newer versions. For example: https://aws-api.deno.dev/v0.4/sdk@v2.1625.0/
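So a pinned import would look something like this (I'm just following the URL pattern above; swap in whichever service module you need):

// pin the aws-sdk-js definition version directly in the import URL
import { Transfer } from "https://aws-api.deno.dev/v0.4/sdk@v2.1625.0/transfer.ts";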

Is this the AWS Transfer Family you are looking for? https://aws-api.deno.dev/v0.4/services/transfer.ts

The generated method list can be viewed here: https://doc.deno.land/https://aws-api.deno.dev/v0.4/services/transfer.ts%3Fdocs=full/~/Transfer

I intend to keep this module maintained for the foreseeable future. I personally don't intend to switch my projects to aws-sdk v3, due to its size and module count.

Apologies for being terse; I wanted to get a quick response back before going into my working day.

metarama commented 3 months ago

Thank you @danopia.

Question: Why is the AWS Transfer service not generated in the list at https://deno.land/x/aws_api@v0.8.1/services?doc= ?

Question: How do I initialize ApiFactory and/or the Transfer commands to use a Supabase storage bucket as an S3 bucket, given that Supabase supports the S3 protocol?

danopia commented 3 months ago

Question: Why is the AWS Transfer service not generated in the list at https://deno.land/x/aws_api@v0.8.1/services?doc= ?

The services/ folder in /x/aws_api contains only a very small hand-picked set of key APIs. I removed the vast majority of the APIs from there when I first set up https://aws-api.deno.dev because I didn't want to upload an excessive number of pre-generated files to /x/aws_api. So https://aws-api.deno.dev is how you obtain files for a particular API, and if you are worried about long-term stability, feel free to download the generated files and store them in your project repository.
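For example, after downloading the generated module into your repo (the local path here is just an example), the import becomes local:

// hypothetical vendored copy, committed alongside your code
import { Transfer } from "./vendor/aws-api/transfer.ts";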

BTW, I see that a lot of SFTP methods were added to the transfer API since the last time I bumped the default aws-sdk-js version. So this would be the URL you can import to get the latest transfer API surface as of today: https://aws-api.deno.dev/v0.4/sdk@v2.1625.0/transfer.ts
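A rough sketch of wiring that up, assuming the default AWS credential chain from your environment (untested; adjust import paths to your pinned versions):

import { ApiFactory } from "https://deno.land/x/aws_api/client/mod.ts";
import { Transfer } from "https://aws-api.deno.dev/v0.4/sdk@v2.1625.0/transfer.ts";

// ApiFactory reads AWS credentials and region from the environment by default
const transfer = new ApiFactory().makeNew(Transfer);

// e.g. list the account's SFTP connectors, one of the newer methods
const { Connectors } = await transfer.listConnectors();
console.log(Connectors);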

Question: How do I initialize ApiFactory and/or the Transfer commands to use a Supabase storage bucket as an S3 bucket, given that Supabase supports the S3 protocol?

I wrote up a page for S3 Compatible Vendors with examples for a few other vendors. I don't have an example for Supabase, but if you have success using it then I believe the wiki is updatable.

Your ApiFactory setup may look like this:

// import paths are illustrative; adjust to your pinned versions
import { ApiFactory, EnvironmentCredentials } from "https://deno.land/x/aws_api/client/mod.ts";
import { S3 } from "https://aws-api.deno.dev/v0.4/services/s3.ts";

// uses environment variables:
//   SUPABASE_PROJECT
//   SUPABASE_REGION
//   SUPABASE_ACCESS_KEY_ID
//   SUPABASE_SECRET_ACCESS_KEY
//   SUPABASE_SESSION_TOKEN (optional)
const projectId = Deno.env.get("SUPABASE_PROJECT");
const api = new ApiFactory({
  fixedEndpoint: `https://${projectId}.supabase.co/storage/v1/s3`,
  credentialProvider: new EnvironmentCredentials("SUPABASE"),
  region: Deno.env.get("SUPABASE_REGION"),
}).makeNew(S3);
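Untested on my end, but a quick smoke test of that client could look like:

// e.g. list the buckets in the Supabase project to check the wiring
const { Buckets } = await api.listBuckets();
console.log(Buckets?.map(bucket => bucket.Name));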

This is the first third-party endpoint I've looked at which includes a path prefix, so my code might not respect it. If you experience errors, please file them as a new issue :)

Oh, and if aws-sdk v3 fits your needs, then maybe the official SDK is the way to go. It depends on your priorities, of course.

metarama commented 3 months ago

Thank you @danopia.

By default, AWS Transfer uses AWS S3 for outgoing and incoming file transfers. I want to modify this default behavior so that the SFTP Connector lives in AWS but writes to Supabase S3 storage. So there are two sets of credentials: one for AWS and the other for Supabase S3.

Using ApiFactory, I want to create a Transfer API client that uses the AWS credentials to launch the SFTP Connector. But when this SFTP Connector invokes S3, I want it to use Supabase S3, which needs the different access key and secret provided by Supabase.

Do you think this is possible with aws_api?

danopia commented 3 months ago

I want to modify this default behavior so that the SFTP Connector lives in AWS but writes to Supabase S3 storage.

Ok so, I doubt that a managed AWS product would support using a third-party S3 endpoint. AWS very much likes keeping control of the involved services, for various reasons including stability and security. For example, AWS Transfer doesn't accept a static credential to access your Amazon S3, instead depending fully on IAM Roles.

If you would like to have your incoming files in Supabase storage, you could look at setting up a lambda function in AWS that copies the files over as they are received, for example. This may or may not be easier than only using Amazon S3 for your SFTP needs.

This /x/aws_api library can handle multiple credentials, by making multiple ApiFactory instances, but I don't think it'll let you achieve the cross-cloud setup you're looking for. I could be wrong, since I'm not familiar with AWS Transfer, but I'd recommend reading the AWS Transfer API or service documentation to understand what actions are available to you.
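For illustration, the multiple-credentials part on its own would look something like this; the bucket and key names are made up, and it's a rough, untested sketch:

import { ApiFactory, EnvironmentCredentials } from "https://deno.land/x/aws_api/client/mod.ts";
import { S3 } from "https://aws-api.deno.dev/v0.4/services/s3.ts";

// client 1: Amazon S3, using the default AWS credential chain
const awsS3 = new ApiFactory().makeNew(S3);

// client 2: Supabase storage, using the SUPABASE_-prefixed env credentials
const supabaseS3 = new ApiFactory({
  fixedEndpoint: `https://${Deno.env.get("SUPABASE_PROJECT")}.supabase.co/storage/v1/s3`,
  credentialProvider: new EnvironmentCredentials("SUPABASE"),
  region: Deno.env.get("SUPABASE_REGION"),
}).makeNew(S3);

// copy one received file across clouds (hypothetical bucket/key names)
const obj = await awsS3.getObject({ Bucket: "sftp-landing", Key: "report.csv" });
await supabaseS3.putObject({
  Bucket: "reports",
  Key: "report.csv",
  Body: obj.Body ?? undefined,
});

That copy step is essentially what the lambda idea above would do, just running wherever you choose to run it.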

Let me know if you have further questions about using this library.