jaydenseric / graphql-upload

Middleware and an Upload scalar to add support for GraphQL multipart requests (file uploads via queries and mutations) to various Node.js GraphQL servers.
https://npm.im/graphql-upload
MIT License
1.42k stars 131 forks

Support Google Cloud Functions #129

Open gcoda opened 5 years ago

gcoda commented 5 years ago

The solution from koa-busboy works on GCF with Express. Should I make a pull request out of this?

jaydenseric commented 5 years ago

Thanks for the suggestion! I'm not very familiar with Google Cloud Functions so I'm not in a great position to review the change.

@mike-marcacci any thoughts?

mike-marcacci commented 5 years ago

Hmmm, the workaround used in koa-busboy here is highly suboptimal: not only is it unrealistic to assume that the entire payload will always fit into memory, but by waiting for the entire request up front, we lose the ability to run resolvers concurrently with the request.

There may be a valid reason for this given the goals and architecture of Google Cloud Functions, but if this is the case, the workaround almost certainly belongs in their node package, to keep it consistent with other node libraries that expect a standard HTTPRequest object.

According to this issue this should already be the case. If it isn’t, though, a PR that writes the contents of rawBody to the request stream should be made there instead.

mike-marcacci commented 5 years ago

@gcoda just to make sure I’m understanding this correctly, this is to support the emulator, but production cloud functions work; is that correct?

According to Google’s docs on multipart requests this should be working the way it is.

gcoda commented 5 years ago

@mike-marcacci I've been using it on a live function for 5 days now.

You are correct: sometimes it will not fit, and there is no way of making a stream from the request, at least I have not found one yet, except uploading to storage and using a "finalize" trigger like they recommend in some tutorials.

gcoda commented 5 years ago

Lambda has event.body, and it looks like it needs to be handled the same way as Google's rawBody.

So, I need a middleware that makes the request a readable stream out of rawBody or Amazon's event.body?

```js
const { Readable } = require('stream')

app.use((req, res, next) => {
  if (req.rawBody) {
    // Build a fresh readable stream containing the buffered body.
    const readable = new Readable()
    readable._read = () => {} // no-op: data is pushed manually below
    readable.push(req.rawBody)
    readable.push(null)
    // Hack: copy the readable's own properties (including its internal
    // _readableState buffer) onto the already-consumed request, so
    // downstream stream consumers read the raw body instead.
    Object.assign(req, readable)
  }
  next()
})
```
mike-marcacci commented 5 years ago

@gcoda - that definitely looks like the "correct" solution... however, I do want to note that this library is highly optimized for the stream case, and uses the filesystem to buffer large payloads.

According to a comment in this code sample, the directory identified by os.tmpdir() is actually an in-memory filesystem in Google Cloud Functions. This means that we will essentially double the memory usage, since the payload already exists in its entirety in memory.

This library should work correctly with the middleware you wrote above, but optimizing this use case would really require a re-architecture, and probably belongs in a separate API-compatible package. While I don't need this personally (all my production uses of this run on k8s clusters), I would be more than happy to help should someone else champion such a project.

koresar commented 4 years ago

Hi @gcoda

I've recently forked this awesome package and added GCF support (taking your code as an example). I need someone to test it, though. Is there anyone out there who'd like to report whether this works?

https://www.npmjs.com/package/graphql-upload-minimal/v/1.1.0-0#google-cloud-functions-gcf

TODO:

Thanks!

karladler commented 4 years ago

Looks great! We're currently trying to get it to work with Azure Functions.