fsouza / fake-gcs-server

Google Cloud Storage emulator & testing library.
https://pkg.go.dev/github.com/fsouza/fake-gcs-server/fakestorage?tab=doc
BSD 2-Clause "Simplified" License

Resumable upload fails with 400 using nodejs client. #346

Open jeantil opened 3 years ago

jeantil commented 3 years ago

While working on #345 I was unable to make the default file.save(...) call work and was forced to add { resumable: false } to make it pass.

The error was:

Error: Upload failed
    at Upload.<anonymous> (/home/jean/dev/startups/yupwego/src/fake-gcs-server/examples/node/node_modules/gcs-resumable-upload/build/src/index.js:191:30)
    at Upload.emit (events.js:327:22)
    at Upload.onResponse (/home/jean/dev/startups/yupwego/src/fake-gcs-server/examples/node/node_modules/gcs-resumable-upload/build/src/index.js:388:14)
    at Upload.makeRequestStream (/home/jean/dev/startups/yupwego/src/fake-gcs-server/examples/node/node_modules/gcs-resumable-upload/build/src/index.js:337:14)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)
    at async Upload.startUploading (/home/jean/dev/startups/yupwego/src/fake-gcs-server/examples/node/node_modules/gcs-resumable-upload/build/src/index.js:219:13)

Running the script in a debugger shows that the following request

PUT https://[::]:4443/upload/resumable/c2bfc362c82655ce7c2c5074bd300d11
Content-Range: 'bytes 0-*/*', 
Authorization: 'Bearer ya29.c.KpMB4QfVy54ujAsnhGPSPjDDAGp…LPYC_aCWrzh57NuDr-91aMHax7j4d5nKTGdQFKC-T', User-Agent: 'google-api-nodejs-client/6.0.6', 
x-goog-api-client: 'gl-node/12.18.3 auth/6.0.6',
Accept: 'application/json'

results in:

400 Bad Request
'invalid Content-Range: bytes 0-*/*\n'
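For reference, this is roughly the workaround I ended up with (a minimal sketch, not the exact script from #345; the endpoint, project id, bucket and file names are placeholders, and talking HTTPS to the emulator assumes its self-signed certificate is accepted):

const { Storage } = require('@google-cloud/storage');

const storage = new Storage({
  apiEndpoint: 'https://localhost:4443', // placeholder; point at the emulator
  projectId: 'test-project',             // placeholder project id
});

async function upload() {
  const file = storage.bucket('sample-bucket').file('some-file.txt');
  // The default (resumable) upload fails with the 400 above, so force a
  // simple (non-resumable) upload instead:
  await file.save('file contents', { resumable: false });
}

upload().catch(console.error);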
gnarea commented 3 years ago

I ran into this issue, but instead of getting a 400, the file.save(...) call just hung. I fixed it by passing resumable: false.

gnarea commented 3 years ago

@fsouza, #497 didn't fix this issue; it just changed the failure mode. Consider this:

const { Storage } = require('@google-cloud/storage');

test('Bug', async () => {
  const bucketName = 'foo';
  const fileContents = Buffer.from('bar');

  const client = new Storage({
    apiEndpoint: 'http://127.0.0.1:8080',
    projectId: 'the id',
  });

  await client.createBucket(bucketName);

  const file = client.bucket(bucketName).file('key');
  await file.save(fileContents);
});

file.save() now fails with a more cryptic error:

FetchError: request to http://[::]:8080/upload/resumable/b5236acc2a828b98e362e7b70ee2cdf6 failed, reason: connect ECONNREFUSED :::8080

Server logs:

gcs_1    | time="2021-07-09T15:09:22Z" level=info msg="couldn't load any objects or buckets from \"/data\", starting empty"
gcs_1    | time="2021-07-09T15:09:22Z" level=info msg="server started at http://[::]:8080"
gcs_1    | time="2021-07-09T15:09:28Z" level=info msg="172.24.0.1 - - [09/Jul/2021:15:09:28 +0000] \"POST /storage/v1/b?project=the%20id HTTP/1.1\" 200 110"

Adding resumable: false fixes the issue.
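Concretely, the only change needed in the reproducer above is the save call:

  // instead of the default resumable upload...
  // await file.save(fileContents);
  // ...force a simple upload:
  await file.save(fileContents, { resumable: false });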

fsouza commented 3 years ago

Hm, I wonder where it's trying to connect to. I'll investigate later. Thanks for sharing a reproducer.

stoffeastrom commented 2 years ago

FYI, I just got hit by this as well. Doing the following:

diff --git a/fakestorage/upload.go b/fakestorage/upload.go
index ab50369..6c99254 100644
--- a/fakestorage/upload.go
+++ b/fakestorage/upload.go
@@ -350,9 +350,9 @@ func (s *Server) resumableUpload(bucketName string, r *http.Request) jsonRespons
    }
    s.uploads.Store(uploadID, obj)
    header := make(http.Header)
-   header.Set("Location", s.URL()+"/upload/resumable/"+uploadID)
+   header.Set("Location", s.PublicURL()+"/upload/resumable/"+uploadID)
    if r.Header.Get("X-Goog-Upload-Command") == "start" {
-       header.Set("X-Goog-Upload-URL", s.URL()+"/upload/resumable/"+uploadID)
+       header.Set("X-Goog-Upload-URL", s.PublicURL()+"/upload/resumable/"+uploadID)
        header.Set("X-Goog-Upload-Status", "active")
    }
    return jsonResponse{

i.e. changing s.URL() to s.PublicURL() ensures the correct URL is returned. However, after this I get Error: Retry limit exceeded, which I haven't investigated yet.

stoffeastrom commented 2 years ago

Just found the ExternalURL option, so there's no need for the above change :D
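Rough sketch of what that looks like from the Node side (values are illustrative; the point is that the client's apiEndpoint has to match the URL the server advertises in its resumable-upload Location / X-Goog-Upload-URL headers):

// start the server so it advertises a URL the client can actually reach, e.g.
//   fake-gcs-server -scheme http -external-url http://localhost:4443
const { Storage } = require('@google-cloud/storage');

const client = new Storage({
  apiEndpoint: 'http://localhost:4443', // must match the -external-url / ExternalURL value
  projectId: 'test-project',            // placeholder
});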

sergseven commented 2 years ago

Setting ExternalURL is the key to getting fake-gcs-server working with Testcontainers!

Just in case someone is looking for a solution for Testcontainers & Spring & JUnit 5:

  // Reserve a free host port up front so it can be baked into -external-url
  // before the container starts.
  static final int fakeGcsMappedPort = SocketUtils.findAvailableTcpPort();

  @Container
  static final GenericContainer fakeGcs = new FixedHostPortGenericContainer<>("fsouza/fake-gcs-server")
      .withExposedPorts(4443)
      .withFixedExposedPort(fakeGcsMappedPort, 4443)
      .withCreateContainerCmdModifier(cmd -> cmd.withEntrypoint(
          "/bin/fake-gcs-server",
          "-scheme", "http",
          "-external-url", "http://0.0.0.0:" + fakeGcsMappedPort));

  @DynamicPropertySource
  static void gcs(DynamicPropertyRegistry registry) {
    registry.add("property.for.gcs.storage.host", () -> "http://0.0.0.0:" + fakeGcs.getFirstMappedPort());
  }

The difficulty with Testcontainers is that the random mapped (external) port only becomes available after the generic container is up and running, so we can't know it beforehand. The trick is to pick a random port before the container is started (hence the use of FixedHostPortGenericContainer).

sergseven commented 2 years ago

As a follow-up: this doesn't work in CI environments where Docker containers run on a remote VM, so the "container IP" is different from localhost (0.0.0.0) and is only known once the container has eventually started.

A workaround for this case is to update the external-url of an already started fake-gcs-server container, which is proposed in #659.

@fsouza it would be nice if you could look into #659.