grpc / grpc-node

gRPC for Node.js
https://grpc.io
Apache License 2.0

Since version 1.10.9, grpc-node doesn't comply with the gRPC spec in case of a non-200 response #2822

Open RemiBou opened 1 month ago

RemiBou commented 1 month ago

Problem description

When the server returns HTTP 404 with "404 not found" in the body, grpc-node throws a RESOURCE_EXHAUSTED error.

Reproduction steps

Initialize your gRPC client against a non-gRPC server. It should throw an "UNIMPLEMENTED" error; instead it might throw "RESOURCE_EXHAUSTED". A minimal reproduction sketch is below.
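A minimal sketch of such a reproduction, assuming @grpc/grpc-js is installed and a plain HTTP server on localhost:8080 answers every request with 404; the method path and serializers here are placeholder values, since the server is not a real gRPC server anyway:

```js
const grpc = require('@grpc/grpc-js');

// Generic client pointed at a server that does not speak gRPC.
const client = new grpc.Client('localhost:8080', grpc.credentials.createInsecure());

client.makeUnaryRequest(
  '/example.Service/Method',   // hypothetical method path
  (arg) => arg,                // identity serializer (Buffer in)
  (buf) => buf,                // identity deserializer (Buffer out)
  Buffer.alloc(0),
  (err, _res) => {
    // Per the spec this should report UNIMPLEMENTED (12) for a 404 response;
    // since v1.10.9 it can report RESOURCE_EXHAUSTED (8) instead.
    console.log(err && err.code, err && err.details);
  }
);
```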

Environment

Additional context

I think it's the same issue as https://github.com/grpc/grpc-node/issues/2809. I want to add that this should be fixed in grpc-node, as it no longer complies with the gRPC spec. Reference: https://github.com/grpc/grpc/blob/master/doc/PROTOCOL-HTTP2.md

Implementations should expect broken deployments to send non-200 HTTP status codes in responses as well as a variety of non-GRPC content-types and to omit Status & Status-Message. Implementations must synthesize a Status & Status-Message to propagate to the application layer when this occurs.

When "404" is return by server then grpc status 12 and error UNIMPLEMENTED should be send to the client application.

The Java and Go gRPC libraries work that way, and grpc-node used to work that way until this commit: https://github.com/grpc/grpc-node/commit/674f4e351a619fd4532f84ae6dff96b8ee4e1ed3

murgatroid99 commented 1 month ago

OK, I see the problem: the RESOURCE_EXHAUSTED error from attempting to parse the response body is superseding the UNIMPLEMENTED error from the HTTP status.
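In other words, the fix needs a precedence rule. A hypothetical sketch of that rule follows; the function and its shape are illustrative only, not grpc-js internals:

```js
const grpc = require('@grpc/grpc-js');

// Hypothetical precedence logic: a status synthesized from a non-200 HTTP
// :status should win over any error raised while parsing the non-gRPC body.
function finalStatus(httpStatus, bodyParseError) {
  if (httpStatus !== 200) {
    // Broken deployment: ignore body-level errors entirely and map the
    // HTTP code per the spec (404 -> UNIMPLEMENTED, 502 -> UNAVAILABLE, ...).
    const code = httpStatus === 404 ? grpc.status.UNIMPLEMENTED
               : grpc.status.UNAVAILABLE; // simplified mapping for this sketch
    return { code, details: `Received HTTP status code ${httpStatus}` };
  }
  // Only for a proper 200 response should parser errors such as
  // RESOURCE_EXHAUSTED ("Received message larger than max") surface.
  return bodyParseError || { code: grpc.status.OK, details: '' };
}
```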

RemiBou commented 1 month ago

@murgatroid99 do you think it should be fixed?

murgatroid99 commented 1 month ago

Nothing has happened with this yet.

erlendnils1 commented 3 weeks ago

We are experiencing a similar issue. An HTTP 502 error from the load balancer used to produce an UNAVAILABLE error, but now throws RESOURCE_EXHAUSTED ("Received message larger than max") because the response body is misinterpreted as a gRPC message.

tysseng commented 3 weeks ago

For others experiencing the same: the reported message body sizes in the case mentioned by @erlendnils1 decode as follows (see the sketch after this list):

- 1633951815 = 0x61 0x64 0x20 0x47, i.e. 'ad G': the client received a plain 'Bad Gateway' body, whose first byte ('B') was consumed as the compression flag and whose next four bytes were read as the message length.
- 1013478509 = 0x3C 0x68 0x74 0x6D, i.e. '<htm': the client received an HTML error page from the GKE (Kubernetes) load balancer, which returns a message saying 'The server encountered a temporary error and could not complete your request. Please try again in 30 seconds.'
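A short sketch of why the numbers come out this way: gRPC frames each message as a 1-byte compression flag followed by a 4-byte big-endian length, so when a plain-text body is parsed as a frame, byte 0 becomes the flag and bytes 1-4 become the claimed message length. The leading newline before '<html' is an assumption for illustration:

```js
// gRPC length-prefixed framing misapplied to a text body: byte 0 is taken
// as the compression flag, bytes 1-4 as the big-endian message length.
function misreadLength(body) {
  return Buffer.from(body).readUInt32BE(1);
}

console.log(misreadLength('Bad Gateway'));   // 1633951815 ('ad G')
// Assuming the GKE error page begins with a newline before '<html':
console.log(misreadLength('\n<html> ...')); // 1013478509 ('<htm')
```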