vmagamedov / grpclib

Pure-Python gRPC implementation for asyncio
http://grpclib.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Generated services are not backwards compatible #168

Open njooma opened 1 year ago

njooma commented 1 year ago

Additive-only changes to protobuf files result in generated code that is not backwards compatible.

If I add RPC methods to my protobuf files, compiling them with grpclib adds corresponding abstract methods to the generated abstract ...ServiceBase class. My concrete implementations of that service then have to be updated with the new methods (even if only to raise a NotImplementedError), otherwise instantiating them fails with a TypeError for not implementing all methods of the abstract class.
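For example, using the helloworld Greeter service as a stand-in and assuming a hypothetical new SayHello2 RPC was added upstream, recompiling leaves an existing implementation uninstantiable (a minimal sketch, not actual generated code):

from grpclib.server import Stream

from helloworld_pb2 import HelloRequest, HelloReply  # generated from helloworld.proto
from helloworld_grpc import GreeterBase              # abstract base generated by grpclib


class Greeter(GreeterBase):
    # Only the original RPC is implemented; the newly generated
    # abstract method SayHello2 is not overridden here.

    async def SayHello(self, stream: Stream[HelloRequest, HelloReply]) -> None:
        request = await stream.recv_message()
        assert request is not None
        await stream.send_message(HelloReply(message=f'Hello, {request.name}!'))


Greeter()  # TypeError: Can't instantiate abstract class Greeter with abstract method SayHello2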

This has become an issue because I'm implementing RPC methods defined by an upstream library so that I can be compatible with that library's users. However, I can't (or choose not to) implement every new method as soon as the upstream maintainers publish it -- I want to pick and choose. But as soon as I recompile the library's protobuf files with grpclib, I have to update all of my implemented services with the new methods.

To solve this, I propose updating the grpclib protoc plugin to generate an additional class, Unimplemented...ServiceBase: a concrete subclass of the abstract ...ServiceBase in which every method has a default implementation that raises a NotImplementedError. Current users of grpclib who rely on the TypeError to tell them they haven't satisfied the abstract class can keep their workflow, while anyone who wants to be automatically backwards compatible can inherit from the new Unimplemented...ServiceBase instead and override methods as they implement them.
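For concreteness, a rough sketch of what the plugin could generate for the helloworld Greeter service (names and bodies are illustrative only, not actual plugin output):

import abc

from grpclib.server import Stream

from helloworld_pb2 import HelloRequest, HelloReply


class GreeterBase(abc.ABC):
    # Abstract base, as generated today: every RPC is an abstract method.

    @abc.abstractmethod
    async def SayHello(self, stream: Stream[HelloRequest, HelloReply]) -> None:
        pass


class UnimplementedGreeterBase(GreeterBase):
    # Proposed addition: a concrete subclass in which every RPC has a default
    # body, so recompiling after upstream adds new RPCs cannot break
    # subclasses that have not caught up yet.

    async def SayHello(self, stream: Stream[HelloRequest, HelloReply]) -> None:
        raise NotImplementedError('Method SayHello is not implemented')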

Another option would be to make the ...ServiceBase class itself concrete, with default implementations that simply raise NotImplementedError (this is what the main grpc library does). But that could break existing workflows.

I can also make a pull request with the changes if you decide that this should be implemented.

vmagamedov commented 1 year ago

Is it possible to make that upstream library a dependency of your library, so you can specify a concrete range of versions that you support?

I was also able to come up with this example:

from abc import ABCMeta
from typing import Any

from grpclib.const import Status
from grpclib.exceptions import GRPCError
from grpclib.server import Server, Stream

from helloworld_pb2 import HelloRequest, HelloReply  # generated from helloworld.proto
from helloworld_grpc import GreeterBase


def with_optional_methods(cls: ABCMeta) -> type:
    # Build a concrete subclass in which every still-abstract method
    # is replaced by a handler that responds with UNIMPLEMENTED.
    async def not_implemented(self: Any, stream: Any) -> None:
        raise GRPCError(Status.UNIMPLEMENTED)
    return type(cls.__name__, (cls,), {name: not_implemented
                                       for name in cls.__abstractmethods__})


@with_optional_methods
class Greeter(GreeterBase):

    async def SayHello2(self, stream: Stream[HelloRequest, HelloReply]) -> None:
        #     ^^^^^^^^^ redacted
        request = await stream.recv_message()
        assert request is not None
        message = f'Hello, {request.name}!'
        await stream.send_message(HelloReply(message=message))


async def main(*, host: str = '127.0.0.1', port: int = 50051) -> None:
    server = Server([Greeter()])  # type: ignore[abstract]
    #                             ^^^^^^^^^^^^^^^^^^^^^^^^ added
    await server.start(host, port)
    await server.wait_closed()

It is maybe not very beautiful, but it works at runtime and with mypy. Would this be sufficient in your case?

njooma commented 1 year ago

The upstream library only publishes protobufs, and they subscribe to the philosophy that additive changes are not breaking, since protobufs are supposed to be backwards and forwards compatible. So pinning to concrete versions is not very helpful, especially since I sometimes want to support some new protobufs but not others.

Your example solution works great! Thanks for that.

The reason I suggested generating a dedicated Unimplemented class is that this is how the official gRPC library works (it actually generates only one class, with all methods defaulting to raising NotImplementedError). So your solution works for now, but is there a reason not to generate the additional class?
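For reference, the servicer generated by the official grpcio plugin for the helloworld example is shaped roughly like this (approximate, paraphrasing typical grpc_tools.protoc output):

import grpc


class GreeterServicer(object):
    # A single concrete class; every method defaults to reporting
    # UNIMPLEMENTED instead of being declared abstract.

    def SayHello(self, request, context):
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')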