Yep, that is exactly the kind of problem that protoreflect was built to solve.
A good example app that does something a lot like that is grpcurl: https://github.com/fullstorydev/grpcurl.
It must do that to speak RPC at all, since it is a dumb tool that doesn't know about any gRPC services other than the descriptors you give it. It then uses service and method descriptors to issue RPCs and -- like the problem you are trying to tackle -- it uses message descriptors and dynamic messages to build requests and parse responses.
You can even use grpcurl as a Go library, which may help your case: if your command-line tool supports -proto and -import_path flags, you can provide those values to grpcurl.DescriptorSourceFromProtoFiles (doc). The resulting "descriptor source" makes it easy to then look up descriptors by name.
With a message descriptor, you can then call dynamic.NewMessage, and the resulting message has a method for parsing the protobuf-binary-encoded bytes.
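To make that concrete, here is a minimal sketch of that flow under my assumptions; the package name, the decodePayload helper, and its parameters are placeholders I made up, not part of either library's API:

```go
package protodecode

import (
	"fmt"

	"github.com/fullstorydev/grpcurl"
	"github.com/jhump/protoreflect/desc"
	"github.com/jhump/protoreflect/dynamic"
)

// decodePayload is a hypothetical helper: it builds a descriptor source from
// -proto / -import_path style inputs, looks up a message by its
// fully-qualified name, and parses protobuf-binary bytes into a dynamic message.
func decodePayload(importPaths, protoFiles []string, messageName string, payload []byte) (*dynamic.Message, error) {
	// Parse the .proto files into a descriptor source.
	src, err := grpcurl.DescriptorSourceFromProtoFiles(importPaths, protoFiles...)
	if err != nil {
		return nil, err
	}
	// Resolve the message descriptor by its fully-qualified name.
	d, err := src.FindSymbol(messageName)
	if err != nil {
		return nil, err
	}
	md, ok := d.(*desc.MessageDescriptor)
	if !ok {
		return nil, fmt.Errorf("%q is not a message", messageName)
	}
	// Decode the binary payload into a dynamic message.
	msg := dynamic.NewMessage(md)
	if err := msg.Unmarshal(payload); err != nil {
		return nil, err
	}
	return msg, nil
}
```

From there you can inspect the fields generically or marshal the message into a human-readable form.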
BTW, why not use an Any for this particular situation? Libraries can better support Any than just a bytes field with an encoded proto, since the Any also includes the name of the encoded message, allowing generic tools to understand the schema and possibly help you decode it.
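As an aside, here is a sketch of how a generic tool could exploit that: the Any's type URL ends with the fully-qualified message name, which can be resolved against the same kind of descriptor source as in the sketch above (decodeAny and the reuse of grpcurl.DescriptorSource are my assumptions):

```go
package protodecode

import (
	"fmt"
	"strings"

	"github.com/fullstorydev/grpcurl"
	"github.com/jhump/protoreflect/desc"
	"github.com/jhump/protoreflect/dynamic"
	"google.golang.org/protobuf/types/known/anypb"
)

// decodeAny is a hypothetical helper: an Any's TypeUrl looks like
// "type.googleapis.com/pkg.Message", so the part after the last '/' names the
// encoded message and can be looked up like any other symbol.
func decodeAny(src grpcurl.DescriptorSource, a *anypb.Any) (*dynamic.Message, error) {
	name := a.TypeUrl[strings.LastIndex(a.TypeUrl, "/")+1:]
	d, err := src.FindSymbol(name)
	if err != nil {
		return nil, err
	}
	md, ok := d.(*desc.MessageDescriptor)
	if !ok {
		return nil, fmt.Errorf("%q is not a message", name)
	}
	// The Any's Value field holds the protobuf-binary encoding of that message.
	msg := dynamic.NewMessage(md)
	if err := msg.Unmarshal(a.Value); err != nil {
		return nil, err
	}
	return msg, nil
}
```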
Also, you actually do not need to use protoreflect for this sort of chore.
The Go implementation of Protocol Buffers recently released a "v2" of its runtime library, and it includes all of the support you need for descriptors, protobuf reflection, and dynamic usage: https://blog.golang.org/protobuf-apiv2
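For reference, a hedged sketch of that flow on the v2 runtime; because v2 does not parse .proto source itself, this assumes the descriptors arrive as a serialized FileDescriptorSet (for example produced by protoc --descriptor_set_out), and decodeWithV2 and its parameters are placeholders:

```go
package protodecode

import (
	"fmt"

	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/reflect/protodesc"
	"google.golang.org/protobuf/reflect/protoreflect"
	"google.golang.org/protobuf/types/descriptorpb"
	"google.golang.org/protobuf/types/dynamicpb"
)

// decodeWithV2 is a hypothetical helper: it loads a serialized
// FileDescriptorSet, resolves a message descriptor by name, and unmarshals the
// payload into a dynamicpb message using only the API v2 runtime.
func decodeWithV2(fdsBytes, payload []byte, messageName string) (*dynamicpb.Message, error) {
	// Parse the serialized FileDescriptorSet.
	var fdSet descriptorpb.FileDescriptorSet
	if err := proto.Unmarshal(fdsBytes, &fdSet); err != nil {
		return nil, err
	}
	// Build a registry that can resolve descriptors by full name.
	files, err := protodesc.NewFiles(&fdSet)
	if err != nil {
		return nil, err
	}
	d, err := files.FindDescriptorByName(protoreflect.FullName(messageName))
	if err != nil {
		return nil, err
	}
	md, ok := d.(protoreflect.MessageDescriptor)
	if !ok {
		return nil, fmt.Errorf("%q is not a message", messageName)
	}
	// Decode the payload into a dynamic message.
	msg := dynamicpb.NewMessage(md)
	if err := proto.Unmarshal(payload, msg); err != nil {
		return nil, err
	}
	return msg, nil
}
```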
Thanks @jhump! As newcomers to the reflection side of protos, it was not quite clear to us that this is possible, since the proto/gRPC examples stop at the code-generation technique.
It was after your GopherCon talk on grpcurl that we had the idea to take one more look at this problem, and we were finally able to craft a likely suboptimal solution that at least produces a result, thanks to jhump/protoreflect.
We also tried the proto API v2, but didn't get any positive results. Now that we have a working prototype built with jhump/protoreflect, we might later try to see whether the proto v2 API can be used as well.
We are not using proto.Any because we do not control the targets: one target uses proto.Any, another proto_bytes, so technically we need to support both.
I will close this issue, as the question has been answered, and we will start to dig into jhump/protoreflect more to get to an optimal solution for our gnmic CLI client.
FWIW, the new API v2 for protobuf provides everything you need except for the ability to parse descriptors from source. And since I haven't yet managed to land #354, interop between this repo's protoparse package and the API v2 stuff is not perfect. So it may be best to stick with this protoreflect repo until I can make some more changes that will make it easier to use both together.
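In case it helps, a short sketch of driving the protoparse package directly, without the grpcurl wrapper; findMessageDescriptor and its parameters are placeholders, and the dynamic.NewMessage / Unmarshal steps from the earlier sketch apply unchanged afterwards:

```go
package protodecode

import (
	"fmt"

	"github.com/jhump/protoreflect/desc"
	"github.com/jhump/protoreflect/desc/protoparse"
)

// findMessageDescriptor is a hypothetical helper: it parses .proto files
// supplied at runtime and looks up a message descriptor by its
// fully-qualified name.
func findMessageDescriptor(importPaths, protoFiles []string, messageName string) (*desc.MessageDescriptor, error) {
	p := protoparse.Parser{ImportPaths: importPaths}
	fds, err := p.ParseFiles(protoFiles...)
	if err != nil {
		return nil, err
	}
	for _, fd := range fds {
		if md := fd.FindMessage(messageName); md != nil {
			return md, nil
		}
	}
	return nil, fmt.Errorf("message %q not found in the parsed files", messageName)
}
```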
Hello @jhump, thank you for the protoreflect package and the information you have shared with the community on proto reflection in general.
We are working on a CLI client that needs to talk to arbitrary gRPC targets over a standardised gNMI interface. Reading through the package docs it was not clear to us whether we can successfully use protoreflect to solve a particular use case; maybe you will be able to tell us if it is possible at all?
The targets that our tool talks to can return data in proto encoding. Consider the example below, where the message field proto_bytes contains the proto-encoded message.
Given the generic applicability of our client, we cannot rely on code generation and embed the result in the tool's binary, as the proto files are only known to the user at runtime.
We would like to know whether we can leverage the protoreflect package to decode the proto-encoded messages using the .proto files supplied to the tool at runtime. In other words, does the following workflow seem realistic?
Maybe there is an example of an app that does something similar? Or maybe it is not feasible at all? Any pointers or comments are appreciated.