Open drowzy opened 1 year ago
How about this shell script? Put it in the `PATH` as `protoc-gen-elixir` and make sure you run `protoc` from within your project.
```sh
#!/bin/sh -e
(
  mix deps.compile
) 1>&2 </dev/null
exec mix eval --no-compile 'Application.ensure_all_started(:protobuf); Protobuf.Protoc.CLI.main(System.argv())' "$@"
```
All you need for this to work is to have done `mix deps.get`, which is a pretty standard requirement anyways.
For your specific use-case, you might need to throw in a transformer module. If you bundle that in `elixir-grpc`, it'll get compiled as a dependency and should be available.
Apologies if this comes across as kind of severe, but... it really goes against the grain of the protobuf ecosystem to wrap `protoc`. It's very intentionally the standard interface. The entire point is to give a single point to build anything, for any language, in a single run, with any number of other plugins. The plugin architecture is intended to give a single place where language concerns can be contained, and to do it using the most flexible and standard interface there is: running a subprocess.
Consider the examples for all of the supported languages. None of them have any preprocessing or wrapping like this to provide extensions to code generation so far as I can tell. The process is pretty much identical everywhere:
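As a sketch (file names and paths here are made up for illustration), the per-language invocations all share the same shape:

```sh
# Sketch with made-up file names: the standard flow, identical across
# languages -- only the plugin flag changes.
#
#   protoc --elixir_out=./lib helloworld.proto   # Elixir
#   protoc --go_out=./gen helloworld.proto       # Go
#
# For outputs protoc doesn't generate built-in, --<name>_out resolves to an
# executable named protoc-gen-<name> found on $PATH:
plugin_for() {
  name="${1#--}"         # strip the leading "--"
  name="${name%%_out*}"  # keep the part before "_out"
  echo "protoc-gen-$name"
}

plugin_for --elixir_out=./lib   # prints: protoc-gen-elixir
```

That naming convention is exactly why the wrapper script above works: `protoc` only needs to find an executable called `protoc-gen-elixir` on the `PATH`.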
About the only thing different between any of them is the output location. In those cases it's just to conform to the strict directory structure requirements the languages have. And in every case, consuming options for the custom extension is similarly consistent. Everyone who has already used protocol buffers in other languages is just going to wonder why Elixir is making things weird.
And this isn't some kind of arbitrary rule for abstract gain: there are very real benefits to doing it that way. Check out the Buf project. Note their plugins `protoc-gen-buf-breaking` and `protoc-gen-buf-lint`. If we wrap things in a `Mix` task as proposed, we have to provide a trap-door to ensure that everybody using every other plugin can pass their configuration through. And you can bet that it'll be fragile if we depend on any configuration values we pass in.
It seems the core problem here is distributing it as an escript. Let's attack that directly.
Given the assumption that you'll be running `protoc` from within your project directory, that shell script runs the code from within the project. It's available as a dependency, just like the `Mix` task would be. And, if we wanted to support running from an arbitrary location, we could easily allow it to be set with an environment variable. Or maybe do some fancy args parsing to use the output location (which presumably would be within the `Mix` project you're building for).
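A minimal sketch of the environment-variable idea (`PROTOBUF_PROJECT_DIR` is a hypothetical name, not an existing convention):

```sh
# Sketch: pick the Mix project root from a hypothetical environment
# variable, falling back to the current directory.
project_dir="${PROTOBUF_PROJECT_DIR:-$PWD}"
echo "using project dir: $project_dir"
# The wrapper would then, before invoking mix:
#   cd "$project_dir"
#   exec mix eval --no-compile '...' "$@"
```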
There's a bit of a chicken-and-egg thing if you need something from your app itself (like, say, a transformer module). For the people that really want that, I chose above to at least compile dependencies. That way, in a pinch, you can put your module in a dependency package and call it a day.
For context: https://github.com/elixir-grpc/grpc/issues/274
I'm working on HTTP/JSON transcoding for gRPC and basically want to populate `MethodDescriptors` options with an extension during compilation. The recommended approach is to install the escript globally and use the executable from `protoc`. It's hard to see a way to provide extensions that need to be loaded during compilation in order to populate `__pb_extensions__` correctly without creating a new `protoc` plugin.

Having a complementary mix task that calls `protoc` using `descriptor_set_out` would allow compilation to be executed in the context of the local project instead of through a global executable. This is the approach taken in protox. I would think that most projects already contain some script which calls `protoc` to compile their `.proto` files. This would remove this indirection and at the same time enable:

- Extensions in the current project would be picked up automatically by `Protobuf.load_extensions()` (or could be provided as an argument to `protobuf.generate`).
- No potential version difference between `protobuf` installed in the project and the escript installed globally.
- Easier integration into the codegen. Generators could be modules provided to `protobuf` instead of requiring that they are defined in `protobuf`.

## Proposal
Which would generate a `protoc` call like:

This would output a `FileDescriptorSet` into a temporary file. Combined with the parameters provided through `protobuf.generate`, it's enough to create a `CodeGeneratorRequest` and perform the same logic as is done in `Protobuf.Protoc.CLI`. The resulting `CodeGeneratorResponse` is used to write files to disk.

What do you think? If you think it sounds reasonable I can send a PR :).
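The `protoc` call the proposal describes, dumping a `FileDescriptorSet` to a temporary file, might look roughly like this sketch. The paths, file names, and exact flag set are assumptions; the real flags would come from the parameters given to `protobuf.generate`. The command is built as a string here just to show the shape of the call:

```sh
# Sketch only: made-up paths; --include_imports and --descriptor_set_out
# are standard protoc flags for emitting a serialized FileDescriptorSet.
cmd="protoc --include_imports --descriptor_set_out=/tmp/descriptor_set.bin -I priv/protos priv/protos/service.proto"
echo "$cmd"
```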