protocolbuffers / protobuf-javascript


Need to better document that toObject() is *not* proto3 JSON #95

Open markjen opened 8 years ago

markjen commented 8 years ago

The proto3 JSON mapping specifies special output for some well-known types (e.g. Timestamp, Duration). However, the generated _pb.js files for these types treat them as regular protobufs.

Repro code:

var tspb = require('google-protobuf/google/protobuf/timestamp_pb.js');
var t = new tspb.Timestamp();
console.log(t.toObject());

Expected output (per proto3 JSON): 1970-01-01T00:00:00Z

Observed output: { seconds: 0, nanos: 0 }
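
For anyone needing the proto3 JSON form of a Timestamp today, a minimal workaround sketch: the timestampToRfc3339 helper below is illustrative, not part of google-protobuf, and relies only on the generated getSeconds()/getNanos() accessors.

var tspb = require('google-protobuf/google/protobuf/timestamp_pb.js');

// Illustrative helper, not a library API: build an RFC 3339 string from the
// generated field accessors.
function timestampToRfc3339(ts) {
  var millis = ts.getSeconds() * 1000 + Math.round(ts.getNanos() / 1e6);
  // toISOString() yields e.g. "1970-01-01T00:00:00.000Z", which is close to,
  // but not byte-identical with, the canonical proto3 JSON form.
  return new Date(millis).toISOString();
}

var t = new tspb.Timestamp();
console.log(timestampToRfc3339(t)); // "1970-01-01T00:00:00.000Z"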

xfxyjwf commented 8 years ago

This is a known issue. The JSON support for well-known types hasn't been implemented in javascript yet.

haberman commented 8 years ago

Yes, to be clear, toObject() is not proto3 JSON, it is a pre-existing JSON mapping that does not conform to proto3 JSON.

haberman commented 8 years ago

I suppose the only remaining issue here is that we need to better document that toObject() is not proto3 JSON.

pradyuman commented 8 years ago

Is there a toObject function that does conform to proto3 JSON? Currently, enums are not being translated to their string names, and that's making my development a real pain.
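
A hedged workaround sketch for the enum case: generated enums in google-protobuf are plain { NAME: number } objects, so a reverse lookup recovers the string name. proto.mypackage.Color and getColor() below are hypothetical.

// Sketch only: map an enum number back to its name.
// proto.mypackage.Color is a hypothetical generated enum, e.g. { RED: 0, GREEN: 1 }.
function enumName(enumObject, value) {
  var keys = Object.keys(enumObject);
  for (var i = 0; i < keys.length; i++) {
    if (enumObject[keys[i]] === value) {
      return keys[i];
    }
  }
  return undefined;
}

// Hypothetical usage with a message exposing getColor():
// enumName(proto.mypackage.Color, msg.getColor()); // "GREEN"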

haberman commented 8 years ago

No, proto3 JSON is currently not supported with JavaScript.

Currently there is not a lot of demand for this inside Google. And whenever we add a feature we have to make sure it doesn't increase code size for people who are not using that feature.

If there was a lot of demand for this from open-source contributors, we could consider how this could be added without causing code size problems.

pradyuman commented 8 years ago

So what's a good way to get around this limitation right now? I was thinking about sending protos over the wire to my React website, but that means I need to bundle the generated protos within my overall JavaScript bundle, which increases the size of the overall payload in a way normal JSON would not.

I feel like if proto3 is going to be the next major evolution of protobuf, a proper toObject function will eventually need to be created. One of the features of proto3 that was supposed to be a big deal was interoperability with JSON. Without proper JSON encoding/decoding functions, it's incredibly hard to use gRPC/proto3 without a lot of work to patch over these kinds of limitations.

oseiskar commented 7 years ago

I'm also appalled at this missing feature. The documentation makes you believe in language-independent support for serialization in both binary protobuf and JSON. Don't like protobuf & gRPC in your API? No problem, proto3 has a built-in JSON mapping providing an effortless conversion to "traditional" REST/JSON; protobuf would just define a schema and provide some validation.

But this does not really work in JavaScript since the JSON will be different from what you would get from a Java or Python back-end.

This would be understandable if it only affected some hacky types like Any, but all you need is an enum (AN_OPTION vs 1) or a Timestamp (ISO-8601 vs { seconds: 123, nanos: 456 }) and you're in trouble. I think this level of missing functionality would need a big "not supported in all programming languages" warning right in the main documentation chapter on JSON support, not a nasty surprise hidden in the language-specific docs.

zefyrr commented 7 years ago

Agree with @oseiskar that documentation needs to be updated. How can we push for support of proto3 json in javascript? Code-size issues surely can be solved via 'require', no?

chadbr commented 6 years ago

> Currently there is not a lot of demand for this inside Google. And whenever we add a feature we have to make sure it doesn't increase code size for people who are not using that feature.
>
> If there was a lot of demand for this from open-source contributors, we could consider how this could be added without causing code size problems.

@haberman for what it's worth...

I've discussed using protobuf instead of json over http with countless people. The conclusion is usually 'Google does not seem interested'.

I think there is a bit of a double-edged sword here -- the community doesn't show interest because Google doesn't show interest.

I, for one, have been looking for 1st class browser support for protobuf for several years.

Thanks, Chad

ghost commented 6 years ago

@haberman any update on this?

pjo336 commented 6 years ago

Plz?

mitar commented 6 years ago

I made a workaround, see it here.

ghost commented 5 years ago

We are not able to support JSON format correctly due to lack of descriptor support in JavaScript (#4309) :(

oseiskar commented 5 years ago

@haon4 Nonsense. A function like (protobufSpecString, protobufBinaryBlob) -> jsonString and its inverse (protobufSpecString, jsonString) -> protobufBinaryBlob can be implemented in any language. It may need a bigger refactoring of this library than is convenient, but the possibility of doing it does not depend on any language-specific features, especially if you do not use types like Any. How about beginning by supporting things like Timestamp?

ghost commented 5 years ago

I am not talking about language-specific features in JavaScript. In the other language implementations, we have encoded the Descriptor in the protobuf implementation. However, Descriptors can get very big, as they contain the names and options for all your fields, messages, etc., so we initially didn't include them for JavaScript, since they can significantly impact your JS code size. In order to support the JSON format properly, the field names and their types, etc., need to be present, so we need to figure out some way to include Descriptors in JavaScript without significantly hurting code size. Our team is evaluating this, but it is not an easy problem to solve, unfortunately :(

doom-goober commented 4 years ago

For anyone desperately looking for a solution, protobufjs (https://www.npmjs.com/package/protobufjs) "supports" protobuf from JavaScript and can generate proto3 JSON. However, be warned that protobufjs takes pretty much the opposite design philosophy from google-protobuf. That is, protobufjs generates almost no custom code or classes and is almost entirely Descriptor-driven. Having worked with both, I much prefer google-protobuf's design philosophy, but since they refuse to support JSON, if you need JSON you may have to look elsewhere. Additionally, I should add that protobufjs is VERY popular, with more than twice the npm downloads of google-protobuf, so protobufjs is clearly doing something right (even if it's sometimes clunky to work with).

I suppose you could also mix protobufjs and google-protobuf, since protobufjs can provide the descriptors (their descriptor model is super lightweight... it's just data). In fact, you could take only the data output by protobufjs' generator executable without using protobufjs in your product at all, as @oseiskar suggests.
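
For concreteness, here is a rough sketch of the schema-driven conversion @oseiskar describes above ((protoSource, binaryBlob) -> jsonString and back), using protobufjs reflection. The .proto source and mypackage.Result are placeholders, and the output only approximates canonical proto3 JSON.

const protobuf = require('protobufjs');

// Placeholder schema; in practice you would load your own .proto files.
const root = protobuf.parse(`
  syntax = "proto3";
  package mypackage;
  enum Color { RED = 0; GREEN = 1; }
  message Result { Color color = 1; string name = 2; }
`).root;

const Result = root.lookupType('mypackage.Result');

function binaryToJson(buffer) {
  const msg = Result.decode(buffer);
  // These options approximate, but do not guarantee, proto3 JSON output.
  const obj = Result.toObject(msg, { enums: String, longs: String, bytes: String });
  return JSON.stringify(obj);
}

function jsonToBinary(jsonString) {
  const msg = Result.fromObject(JSON.parse(jsonString));
  return Result.encode(msg).finish(); // Uint8Array of wire-format bytes
}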

isherman commented 4 years ago

@doom-goober Are you sure that's the case? There are a number of open issues (e.g. protocolbuffers/protobuf#1304) in the protobufjs repository about the lack of support for the canonical proto3 <-> JSON mapping. Would love to know if you're experiencing otherwise.

doom-goober commented 4 years ago

Sorry, I wrote that post a long time ago, before I had much experience with protobufjs. protobufjs has two systems: a reflection-based system and a JavaScript object system. You can get the JavaScript-object-to-JSON output really close to proto3 JSON, and supposedly, if you use the correct flags, you can get it exact. However, I used protobufjs' reflection system in conjunction with its object-to-JSON conversion to generate proto3 JSON the way I want (it was also faster than converting to objects before formatting).

TL;DR: yes, you are right, protobufjs doesn't quite support proto3 JSON out of the box. But because it supports reflection, it is possible to write your own proto3 JSON conversion.
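
For concreteness, these are presumably the kind of conversion flags being referred to (a sketch only; `type` stands for any protobufjs reflection Type obtained via root.lookupType, and which flag combination best matches canonical proto3 JSON depends on the behavior you need, e.g. whether default-valued fields should be emitted):

// Sketch: protobufjs toObject options that bring the output close to proto3 JSON.
function toProto3StyleJson(type, message) {
  const obj = type.toObject(message, {
    enums: String,   // emit enum names ("GREEN") instead of numbers (1)
    longs: String,   // emit 64-bit fields as decimal strings
    bytes: String,   // emit bytes fields as base64 strings
    defaults: true,  // include fields set to their default value
    oneofs: true     // add a virtual property naming the set oneof member
  });
  return JSON.stringify(obj);
}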


cyberco commented 4 years ago

This is a rather unfortunate omission, if you ask me, making it really hard to use protobuf in a browser/JavaScript environment. As I have mentioned on StackOverflow as well, I've been struggling to get my not-so-unusual use case working:

====

I've been struggling with passing a JSON with Message values back and forth between systems. I got a bit further, but I'm still not there. A Struct seems to be the way to go, but even though the struct I'm sending looks perfectly fine, it is empty once received by the server.

Result is passed from a web browser (grpc-web) to a Python backend. The Python backend should serialize Result to JSON (to store it) and back again.

// proto
message Result {
    google.protobuf.Struct variables = 1;
}

// obj - Where variables would contain a (1 level deep) JSON with different types of values, e.g.:
{
    "key1": 1,
    "key2": true,
    "key3": proto_msg_a //instance of proto.MessageA
}

// code
struct = new proto.google.protobuf.Struct(obj);
req = new Request;
req.variables = struct;

Checking req.variables before sending shows that it's indeed a Struct with all the correct fields in it. But once the other end (the server) receives it, req.variables is an empty Struct. For testing purposes I tried an obj that is simply {'key': 'value'}, but the result was the same.

So then I tried proto.google.protobuf.Struct.fromJavaScript:

// code
struct = proto.google.protobuf.Struct.fromJavaScript(vars);
req = new Request;
req.variables = struct;

This works for a simple obj (e.g. {"key": "val"}), but for an obj with a proto message field (such as above) it resulted in:

struct_pb.js:875 Uncaught Error: Unexpected struct type.
    at Function.proto.google.protobuf.Value.fromJavaScript (struct_pb.js:875)
    at Function.proto.google.protobuf.Struct.fromJavaScript (struct_pb.js:941)
    at Function.proto.google.protobuf.Value.fromJavaScript (struct_pb.js:871)
    at Function.proto.google.protobuf.Struct.fromJavaScript (struct_pb.js:941)
    at Function.proto.google.protobuf.Value.fromJavaScript (struct_pb.js:871)
    at Function.proto.google.protobuf.Struct.fromJavaScript (struct_pb.js:941)
    at Function.proto.google.protobuf.Value.fromJavaScript (struct_pb.js:871)
    at Function.proto.google.protobuf.Struct.fromJavaScript (struct_pb.js:941)

Or can I, instead of going through all the trouble with protobuf/JSON in JavaScript, just use a map?

// proto
message Request {
    map<string, ?type?> variables = 1;
}

But what would ?type? then be if the values can be anything (proto.MessageX, string, boolean, etc.)? One advantage is that the receiving end knows all the proto definitions it might receive.

I would really like to use proto messages in variables. The reason for picking protobuf/gRPC was exactly this: being able to use the same types throughout our complete platform, but this issue seems to be blocking that goal. Did I miss something? What would you do?
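
Not an answer from the thread, but a hedged sketch of what may be going wrong, under two assumptions: google-protobuf generated messages are populated through setters (so assigning req.variables directly does not set the field that gets serialized), and google.protobuf.Value only accepts JSON-like values (null, number, string, boolean, Struct, ListValue), which would explain the "Unexpected struct type." error when a message instance like proto_msg_a is passed in. One possible workaround is to serialize the nested message and store it as a base64 string:

// Sketch only; Request and proto_msg_a are the placeholders from the question above.
var structpb = require('google-protobuf/google/protobuf/struct_pb.js');

var vars = {
  key1: 1,
  key2: true,
  // Struct values cannot hold message instances, so store serialized bytes as base64.
  // (Buffer is Node.js; in a browser, use another base64 helper.)
  key3: Buffer.from(proto_msg_a.serializeBinary()).toString('base64')
};

var req = new Request();
req.setVariables(structpb.Struct.fromJavaScript(vars)); // setter, not property assignment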

nickwinger commented 1 year ago

Still open?