Open stillcold opened 2 years ago
You could just concatenate the two byte strings to merge:
local a = pb.encode(...)
local b = pb.encode(...)
local result = a .. b
This is wonderful, but not what I mean. I am very sorry that I didn't give you an exact demo.
What I mean is:
local person_data = pb.merge_from("Person", chinese_data)
Like a filter, a merge_from function just ignores the unknown keys and keeps what the proto defines. In this way, it merges data for the Person proto from a packet decoded with the Chinese proto. The two protos may have similar keys. person_data is something like chinese_data: it is also a Lua table, not bytes. After doing that, the content of person_data should be:
{
    name = "ilse",
    age = 18,
    contacts = {
        { name = "alice", phonenumber = 12312341234 },
        { name = "bob", phonenumber = 45645674567 }
    }
}
All keys that are in chinese_data but not defined in the Person proto are gone.
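The filtering described above can be sketched in plain Lua. `filter_by_schema` and `person_schema` here are hypothetical names, and the hand-written schema table is just a stand-in for the real message descriptor, to show the intended semantics:

```lua
-- Keep only the keys a schema defines, recursing into nested message
-- tables and into repeated fields. `schema` maps a field name either
-- to true (scalar) or to a nested schema table.
local function filter_by_schema(schema, data)
    local out = {}
    for key, sub in pairs(schema) do
        local value = data[key]
        if value ~= nil then
            if type(sub) == "table" and type(value) == "table" then
                if #value > 0 then
                    -- repeated field: filter each element
                    local list = {}
                    for i, item in ipairs(value) do
                        list[i] = filter_by_schema(sub, item)
                    end
                    out[key] = list
                else
                    out[key] = filter_by_schema(sub, value)
                end
            else
                out[key] = value
            end
        end
    end
    return out
end

-- hypothetical stand-in for the Person descriptor
local person_schema = {
    name = true,
    age = true,
    contacts = { name = true, phonenumber = true },
}

local chinese_data = {
    name = "ilse", age = 18, hukou = "somewhere",  -- hukou is not in Person
    contacts = { { name = "alice", phonenumber = 12312341234, wechat = "a" } },
}

local person_data = filter_by_schema(person_schema, chinese_data)
-- person_data keeps only name, age and contacts[*].{name, phonenumber}
```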
A merge operation is more like:
-- an alternative implementation, but I think it is inefficient
function pb.merge_from(pb_type, data)
    return pb.decode(pb_type, pb.encode(pb_type, data))
end
In my project I am using a pure-Lua implementation instead, but that is also inefficient. Thanks. :)
> In my project I am using a pure-Lua implementation instead, but that is also inefficient.
We're using this, too, and it works perfectly. It may not be faster to add a new routine to do so.
Because merging tables means traversing every table recursively, walking both the source and the destination message information. That is exactly the most costly part of encode/decode, so a dedicated routine may not help performance.
I'll run some tests. I'll keep you posted.
Thanks.
Hi guys, is there any plan to support a merge_from interface?
The code below is an example.
As far as I can see, I have to use pb.fields to traverse all the fields and try to merge the values. But I don't think that is a good idea; any suggestions about the implementation?
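One possible shape for such a routine, sketched here as a recursive merge driven by two lookup callbacks, so the traversal logic stays separate from the pb API. With lua-protobuf the callbacks could presumably be backed by pb.field / pb.type, but that wiring, the signatures, and the `merge_from` name are all assumptions to be verified:

```lua
-- Merge src into dst, copying only fields that is_defined(msg_type, key)
-- reports as belonging to msg_type. nested_type(msg_type, key) should
-- return the message type name of a sub-message field, or nil for
-- scalars. Both callbacks are hypothetical hooks.
-- Repeated message fields would need an extra per-element branch,
-- omitted here for brevity.
local function merge_from(msg_type, dst, src, is_defined, nested_type)
    for key, value in pairs(src) do
        if is_defined(msg_type, key) then
            local sub = nested_type(msg_type, key)
            if sub and type(value) == "table" then
                dst[key] = dst[key] or {}
                merge_from(sub, dst[key], value, is_defined, nested_type)
            else
                dst[key] = value
            end
        end
    end
    return dst
end
```

Called as `merge_from("Person", {}, chinese_data, ...)` this would produce the filtered table from the earlier example; whether doing this in Lua actually beats the encode/decode round-trip is something only measurement can tell.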