Open Dieterbe opened 8 years ago
The only special feature of UnmarshalMsg and DecodeMsg (from a zero-alloc standpoint) is that they will use pre-existing fields in an object rather than allocating new ones. So, if you decode into the same object repeatedly, things like slices and maps won't be re-allocated on each decode; instead, they will be re-sized appropriately. In other words, mutable fields are simply mutated in-place.
The first thing I'd recommend is using []MetricData instead of []*MetricData, as the extra layer of indirection will have a serious impact on both external fragmentation and the number of objects in your heap. (Also, I'd recommend truncating and re-using that slice for each decoding step.) Remember that GC throughput is more a function of the number of objects (and number of pointers) on the heap than of the total heap size. (map[string]string is also a pretty heavy object from a GC standpoint.)
Also, I notice there are at least a couple fields in that data structure that use strings with only a few known values. You could replace those fields with interned strings so as not to allocate a new one each time it is decoded:
import (
	"fmt"

	"github.com/tinylib/msgp/msgp"
)

type InternedString int

const (
	_ InternedString = iota
	InternedFoo
	InternedBar
	InternedBaz
)

func (i *InternedString) UnmarshalMsg(b []byte) ([]byte, error) {
	s, o, err := msgp.ReadStringZC(b) // s: string bytes, o: remaining bytes
	if err != nil {
		return b, err
	}
	switch msgp.UnsafeString(s) {
	case "foo":
		*i = InternedFoo
	case "bar":
		*i = InternedBar
	case "baz":
		*i = InternedBaz
	default:
		return b, fmt.Errorf("unknown string %s", string(s))
	}
	return o, nil
}
Let me know how that works for you.
Cheers, Phil
Thanks Phil, everything you said makes a lot of sense. By the way, on the topic of GC workload being largely correlated with the number of pointers and strings, I recently brought up a question on the mailing list to see if there's a good way to get better insight into the workload, since things like profiling are so high-level and coarse-grained. (https://groups.google.com/forum/#!topic/golang-nuts/Q0rXKYjy1cg)
The first thing I'd recommend is using []MetricData instead of []*MetricData
Can I do this while retaining compatibility with messages based on the previous code? I would like to try this, but I'm worried about breaking parsing of older messages.
Re your suggested code: I presume I would write the type declaration and define those consts, and then the UnmarshalMsg would be generated automatically like that, right? Or do I have to provide it myself? It looks like this helps with keeping the strings as strings on the wire, but at runtime in my application they would be iotas. Wouldn't it make more sense to keep them as iotas on the wire as well? (Does that happen automatically if my struct's member is declared as an InternedString?)
You would have to provide the interning manually. But yes, if you can afford to make those strings ints on the wire, all the better.
Hi, I have an app where about half of the alloc space is coming from
_, err = out.UnmarshalMsg(b)
(where b is a pre-existing byte slice I want to decode), and it's triggering GC too much. I have read https://github.com/tinylib/msgp/wiki/Getting-Started a couple of times, specifically the section on the Marshaler and Unmarshaler interfaces. The former is pretty clear, but I'm struggling with the latter. It's described as "simply the converse" and talks about how you can use the return value in a clever way, but it doesn't really mention anything about the input slice. Is there anything noteworthy about it, especially in the context of zero allocation? For example, am I supposed to supply an input slice whose len covers the region of data to decode, with additional cap as temporary space for decoding? Or is it OK for cap to equal len because it doesn't really need "temporary" space?
For the record, most of the allocations I'm seeing are due to ReadStringBytes (#145), but I'm also seeing some others, and I wonder if those can be avoided?
(this is while unmarshaling a MetricDataArray, which is defined in https://github.com/raintank/raintank-metric/blob/master/schema/metric.go)
Thanks!