Overview

By modifying the print-runtime-metrics option of the gno test command to analyze memory allocations, I discovered that memory is not fully released and continues to accumulate even after the lifecycle of the objects/variables ends.

This analysis primarily compares the uint64, uint256, and math/big bigint types.
Test Methodology
- Main comparison targets: uint64, uint256, math/big's bigint
- Method: comparison of memory usage for identical operations
- Tool: modified print-runtime-metrics flag (numbers may differ from actual usage)

Modify the Allocator type as follows to see how much memory is allocated per operation when the flag is used.
```go
type Allocator struct {
	maxBytes int64
	bytes    int64
	opAllocs map[string]uint64 // added: allocated bytes, keyed by operation
	mu       sync.Mutex        // added: guards opAllocs
}
```
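For context, here is a minimal sketch of how per-operation accounting could feed that map. The recordOp helper and its call sites are hypothetical, not part of the actual gnovm Allocator; the measurements below simply come from the modified metrics output.

```go
// Hypothetical helper (not in gnovm): accumulate allocated bytes per
// operation label so the metrics output can break totals down.
func (alloc *Allocator) recordOp(op string, size int64) {
	if alloc == nil {
		return // allocation limits (and this accounting) can be disabled
	}
	alloc.mu.Lock()
	defer alloc.mu.Unlock()
	if alloc.opAllocs == nil {
		alloc.opAllocs = make(map[string]uint64)
	}
	alloc.opAllocs[op] += uint64(size)
}
```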
Problem Analysis
1. Memory Leaks in Nested Scopes
Native types (uint64): no issues (about 0.04% memory residue)
```go
func TestAcc10Uint64(t *testing.T) {
	{
		res := 0
		for i := 0; i < 10; i++ {
			res += 1
		}
		println(res) // res: 51.0kb alloc
	} // <- res is deallocated here
	res2 := 0
	for i := 0; i < maxLoop; i++ {
		res2 += 1
	}
	println(res2) // res2: 51.0kb alloc
} // total 52.9kb alloc (+1.9kb)
```
uint256, bigint: Significant memory residue after inner scope termination
```go
func TestAcc10Uint256(t *testing.T) {
	{
		res := Zero()
		for i := 0; i < 10; i++ {
			res.Add(res, One())
		}
		println(res.ToString()) // res: 124.2kb alloc
	} // <- res should go out of scope here
	res2 := Zero()
	for i := 0; i < maxLoop; i++ {
		res2.Add(res2, One())
	}
	println(res2.ToString()) // res2: 123.5kb alloc
} // total 197.9kb alloc (+74.4kb)
```
```go
func TestAcc10BigInt(t *testing.T) {
	{
		res := big.NewInt(0)
		for i := 0; i < maxLoop; i++ {
			res.Add(res, big.NewInt(1))
		}
		println(res.String()) // res: 61.8kb alloc
	} // <- res should go out of scope here
	res2 := big.NewInt(0)
	for i := 0; i < maxLoop; i++ {
		res2.Add(res2, big.NewInt(1))
	}
	println(res2.String()) // res2: 61.3kb alloc
} // total 73.6kb alloc (+12.3kb)
```
2. Memory Accumulations in Loops
For uint256, heap allocations grow by roughly 1.4 kb per 10 iterations on average:

| Iterations | Allocation (kb) |
| --- | --- |
| 10 | 1.7 |
| 20 | 3.1 |
| 30 | 4.5 |
| 40 | 6.0 |
| 50 | 7.4 |
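The per-iteration growth was measured by varying the loop bound between runs. The exact harness is not shown in this issue, so the following is only a sketch of the shape of that loop, reusing the uint256 helpers (Zero, One, Add) from the tests above:

```go
// Illustrative only: n was varied across runs (10, 20, ..., 50) and the
// allocator totals were read from the modified print-runtime-metrics output.
func accumulateUint256(n int) string {
	res := Zero()
	for i := 0; i < n; i++ {
		// Each iteration allocates at least a fresh One() value, and those
		// temporaries are never reclaimed without GC.
		res.Add(res, One())
	}
	return res.ToString()
}
```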
Memory Usage Comparison by Type (unit: kb)

| Iterations | uint64 | uint256 | bigint |
| --- | --- | --- | --- |
| 0 [^1] | 50.5 | 56.9 | 51.4 |
| 10 | 52.9 | 198.7 | 62.8 |
| 20 | 52.9 | 324.6 | 72.2 |
| 30 | 52.9 | 450.5 | 81.6 |
| 40 | 52.9 | 576.4 | 90.9 |
| 50 | 52.9 | 702.3 | 100.3 |
As the table shows, each additional 10 iterations adds roughly 125.9 kb for uint256 and about 9.4 kb for bigint, while uint64 stays flat at 52.9 kb. The main causes of the memory growth in loops are estimated to be as follows:

- Object creation and absence of GC: uint256 and bigint operations likely create new objects to store the result of each operation.
- Accumulation of temporary objects: temporary objects created in each iteration accumulate in heap memory without being released immediately; in an environment without GC, they are never cleaned up.
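To make the temporary-object point concrete, the constant operand in the loops above is re-created on every iteration. The sketch below shows one mitigation, hoisting it out of the loop, under the assumption that One() returns a newly allocated value on each call; the test name is hypothetical, and this only reduces, not eliminates, the residue, since VM-level allocations made while executing the loop body are still not released.

```go
func TestHoistedConstant(t *testing.T) {
	// Before: a new One() temporary per iteration, never reclaimed without GC.
	res := Zero()
	for i := 0; i < maxLoop; i++ {
		res.Add(res, One())
	}
	println(res.ToString())

	// After: allocate the constant once and reuse it across iterations.
	one := One()
	res2 := Zero()
	for i := 0; i < maxLoop; i++ {
		res2.Add(res2, one)
	}
	println(res2.ToString())
}
```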
3. Memory Management Characteristics by Type
- uint64 (native type): stable memory usage
- uint256: rapid memory increase due to object creation
  - Reason: fixed-size 256-bit allocation, frequent creation of new objects, and new object declarations for pointer-operation safety
- bigint: gradual memory increase due to dynamic allocation
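For reference, here is a rough sketch of the representations assumed in this comparison; the uint256 layout follows the common fixed-array design used by holiman/uint256-style implementations, and the bigint layout mirrors Go's math/big. Field names are illustrative, not copied from either package:

```go
// Fixed-width 256-bit integer: always four 64-bit words regardless of the
// value's magnitude, and constructors such as Zero()/One() hand back a
// freshly allocated value on each call.
type Uint struct {
	arr [4]uint64
}

// math/big-style integer: the backing slice grows with the magnitude, so
// small values allocate little, but growth triggers reallocation.
type Int struct {
	neg bool
	abs []uint64 // math/big uses a nat ([]Word) here
}
```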
I also checked for similar behaviour in the standard library, such as the strings package, which exhibits the same issue.
```go
func TestAccumulateStrings(t *testing.T) {
	{
		var builder strings.Builder
		for i := 0; i < maxLoop; i++ {
			builder.WriteString("Hello")
		}
		result := builder.String()
		println(len(result))
	} // 103.2kb
	var builder2 strings.Builder
	for i := 0; i < maxLoop; i++ {
		builder2.WriteString("World")
	}
	result2 := builder2.String()
	println(len(result2)) // 102.7kb
} // total 156.4kb
```
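Assuming Gno's strings.Builder mirrors Go's (an internal byte buffer that is reallocated as it fills up), pre-sizing the buffer is one way to cut down on the intermediate buffers that would otherwise be left behind. Whether Grow is exposed by the Gno stdlib port is an assumption here, and this only reduces, not fixes, the residue:

```go
func TestAccumulateStringsPrealloc(t *testing.T) {
	var builder strings.Builder
	// Reserve the final size up front so WriteString does not repeatedly
	// reallocate the internal buffer; without GC, each superseded buffer
	// would otherwise remain accounted for by the allocator.
	builder.Grow(len("Hello") * maxLoop)
	for i := 0; i < maxLoop; i++ {
		builder.WriteString("Hello")
	}
	println(len(builder.String()))
}
```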
Conclusion
Memory leaks occur when using uint256, bigint, and standard library objects such as strings.Builder in an environment without GC or another memory management system.
Looking at the ownership.go file, a reference counting approach appears to be used to manage objects, but it seems to have limitations.
This can lead to performance degradation and increased gas costs, so appropriate memory management strategies need to be added; we might consider RAII, or GC as suggested previously.
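As a generic illustration of one classic limitation of plain reference counting (this is not GnoVM code, and the types and counters are made up, not taken from ownership.go): reference cycles keep counts above zero even when nothing outside the cycle refers to the objects, so they are never released.

```go
package main

// node is a made-up reference-counted object for illustration only.
type node struct {
	refCount int
	next     *node
}

func retain(n *node) { n.refCount++ }

func release(n *node) {
	n.refCount--
	if n.refCount == 0 {
		// the object would be freed / deducted from the allocator here
		if n.next != nil {
			release(n.next)
		}
	}
}

func main() {
	a := &node{refCount: 1}
	b := &node{refCount: 1}
	a.next, b.next = b, a // cycle: a <-> b
	retain(b)             // a.next now holds b
	retain(a)             // b.next now holds a
	release(a)            // drop the external reference to a
	release(b)            // drop the external reference to b
	// Both counts are still 1 because of the cycle, so neither node is
	// ever released: a leak that pure reference counting cannot reclaim.
	println(a.refCount, b.refCount) // 1 1
}
```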
[^1]: State after object creation only.