Current issue: the JSON serialiser is using three times more memory than XML serialisation. Currently the data is transformed into an array structure first, and then passed to json_encode - so basically we need to hold the same data in three places (the original object, the array structure, and the JSON string).
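To make the "three copies" point concrete, here is a minimal sketch of the naive flow (class and function names are made up for illustration, not taken from the actual code):

```php
<?php
// Naive flow: the whole object graph is normalized into a nested
// array first, and only then handed to json_encode(), so the object
// graph, the array structure, and the JSON string coexist in memory.

class Product
{
    public function __construct(
        public string $name,
        public float $price,
    ) {}
}

// Step 1: build the full array structure (second copy of the data).
function normalize(array $products): array
{
    $rows = [];
    foreach ($products as $p) {
        $rows[] = ['name' => $p->name, 'price' => $p->price];
    }
    return $rows;
}

$products = [new Product('book', 9.99), new Product('pen', 1.50)];
$asArrays = normalize($products);   // copy #2: the array structure
$json     = json_encode($asArrays); // copy #3: the JSON string

echo $json, PHP_EOL;
```

All three copies stay reachable until the end of the request, which is where the extra memory goes.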
Possible fix:
We can try to limit the amount of data held in the array at any one time by using JsonSerializable, which provides a nice way to prepare the data in small batches. Possible cut points are objects or arrays - I used arrays, as that was simpler for the PoC.
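The batching idea can be sketched roughly like this (the class name is hypothetical, not the actual PoC code):

```php
<?php
// Rough sketch of the proposed approach. Instead of building the
// whole array structure up front, each item implements
// JsonSerializable, so json_encode() asks for its array form only
// when it reaches that node, and the temporary array can be
// garbage-collected once it has been encoded.

class LazyItem implements JsonSerializable
{
    public function __construct(private object $object) {}

    public function jsonSerialize(): array
    {
        // Normalization happens here, one item at a time.
        return ['name' => $this->object->name, 'price' => $this->object->price];
    }
}

$source = [
    (object) ['name' => 'book', 'price' => 9.99],
    (object) ['name' => 'pen',  'price' => 1.5],
];

// Only lightweight wrappers are created ahead of time.
$wrapped = array_map(fn ($o) => new LazyItem($o), $source);

echo json_encode($wrapped), PHP_EOL;
```

The real memory win presumably comes from placing such cut points at every array level, so that only one small batch of normalized data exists at a time instead of the full tree.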
Results: memory usage went down from 45 MB to 13 MB, and execution time was not affected.
Issues that need to be solved:
[ ] exclusions based on the depth / path - we are losing the metadata stack
[ ] exclusion of empty arrays
[ ] circular references (currently a segmentation fault)
[ ] support for iterators
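For the circular-reference item, a guard along these lines (entirely hypothetical, not part of the PoC) could stop the infinite recursion that currently ends in a segmentation fault:

```php
<?php
// Hypothetical mitigation for the circular-reference segfault: track
// visited objects in an SplObjectStorage and emit a marker instead of
// recursing forever. Without a guard like this, the recursion only
// stops when PHP overflows the native stack.

class Node
{
    public ?Node $next = null;

    public function __construct(public string $label) {}
}

function encodeSafe(Node $node, ?\SplObjectStorage $seen = null): array
{
    $seen ??= new \SplObjectStorage();
    if ($seen->contains($node)) {
        // Cycle detected: stop descending. "**circular**" is an
        // arbitrary placeholder, not an established convention.
        return ['label' => $node->label, 'next' => '**circular**'];
    }
    $seen->attach($node);

    return [
        'label' => $node->label,
        'next'  => $node->next === null ? null : encodeSafe($node->next, $seen),
    ];
}

$a = new Node('a');
$b = new Node('b');
$a->next = $b;
$b->next = $a; // cycle: a -> b -> a

echo json_encode(encodeSafe($a)), PHP_EOL;
```

With jsonSerialize()-based cut points, the visited set would have to be threaded through the serialization context rather than passed as a function argument.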
But, what is a bit unexpected, the rest seems to work nicely. Can you see any potential pain points with such a solution? Do you think it is worth investing more time to improve it?
Best, scyzoryck