georgysavva / scany

Library for scanning data from a database into Go structs and more
MIT License

Pgxscan uses a lot of memory #75

Closed dmitriy-dev closed 2 years ago

dmitriy-dev commented 2 years ago

My app exports database rows to a CSV file. I used pprof to understand why the app uses so much memory, and I see this:

[pprof memory profile screenshot]

I tried comparing it with a similar package for MySQL, github.com/blockloop/scan, and got this result:

[pprof memory profile screenshot for blockloop/scan]

That is half the memory usage. Please tell me, how can I optimize my app to use less memory when working with this package?

georgysavva commented 2 years ago

Hey! The twofold difference with the other library is likely just an implementation detail. The real issue is that you want to download a lot of rows, and with either library, scanning them all into a slice at once wastes a lot of memory. In your case, you should use the RowScanner type so you can scan and handle (save to the CSV file) each row one by one. See the example here: https://github.com/georgysavva/scany/blob/b4db33e7332ae6f757006c9b984fac35aa963b8a/pgxscan/example_test.go#L89-L114 On line 109 you can do whatever you want with the individual row's data, e.g. save it to the CSV file.

dmitriy-dev commented 2 years ago

Ok, thanks. I'll try it.