I've tested this, and the only memory problem should be a writer created with the constructor `new DBFWriter(OutputStream)`. That constructor forces the whole file to be kept in memory (and not in an efficient format). This is not easy to fix for streams: some information, such as the number of records, is stored at the beginning of the file/stream, but it isn't known until the last record has been provided.
Using `DBFReader` and `DBFWriter(File)` should keep memory usage very low (only one row at a time).
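A minimal sketch of that pattern (assuming a javadbf version where `DBFWriter` exposes `close()`, which patches the record count into the header; the file paths are placeholders): copy a DBF file record by record so only one row is held in memory at a time.

```java
import com.linuxense.javadbf.DBFField;
import com.linuxense.javadbf.DBFReader;
import com.linuxense.javadbf.DBFWriter;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

public class DbfStreamCopy {
    public static void main(String[] args) throws Exception {
        File source = new File("input.dbf");   // hypothetical paths
        File target = new File("output.dbf");

        try (InputStream in = new FileInputStream(source)) {
            DBFReader reader = new DBFReader(in);

            // Copy the field definitions to the writer.
            DBFField[] fields = new DBFField[reader.getFieldCount()];
            for (int i = 0; i < fields.length; i++) {
                fields[i] = reader.getField(i);
            }

            // DBFWriter(File) appends each record to disk as it is added,
            // so memory stays flat regardless of file size.
            DBFWriter writer = new DBFWriter(target);
            writer.setFields(fields);

            Object[] row;
            while ((row = reader.nextRecord()) != null) {
                writer.addRecord(row);  // process the row here; don't accumulate rows in a list
            }
            writer.close();  // finalizes the header (record count) on disk
        }
    }
}
```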
Are you keeping a copy of all records in memory?
If you have enough memory, you can also try increasing the maximum heap for your process, for example: `java -Xmx4G`
I've prepared an example that creates a very big file (1.5 GB) and reads it back, without problems with `-Xmx64m`.
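Roughly what such a test could look like, as a sketch (the field layout and row count are made up here, and the `DBFField` setters follow the newer javadbf API; older releases may use different setter names): write a few million rows with `DBFWriter(File)`, then read them back one at a time.

```java
import com.linuxense.javadbf.DBFDataType;
import com.linuxense.javadbf.DBFField;
import com.linuxense.javadbf.DBFReader;
import com.linuxense.javadbf.DBFWriter;

import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

public class BigDbfTest {
    public static void main(String[] args) throws Exception {
        File file = new File("big.dbf");  // hypothetical path

        // One wide character field so the file grows quickly.
        DBFField name = new DBFField();
        name.setName("NAME");
        name.setType(DBFDataType.CHARACTER);
        name.setLength(200);

        DBFWriter writer = new DBFWriter(file);
        writer.setFields(new DBFField[] { name });
        for (long i = 0; i < 5_000_000L; i++) {
            writer.addRecord(new Object[] { "row-" + i });  // written straight to disk
        }
        writer.close();

        // Read it back one record at a time; no row list is kept,
        // so this should run fine with a small heap such as -Xmx64m.
        long count = 0;
        try (InputStream in = new FileInputStream(file)) {
            DBFReader reader = new DBFReader(in);
            while (reader.nextRecord() != null) {
                count++;
            }
        }
        System.out.println("records read: " + count);
    }
}
```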
Thank you for your reply, you reminded me. Yes, you are right: I shouldn't keep all the data in memory. Now that I have made that change, the test works. Thank you very much.
GC overhead limit exceeded
When the program reads a large DBF file, an exception occurs: GC overhead limit exceeded. Is there any API or other solution? I hope you can give me a suggestion.