dhombios closed this issue 4 months ago
If you look at the `read()` documentation, you'll see that you can read large files in chunks. In the sample code snippets, look at `test_reading_in_chunks()`.
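To illustrate the pattern `test_reading_in_chunks()` refers to, here is a minimal pure-Python sketch of chunked reading. The function name and the CSV format are illustrative assumptions, not this library's actual API:

```python
import csv
import io

def read_chunks(f, num_rows):
    """Yield lists of up to num_rows parsed rows until the file is exhausted.

    Illustrative helper, not the library's API: shows the general
    'read N rows at a time' pattern.
    """
    reader = csv.reader(f)
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) == num_rows:
            yield chunk
            chunk = []
    if chunk:  # final partial chunk (fewer rows than requested)
        yield chunk

# A 5-row file read in chunks of 2 yields chunks of sizes 2, 2, 1.
data = io.StringIO("a,1\nb,2\nc,3\nd,4\ne,5\n")
chunks = list(read_chunks(data, 2))
print([len(c) for c in chunks])  # → [2, 2, 1]
```

Because only one chunk is materialized per iteration, peak memory is bounded by the chunk size rather than the file size.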
Thanks for your answer. That was what I was looking for
If the number of rows in the actual file is smaller than the number of rows requested from the read function, does read still load the dataset, or does it raise an error?
It reads until `num_rows` rows have been read or the end of the file is reached. No errors will be thrown.
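A small sketch of that end-of-file behavior, using an illustrative reader (not this library's actual `read()` signature): requesting more rows than the file contains simply returns the rows that exist.

```python
import csv
import io

def read_rows(f, num_rows):
    """Read up to num_rows rows; returns fewer at end of file, never raises.

    Illustrative stand-in for a num_rows-style read call.
    """
    reader = csv.reader(f)
    rows = []
    for row in reader:
        rows.append(row)
        if len(rows) == num_rows:
            break
    return rows

# The file has only 2 rows, so asking for 100 returns 2 without error.
f = io.StringIO("x,1\ny,2\n")
print(len(read_rows(f, 100)))  # → 2
```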
Sometimes the amount of data that needs to be processed is bigger than the amount of RAM available. For such cases, Polars has a (still in development) streaming mode that reads, processes, and saves small chunks of data.
Is it possible to achieve something similar with this library?
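The read/process/save pattern described above can be emulated manually on top of any chunked reader, without any special streaming engine. A minimal pure-Python sketch (the CSV layout and function names are assumptions for illustration) that aggregates a file while keeping only one chunk in memory at a time:

```python
import csv
import io

def stream_sum(f, num_rows=1000):
    """Chunked aggregation: read num_rows rows, fold them into the running
    total, discard them, and repeat. Only one chunk is in memory at a time.
    """
    reader = csv.reader(f)
    total = 0
    chunk = []
    for row in reader:
        chunk.append(int(row[1]))  # assumes a numeric second column
        if len(chunk) == num_rows:
            total += sum(chunk)
            chunk = []
    total += sum(chunk)  # fold in the final partial chunk
    return total

f = io.StringIO("a,1\nb,2\nc,3\n")
print(stream_sum(f, num_rows=2))  # → 6
```

The same loop shape works for any associative reduction (sums, counts, min/max), and writing each processed chunk out as you go gives the read-process-save behavior Polars' streaming mode automates.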