Open salilsaifuddin opened 2 months ago
pg_ivm cannot perform incremental maintenance efficiently for large writes to an underlying table, because the overhead of immediate maintenance would be large in this case.
If you would like to insert a large amount of data into an underlying table, I recommend disabling immediate maintenance by executing the refresh_immv() function with with_data = false before inserting the data. After that, call refresh_immv() with with_data = true to refresh the view data and re-enable immediate maintenance. Whether the refresh takes more or less time than immediate maintenance depends on the size of the data and the view definition, and for now it is hard to predict a priori.
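A minimal sketch of the workflow described above, assuming a hypothetical IMMV named `myview` defined over an underlying table `tbl` (both names are placeholders, not from the issue):

```sql
-- 1. Disable immediate maintenance; the IMMV becomes unpopulated,
--    so subsequent inserts into tbl skip per-row maintenance.
SELECT refresh_immv('myview', false);

-- 2. Bulk-load the data (any large INSERT or COPY works here).
INSERT INTO tbl SELECT * FROM staging_tbl;  -- staging_tbl is a placeholder

-- 3. Recompute the view contents once and re-enable immediate maintenance.
SELECT refresh_immv('myview', true);
```

This trades many small per-statement maintenance steps for one full recomputation at the end, which is usually the better deal for bulk loads.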
Is it possible to ingest data as a batch using pg_ivm? I am looking for a way to handle large volumes of data efficiently and would like to know whether pg_ivm supports batch processing.
@ibhaskar2 ,@Jamal-B, @yugo-n , @tatsuo-ishii , @hanefi