Hi, I have a pandas DataFrame with around 30 lakh (3 million) records, but I am not able to write this large DataFrame to a .xpt file. The same script works, without any modification, for other DataFrames with 18-20 lakh records.
While writing I use the from_dataframe() function, and a couple of minutes into the write the process dies with a "Killed" message on the console.
System - RHEL Linux
Python - 3.7
Pandas version - 1.2
Using the xport module with the .v56 subpackage for writing.
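For anyone trying to reproduce this: since a bare "Killed" on Linux usually comes from the kernel OOM killer, a quick way to gauge whether the DataFrame itself is the problem is to check its in-memory size before the write. A minimal sketch with synthetic stand-in data (the real columns differ):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the real DataFrame; column names are hypothetical.
n = 100_000
df = pd.DataFrame({
    "id": np.arange(n),
    "value": np.random.rand(n),
    "label": ["row_%d" % i for i in range(n)],
})

# deep=True also counts the Python string objects, which often dominate
# the footprint for object-dtype columns.
size_bytes = df.memory_usage(deep=True).sum()
print(f"In-memory size: {size_bytes / 1024**2:.1f} MiB")
```

On the real 30-lakh DataFrame this number, multiplied by whatever intermediate copies the writer makes, is what has to fit in RAM.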
I need guidance in the following areas:
Is this a case of a memory leak?
Could bad data in the DataFrame be causing this?
Is there anything specific about writing large DataFrames to .xpt files that I should know?
What would be the correct approach to debugging a single built-in function?
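For the last point, my current idea is to wrap the suspect call with the standard-library tracemalloc module and compare its peak allocation against available RAM. The function below is just a stand-in for the actual write call:

```python
import tracemalloc

def suspect_function():
    # Stand-in for the real call (e.g. the xport write);
    # allocates a ~8 MB list so the peak is visible.
    return [0] * 1_000_000

tracemalloc.start()
result = suspect_function()
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"current={current / 1024**2:.1f} MiB, peak={peak / 1024**2:.1f} MiB")
```

Is this a reasonable way to narrow down which step of the write blows up, or is there a better tool for a single built-in/library function?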