leikind / wice_grid

A Rails grid plugin to create grids with sorting, pagination, and (automatically generated) filters
MIT License

Server running out of memory when generating CSV #320

Closed adiakritos closed 8 years ago

adiakritos commented 8 years ago

I have over 5 million records in this database, and sometimes a user will try to export a CSV with close to 1 million records in it. When they do, the server runs out of what I believe is RAM, and then returns an error saying the application was unable to complete the request.

initialize_grid call code:

    @records = initialize_grid(
      Record,
      # bind the parameter instead of interpolating it into the SQL string
      conditions:           ['day_id = ?', params[:id]],
      name:                 'records',
      enable_export_to_csv: true,
      csv_file_name:        params[:id]
    )
    export_grid_if_requested

view helper code:


<%= grid(@records) do |r|

  r.column name: 'Agency', attribute: 'agency' do |day|
    day.agency
  end

  r.column name: 'Class', attribute: 'classcod' do |day|
    day.classcod
  end

  r.column name: 'Dept.', attribute: 'department' do |day|
    day.department
  end

  r.column name: 'Address', attribute: 'location', class: 'control-col' do |day|
    day.location
  end

  r.column name: 'Zip', attribute: 'zip' do |day|
    day.zip
  end

  r.column name: 'NAICS', attribute: 'naics' do |day|
    day.naics
  end

  r.column name: 'Contact', attribute: 'contact', class: 'control-col' do |day|
    day.contact
  end

  r.column name: 'MM-DD', attribute: 'date' do |day|
    day.date
  end

  r.column name: 'Year', attribute: 'year' do |day|
    day.year
  end

end -%>

Here is the log trace (Phusion Passenger, running behind Nginx):

[ 2016-08-08 13:24:50.0171 967/7fa479b2f700 age/Cor/CoreMain.cpp:819 ]: Checking whether to disconnect long-running connections for process 1745, application /var/opt/www/big-nerd/current/public
App 1921 stdout:
App 2041 stdout:
[ 2016-08-08 13:26:50.0583 967/7fa479b2f700 age/Cor/App/Poo/AnalyticsCollection.cpp:61 ]: ERROR: Cannot fork() a new process: Cannot allocate memory (errno=12)
  Backtrace:
     in 'void Passenger::ApplicationPool2::Pool::realCollectAnalytics()' (AnalyticsCollection.cpp:183)
     in 'static void Passenger::ApplicationPool2::Pool::collectAnalytics(Passenger::ApplicationPool2::PoolPtr)' (AnalyticsCollection.cpp:56)

[ 2016-08-08 13:26:52.3923 967/7fa47335e700 age/Cor/CoreMain.cpp:819 ]: Checking whether to disconnect long-running connections for process 1921, application /var/opt/www/big-nerd/current/public

Is there a way I can split up the request, or produce multiple CSV files, with the gem's default code? I've already increased the server's RAM, which fixed the specific request they were making, but it won't help with larger requests.
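One approach to the "multiple CSV files" idea, outside of wice_grid itself, is to page through the rows in fixed-size batches and start a new file every N rows, so no single pass holds a million records in memory. The sketch below uses a plain enumerable for self-containment; in the actual app the row source would be an ActiveRecord batch enumerator such as `Record.where(day_id: params[:id]).find_each` (the `Record` model and `day_id` column are taken from the issue; the helper name and file naming are assumptions).

```ruby
require 'csv'
require 'tmpdir'

# Sketch: split one huge export into several smaller CSV files.
# `records` is any enumerable of hashes keyed by column name; in Rails it
# would be a batched ActiveRecord enumerator (e.g. find_each), which never
# materializes the whole result set at once.
def export_in_chunks(records, headers, dir, rows_per_file: 1000)
  paths = []
  records.each_slice(rows_per_file).with_index do |chunk, i|
    path = File.join(dir, format('export_part_%03d.csv', i + 1))
    CSV.open(path, 'w') do |csv|
      csv << headers
      chunk.each { |row| csv << headers.map { |h| row[h] } }
    end
    paths << path
  end
  paths
end

# Hypothetical sample rows standing in for Record data.
rows = Array.new(5) { |i| { 'agency' => "Agency #{i}", 'zip' => i.to_s } }
dir = Dir.mktmpdir
parts = export_in_chunks(rows, %w[agency zip], dir, rows_per_file: 2)
puts parts.length  # 5 rows at 2 per file -> 3 files
```

The chunked files could then be zipped or offered as separate download links; this bypasses wice_grid's built-in exporter entirely.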

leikind commented 8 years ago

If you want to generate a CSV out of 5 million records, you'd be better off writing your own background job to generate the file; this plugin won't do it.
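The file-writing core of such a background job might look like the sketch below. In a Rails app this would sit inside an ActiveJob subclass (enqueued with `perform_later`) and the batch loop would iterate `Record.where(day_id: day_id).find_each`; here a plain enumerable and a tempfile stand in so the example is self-contained. All names beyond `Record`/`day_id` are assumptions, not part of wice_grid's API.

```ruby
require 'csv'
require 'tempfile'

# Sketch of a background-job body: stream rows into a CSV file on disk in
# batches, so memory use stays flat regardless of the export size. The job
# framework (ActiveJob, Sidekiq, etc.) would wrap this and notify the user
# when the file is ready for download.
def export_records_to_csv(records, headers, path, batch_size: 1000)
  CSV.open(path, 'w') do |csv|
    csv << headers
    records.each_slice(batch_size) do |batch|
      batch.each { |row| csv << headers.map { |h| row[h] } }
    end
  end
  path
end

# Hypothetical sample rows standing in for Record data.
rows = Array.new(3) { |i| { 'agency' => "A#{i}", 'year' => (2016 + i).to_s } }
file = Tempfile.new(['export', '.csv'])
export_records_to_csv(rows, %w[agency year], file.path, batch_size: 2)
puts File.read(file.path)
```

Because the work happens outside the request cycle, the web process never holds the full result set, and the request that triggers the export returns immediately.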