oliveiraped / Reportula

Reportula is a PHP-based web application that provides summarized statistics of Bacula backup jobs, clients, volumes, and directors that have already run. It obtains its information from your catalog database. This is a fairly high-level Bacula management tool. As one user put it: it is web-based, so it can be accessed from anywhere, and it packs a phenomenal amount of information into a single web page, which is very good design.

Features:

- Display last jobs' status
- Display volume usage by pool
- Full support for MySQL and PostgreSQL databases
- Show jobs that finished with errors in the last day, week, or month
- Show the condition of your volumes
- Show terminated jobs
- Search jobs on several conditions
- Detailed information on pools, volumes, storages, and clients
- Search options
- List the files stored in a job
- ACL and user login support
- Integration with Active Directory or LDAP servers
http://www.reportula.org
GNU General Public License v3.0

fails loading big job #22

Open scambra opened 10 years ago

scambra commented 10 years ago

If I try to see the status of a big job (1.54G, 24401 files), it fails. I had to increase the PHP execution time, because it was failing after hitting the maximum execution time of 30s. But now it gives this exception: https://gist.github.com/scambra/ba3e423f4d51b6cbbd99

This is the backtrace I got; I don't know why there are no line numbers:

42. Illuminate\Database\QueryException
…/vendor/laravel/framework/src/Illuminate/Database/Connection.php 625
41. Illuminate\Database\Query\Builder insert
<#unknown>0
40. Illuminate\Database\Eloquent\Builder __call
<#unknown>0
39. Illuminate\Database\Eloquent\Builder insert
<#unknown>0
38. Illuminate\Database\Eloquent\Model __call
<#unknown>0
37. app\models\Filessearch insert
<#unknown>0
36. app\controllers\FilesController files
<#unknown>0
oliveiraped commented 10 years ago

The model of that process is:

  1. User clicks "Job Status".
  2. Query the File table and copy that jobid's file names into a temporary table called filesearch.
  3. Datatables opens using the jobid on the filesearch table.

This way Datatables' "next" and "previous" features are faster, because they use the filesearch table instead of the Bacula File table.

In large Bacula deployments the File table is HUGE, so querying it directly takes a lot of time.
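The copy-then-paginate flow described above can be sketched outside Laravel. This is a minimal illustration using Python's sqlite3 with simplified stand-in tables (the real Bacula schema splits names across File, Filename, and Path tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Simplified stand-ins for Bacula's File table and Reportula's filesearch table.
cur.execute("CREATE TABLE file (jobid INTEGER, path TEXT, filename TEXT)")
cur.execute("CREATE TABLE filesearch (jobid INTEGER, path TEXT, filename TEXT)")
cur.executemany(
    "INSERT INTO file VALUES (?, ?, ?)",
    [(1, "/etc/", "hosts"), (1, "/var/", "log"), (2, "/tmp/", "x")],
)

jobid = 1
# Step 2: copy the job's file names into the small search table.
cur.execute(
    "INSERT INTO filesearch SELECT jobid, path, filename FROM file WHERE jobid = ?",
    (jobid,),
)

# Step 3: Datatables-style pagination now only ever touches filesearch.
page = cur.execute(
    "SELECT path, filename FROM filesearch WHERE jobid = ? "
    "ORDER BY path LIMIT 1 OFFSET 0",
    (jobid,),
).fetchall()
print(page)
```

The point of the design is that each "next"/"previous" page query scans a table containing only one job's files, not the whole catalog.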

You did well increasing the 30s maximum execution time to a bigger value in PHP. You may need to raise the PHP script's max memory limit as well.

About the QueryException: I will look into it and get back to you with more information.

For example, I have a job that stores 1 terabyte of information, and I have never managed to query that job with any Bacula web GUI.

scambra commented 10 years ago

What do you think about using INSERT INTO filessearch SELECT, even though raw SQL must be used for the insert:

diff --git a/app/controllers/FilesController.php b/app/controllers/FilesController.php
index 697ec69..5d70359 100755
--- a/app/controllers/FilesController.php
+++ b/app/controllers/FilesController.php
@@ -43,12 +43,9 @@ class FilesController extends BaseController
             $files = Files::select(array($this->tables['path'].'.path', $this->tables['filename'].'.name as filename','jobid'))
                   ->join($this->tables['filename'],$this->tables['file'].'.filenameid', '=', $this->tables['filename'].'.filenameid')
                   ->join($this->tables['path'],$this->tables['file'].'.pathid', '=', $this->tables['path'].'.pathid')
-                  ->where('jobid','=', $job)->remember(10)->get();
+                  ->where('jobid','=', $job);

-            $files = $files->toArray();
-            if (!empty($files)) {
-                $t= Filessearch::insert($files);
-            }
+           $filessearch->getConnection()->insert("INSERT INTO ".($filessearch->getTable())." (path, filename, jobid) ".$files->toSql(), array($job));
         }

         /* Mostra o log do Job */

It's working really fast here, I can send a PR if you like it.

oliveiraped commented 10 years ago

If you remove this line, you are going to lose the Laravel ORM cache feature: ->where('jobid','=', $job)->remember(10)->get();

Did you test this line on both MySQL and Postgres? $filessearch->getConnection()->insert("INSERT INTO ".($filessearch->getTable())." (path, filename, jobid) ".$files->toSql(), array($job));

scambra commented 10 years ago

I only tested on Postgres, although it's fairly standard SQL.

I know I lose the Laravel cache, but those records are no longer loaded into Laravel, so I don't think I lose anything important. Previously it took more than 30 seconds and then raised an exception; now it takes only a few seconds. I don't think the cache matters here.
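The performance difference comes from where the row copying happens. A minimal sketch with Python's sqlite3 (illustrative table names, not the real Bacula schema) contrasts the two approaches: fetching every row into application memory and re-inserting it, versus a single server-side INSERT ... SELECT where no rows cross into the application at all:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE file (jobid INTEGER, path TEXT, filename TEXT)")
cur.execute("CREATE TABLE filessearch (jobid INTEGER, path TEXT, filename TEXT)")
cur.executemany(
    "INSERT INTO file VALUES (?, ?, ?)",
    [(1, "/etc/", "hosts"), (1, "/var/", "log"), (1, "/usr/", "bin"), (2, "/tmp/", "x")],
)

# Original approach: pull all rows into the application, then insert them back.
# For a job with tens of thousands of files this costs two full transfers
# plus the application's memory for the whole result set.
rows = cur.execute(
    "SELECT jobid, path, filename FROM file WHERE jobid = ?", (1,)
).fetchall()
# cur.executemany("INSERT INTO filessearch VALUES (?, ?, ?)", rows)

# Proposed approach: one statement, executed entirely inside the database.
cur.execute(
    "INSERT INTO filessearch SELECT jobid, path, filename FROM file WHERE jobid = ?",
    (1,),
)
count = cur.execute("SELECT COUNT(*) FROM filessearch").fetchone()[0]
print(count)
```

INSERT ... SELECT is part of standard SQL and is supported by both MySQL and PostgreSQL, which is why the patch above should be portable across the two backends Reportula supports.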

I will try to install it on MySQL and test.

scambra commented 10 years ago

I probably can't test on MySQL until next week; I need to try it on a VM, because I don't want to mess up my production server.