bnomei / kirby3-janitor

Kirby Plugin for running commands like cleaning the cache from within the Panel, PHP code, CLI or a cronjob
https://forum.getkirby.com/t/kirby3-janitor-plugin/23573
MIT License

How to avoid php max_execution_time timeout #47

Closed. tideg closed this issue 2 years ago

tideg commented 3 years ago

First of all thanks for this great plugin!

I built a panel button and a corresponding custom job that imports data from JSON and copies external image files to create new pages from it. Because it's a large data set, I run into a PHP max_execution_time timeout. Is there any mechanism inside Janitor to avoid that?

Thanks for any help in advance!

bnomei commented 3 years ago

You can try PHP's set_time_limit(0) before your code.
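
A minimal sketch of what that could look like inside a Janitor job (untested; the job name importjson and the import step are placeholders for your own):

```php
// site/config/config.php
'bnomei.janitor.jobs' => [
    'importjson' => function (Kirby\Cms\Page $page = null, string $data = null) {
        // lift the PHP time limit for this request (some hosts override or ignore this)
        set_time_limit(0);

        // ... your long-running JSON import / image copying logic here ...

        return [
            'status' => 200,
        ];
    },
],
```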

If your hosting does not allow this, consider running Janitor from the CLI (terminal shell) on the server. Trigger the Janitor job in the CLI using a cron job (see the wiki): https://github.com/bnomei/kirby3-janitor/wiki/Setup:-Secret-and-CRON

You can trigger cron jobs instantly, but it's a bit of a hassle (see option no. 2): https://stackoverflow.com/a/35138961

So you would need one job that does your logic, and another one, callable as a webhook, that triggers the first one using the code from Stack Overflow.

Alternatively, you can try https://github.com/lukaskleinschmidt/kirby-terminal

bnomei commented 3 years ago

Why CLI/terminal? It usually has no max execution time, or a very long one, even on shared hosting servers.
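
For example, a quick way to verify this on a server: PHP's CLI SAPI defaults max_execution_time to 0 (no limit), while the web SAPI typically defaults to 30 seconds.

```php
// run on the server via the CLI, e.g. `php -r 'echo ini_get("max_execution_time");'`
// the CLI prints 0, which means no time limit
echo ini_get('max_execution_time');
```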

tideg commented 3 years ago

Thanks for your suggestions @bnomei, I may try that. But am I correct in assuming that with this setup I am not able to output error/success messages from the logic job to the Panel (Janitor button)?

bnomei commented 3 years ago

I can't think of an easy way to achieve that with Janitor within the Panel. Maybe try the kirby-terminal plugin instead?

tideg commented 3 years ago

As you suggested @bnomei, I gave the kirby-terminal plugin a try. In the meantime, however, I have encountered another problem:

When calling Janitor via the CLI, operations that depend on user permissions aren't permitted since there is no authentication. Is there a way to call Janitor via the CLI with authentication? Something like the secret that can be used for API calls?

bnomei commented 3 years ago

Kirby uses sessions for auth, so that's not easy to replicate. What would be possible:

1) Allow basic auth in the Kirby config.
2) Call the Janitor "webhook with secret" URL with curl, but provide the basic auth headers yourself (via curl).
3) Inside the Janitor job, do auth based on Kirby's basic auth (a rough sketch of this step follows below the curl example).

https://getkirby.com/docs/guide/api/authentication#http-basic-auth

curl --user name:password http://www.example.com
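
And a rough, untested sketch of step 3, i.e. checking the basic auth credentials inside the Janitor job before doing permission-sensitive work. It assumes basic auth is enabled in the Kirby config via 'api' => ['basicAuth' => true]; the job name importjson is a placeholder.

```php
'bnomei.janitor.jobs' => [
    'importjson' => function (Kirby\Cms\Page $page = null, string $data = null) {
        // read the Authorization header of the incoming request
        $auth = kirby()->request()->auth();
        if (!$auth instanceof Kirby\Http\Request\Auth\BasicAuth) {
            return ['status' => 401];
        }

        // look up the user by email and check the submitted password
        $user = kirby()->users()->find($auth->username());
        try {
            if (!$user) {
                throw new Exception('unknown user');
            }
            $user->validatePassword($auth->password());
        } catch (Throwable $e) {
            return ['status' => 403];
        }

        // act as that user so permission checks behave like in the Panel
        kirby()->impersonate($user->email());

        // ... permission-sensitive import logic here ...

        return ['status' => 200];
    },
],
```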

bnomei commented 3 years ago

To build that curl call you could use a Janitor job from the Panel. Something like this; totally untested, but it should help get you started:

```yaml
# blueprint: a janitor button that passes the current panel user's id to the job
callcurl:
  type: janitor
  job: callmycurlwithuser
  data: "{{ user.id }}"
```

```php
// site/config/config.php
'bnomei.janitor.jobs' => [
    'callmycurlwithuser' => function (Kirby\Cms\Page $page = null, string $data = null) {
        $user = kirby()->user($data);
        $url  = "youroriginal.job/secret";
        // fill in $user->email() and the password (but not $user->password())
        $curl = "curl --user name:password -s " . $url . " > /dev/null";
        $success = exec($curl) !== false;

        return [
            'status' => $success ? 200 : 403,
        ];
    },
],
```

tideg commented 3 years ago

Thank you very much for your thoughts and help with this @bnomei! Unfortunately this is all quite cumbersome and doesn't feel as smooth as I'd like it to be. I'm also not sure if it is possible at all to call a PHP shell command from within PHP and thereby bypass the max_execution_time, which is what we are trying to achieve here, if I got that right. For now I have reduced the set of processed data entities and tell the user that there is more to import, which they can then do with further clicks on the Janitor panel button. This works reliably, but I'll try other approaches when I have the time and will report back here if I find another good solution.

Thank you so far.

bnomei commented 3 years ago

The CLI usually has a much longer execution time, even on shared hosting. And if you add a > /dev/null it should not wait when calling the command.
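
In practice, for PHP's exec() to return immediately the command usually needs to be backgrounded with & in addition to redirecting its output. A minimal sketch (the URL is a placeholder for your Janitor job URL):

```php
// fire-and-forget: redirect output and background the process so exec()
// returns right away instead of waiting for the long-running job
$url = 'https://example.com/your-janitor-job-url'; // placeholder
exec('curl -s ' . escapeshellarg($url) . ' > /dev/null 2>&1 &');
```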