catrone3 closed this issue 9 months ago
Current thoughts on this (high level at the moment): move the getfullGraph function into its own file, and have that file run via cron or via a bash script that loops forever (this could be bad if done wrong). Then, instead of returning the two variables, write each variable to a temp file. Where the function is now, have it just load the data from the temp file. This would move the workload of creating the nodes and edges outside of the browser. I wonder if we could do something similar with the main graph as well, although I would like to see how this affects its load time first.
Never mind; after some messing around, I suspect it will not be quite that simple.
If you would like a data set to test your ideas with that we know is already timing out FPM, you can use mine; it is on GitHub: https://github.com/catrone3/shadowrunNotes
My current idea looks something like this:
- check whether the metadata.json file has changed by calculating its md5sum and storing it in an env variable
- if the file has changed, perform the linking, store the result in a temp file, and save the new md5sum
- on page load, load the content from the temp file and only perform the (re)linking if anything has changed (the md5sum differs)
There is still the problem that on the very first load of the page the linking needs to be done, and if this takes too long, FPM could run into a timeout. It is possible to move this action into another (child) process, but this comes with security issues, because PHP must be allowed to perform shell commands, which may be disabled in some environments (for good reasons).
Providing a cron job can be a reliable solution in the Docker setup, but for the non-Docker setup it is too system-dependent (you don't have cron in a Windows environment).
What about using the shell_exec command in PHP? You could run it with an & at the end to put the work in the background.
This would mean shell_exec (or other system calls) needs to be enabled in the PHP config, which I want to avoid for security reasons.
This is updated on the dev branch; I will merge it into main in the next few days.
Fixed in 1.5.7. There is also a Python script (https://github.com/secure-77/Perlite/blob/main/perlite/.scripts/create_GraphLinking.py) to build the necessary graph linking via a cron job if the PHP engine runs into a timeout.
Describe the bug When the links get too large in number, the system takes too long to load them all and times out.
To Reproduce Steps to reproduce the behavior:
Expected behavior The page to load.
Additional context Discussed this in the help section on Discord.