litespeedtech/lscache_wp

LiteSpeed Cache for WordPress
http://wordpress.org/plugins/litespeed-cache/
GNU General Public License v3.0

The cssjs directory keeps growing #291

Open · closte opened this issue 3 years ago

closte commented 3 years ago

We noticed that the LiteSpeed Cache plugin continuously generates a huge amount of data in the /wp-content/litespeed/cssjs directory.

- Day 1: 5.8 MB (after clearing all LiteSpeed cache)
- Day 2: 1.8 GB
- Day 3: 3.4 GB
- Day 4: 5.6 GB (7,570 files)

The WordPress cron does not clear these files; only the LiteSpeed "Clear All Cache" option does.

The complete website's disk usage is around 1.5 GB, yet there is over 5.4 GB of cssjs data, and it grows every day. The website has a low number of posts/pages but around 2,000 WooCommerce products.

How and when do you clear the cssjs files? Do you re-use identical cssjs files, or do you generate a new combination for every URL?

closte commented 3 years ago

Update 1:

Most of those files are JS files, not CSS. It looks like you re-use the same CSS files but generate new JS per URL. You should re-use the JS files too, but it looks like you decided to generate a new file per URL because you merge inline JS blocks, which might differ from URL to URL?

If this is true, you should exclude the dynamic inline JS blocks and re-use the same merged/minified JS file. However, this is not as easy a task as it looks, because the order of the JS blocks is very important.
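
A minimal sketch of that idea, assuming it runs inside WordPress (where WP_CONTENT_DIR is defined) and using made-up function names, not the plugin's actual code: combine only the external scripts, key the output file by a hash of their URLs, and leave dynamic inline blocks untouched so their order is preserved.

```php
<?php
// Hypothetical sketch, not the plugin's real implementation.
// Combine only external scripts; inline <script> blocks stay in the
// page, so their original order relative to the markup is preserved.
function get_combined_js_path( array $external_src_urls ) {
	// Same ordered set of external scripts => same file name, so any
	// URL that loads the same scripts re-uses one merged file.
	$hash = md5( implode( "\n", $external_src_urls ) );
	return WP_CONTENT_DIR . '/litespeed/cssjs/' . $hash . '.js';
}

function build_combined_js( array $external_src_urls ) {
	$path = get_combined_js_path( $external_src_urls );
	if ( ! file_exists( $path ) ) {
		$merged = '';
		foreach ( $external_src_urls as $url ) {
			// Order matters: concatenate scripts in page-load order.
			$merged .= file_get_contents( $url ) . ";\n";
		}
		file_put_contents( $path, $merged );
	}
	return $path;
}
```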

usabe commented 3 years ago

Hello closte, thank you for the feedback and comments. Actually, this is a known phenomenon that has happened in the past. Please refer to this page for an explanation and solution: https://docs.litespeedtech.com/lscache/lscwp/ts-optimize/#disk-space-filling-fast

MAGODMA96 commented 3 years ago

How many CDN mappings do you have active?

closte commented 3 years ago

> Hello closte, thank you for the feedback and comments. Actually, this is a known phenomenon that has happened in the past. Please refer to this page for an explanation and solution: https://docs.litespeedtech.com/lscache/lscwp/ts-optimize/#disk-space-filling-fast

We know, but end clients don't know this unless they are monitoring disk usage, and they may come to some strange conclusion.

closte commented 3 years ago

> How many CDN mappings do you have active?

The issue is unrelated to the CDN features.

MAGODMA96 commented 3 years ago

I had this problem, and it was related to the CDN mapping. The main cause was that the site had more than one CDN mapping configured, which provokes the issue because only one is allowed.

closte commented 3 years ago

> I had this problem, and it was related to the CDN mapping. The main cause was that the site had more than one CDN mapping configured, which provokes the issue because only one is allowed.

Maybe, but in this use case the website is not using the LiteSpeed CDN feature.

hi-hai commented 3 years ago

In v3.6.2 there will be a check on the Page Optimize page that gives a warning if there are more records than expected: https://github.com/litespeedtech/lscache_wp/blob/master/tpl/page_optm/entry.tpl.php#L16

If needed, we can add another check that automatically bypasses generation once the count goes beyond a certain number.
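
For illustration, a rough sketch of such a bypass check; the function name and the 5000 limit are assumptions, not actual plugin options.

```php
<?php
// Hypothetical sketch of the bypass idea; the limit is an assumption,
// not a real plugin option.
function should_bypass_cssjs_generation( $limit = 5000 ) {
	$files = glob( WP_CONTENT_DIR . '/litespeed/cssjs/*' );
	// Stop generating new combined files once the directory already
	// holds more entries than expected.
	return is_array( $files ) && count( $files ) > $limit;
}
```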

closte commented 3 years ago

I don't fully understand the code, but I think it will not work well. You could have a cron job that compares the JS files and, if one file is very similar to multiple other JS files, shows the warning.

Think of it like this: if the abc.js file has more than 10 similar files with 95% similar text, show the warning. PHP function: similar_text() (https://www.php.net/manual/en/function.similar-text.php). The check must run via cron because the website might have a lot of files.
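
A rough sketch of that cron check, comparing one reference file against the rest of the directory; the function name and thresholds here are illustrative, not plugin code.

```php
<?php
// Hypothetical cron task illustrating the suggestion above; the
// function name and thresholds are made up for the example.
function detect_near_duplicate_js( $dir, $min_percent = 95, $min_count = 10 ) {
	$files = glob( rtrim( $dir, '/' ) . '/*.js' );
	if ( ! is_array( $files ) || count( $files ) < 2 ) {
		return;
	}
	// Use the first file as the reference (the "abc.js" of the example).
	$reference = array_shift( $files );
	$ref_text  = file_get_contents( $reference );

	$similar = 0;
	foreach ( $files as $file ) {
		$percent = 0.0;
		similar_text( $ref_text, file_get_contents( $file ), $percent );
		if ( $percent >= $min_percent ) {
			$similar++;
		}
	}

	if ( $similar > $min_count ) {
		// similar_text() is O(n^3), so this must run via cron,
		// never during a normal page load.
		error_log( "cssjs: $reference has $similar files over {$min_percent}% similar" );
	}
}
```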

closte commented 3 years ago

> In v3.6.2 there will be a check on the Page Optimize page that gives a warning if there are more records than expected: https://github.com/litespeedtech/lscache_wp/blob/master/tpl/page_optm/entry.tpl.php#L16
>
> If needed, we can add another check that automatically bypasses generation once the count goes beyond a certain number.

Check my last message.

LarsK1 commented 3 years ago

@hi-hai It's the same issue for me: using the cache with a CDN, it allocated over 200 GB in 3 days. Is there any possibility of resolving this issue?