I have built my own simple PHP script to warm the cache as I can't use Siege on my server. Here's the simplified version of the crawler script:
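Since the original script isn't shown here, a minimal sketch of the kind of warmer described might look like this (the URL list, the "cachewarmer" user agent default, and the function name are illustrative, not the poster's actual code):

```php
<?php
// Minimal cache-warmer sketch: request each URL with a custom user
// agent. The body is discarded; the request itself warms the cache.
function warm(array $urls, string $userAgent = 'cachewarmer'): array
{
    $results = [];
    foreach ($urls as $url) {
        $context = stream_context_create([
            'http' => [
                'method'     => 'GET',
                'user_agent' => $userAgent,
                'timeout'    => 30,
            ],
        ]);
        // Suppress warnings on unreachable URLs; record success/failure.
        $body = @file_get_contents($url, false, $context);
        $results[$url] = ($body !== false);
    }
    return $results;
}
```

Usage would be something like `warm(['https://shop.example/some-category.html']);` with a real URL list.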
Basically, it requests each URL directly with the user agent set to "cachewarmer". Checking my access logs, I can confirm that the requests happen.
I am running this script on the same server that Magento is installed on. I have added my custom user agent to the "Crawler User Agents" field and applied the VCL after making these changes.
Yet when I run the script and afterwards open one of those URLs in the browser, the page still takes quite a while to load, and the Age header is always set to a very low number (between 0 and 3). Of course, I accessed another URL (e.g. the homepage) beforehand to make sure my browser session has the necessary cookies.
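For reference, the Age header check can also be done from PHP rather than the browser; this is a sketch assuming Varnish exposes the Age header (the header parsing is split out so it can be tested without a network request):

```php
<?php
// Extract the value of the Age header from a list of raw response
// header lines; returns null if no Age header is present.
function parseAge(array $headers): ?int
{
    foreach ($headers as $header) {
        if (stripos($header, 'Age:') === 0) {
            return (int) trim(substr($header, strlen('Age:')));
        }
    }
    return null;
}

// Fetch a URL and report its Age header (user agent is illustrative).
function cacheAge(string $url, string $userAgent = 'cachewarmer'): ?int
{
    $context = stream_context_create(['http' => ['user_agent' => $userAgent]]);
    // file_get_contents() populates $http_response_header in this scope.
    @file_get_contents($url, false, $context);
    return parseAge($http_response_header ?? []);
}
```

An Age of 0 on every request would confirm that each hit is going to the backend rather than being served from cache.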
By contrast, when I access these URLs in my browser twice, I do get the cached version on the second request, as expected. So it seems that the URLs requested by my custom PHP script are simply not being cached.
Is there something I'm overlooking here?