Closed by philwareham 3 years ago
`stream_copy_to_stream()` or, failing that, a homegrown `fgets()` loop (or an equivalent with a streaming context) might well be faster. Usually these things are only of benefit for large files, and this content is nowhere near what anyone would call 'large'. The delay may well be attributable to the round-trip connection rather than to the streaming itself, but it's gotta be worth a shot. I'll try a few tests locally and see what happens.
EDIT: And we're already using a streaming context with `file_get_contents()`, so I'm not sure there are any speed benefits to be had here.
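For reference, the `stream_copy_to_stream()` approach mentioned above could be sketched like this. The function name `fetch_html()` and the in-memory destination are illustrative choices, not code from the project; a real call would pass the widget URL plus the existing stream context.

```php
<?php
// Sketch only: copy a remote (or local) resource into memory via
// stream_copy_to_stream() instead of file_get_contents().
function fetch_html(string $url, $context = null): string
{
    // Open the source with an optional stream context (e.g. HTTP timeout).
    $src = fopen($url, 'rb', false, $context);

    if ($src === false) {
        return '';
    }

    // Copy the source stream into an in-memory stream, then read it back.
    $dest = fopen('php://memory', 'r+b');
    stream_copy_to_stream($src, $dest);
    rewind($dest);
    $html = stream_get_contents($dest);

    fclose($src);
    fclose($dest);

    return (string) $html;
}
```

Whether this beats `file_get_contents()` in practice is exactly what would need measuring; both ultimately pay the same network round trip.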
@rwetzlmayr hi Robert, do you know how the above is measured by customer? i.e. Is it based on hits to the endpoint HTML?
I'd like to cache the HTML for a period of time if I can, but not if it is based on direct hits.
@philwareham I do not know whether this is measured by the customer but I don't think so...
Thanks Robert, I will test some short caching of the HTML (say, 10 minutes) and see if it improves matters.
OK, caching that item speeds the page up massively; was ~560ms, now ~110ms. Happy with that. If we get any kickback from the customer I can revisit (but it should hopefully be fine).
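The short-lived cache described above (10 minutes) could look something like the following. This is a hedged sketch, not the actual implementation: the function name, the temp-dir cache location, and the 600-second TTL are all assumptions.

```php
<?php
// Sketch: serve widget HTML from a local file cache for $ttl seconds,
// refetching from the remote endpoint only when the cache is stale.
function cached_widget_html(string $url, int $ttl = 600): string
{
    // Hypothetical cache location, keyed on the source URL.
    $cacheFile = sys_get_temp_dir().'/widget_'.md5($url).'.html';

    // Serve the cached copy if it is still fresh.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return (string) file_get_contents($cacheFile);
    }

    // Otherwise refetch and refresh the cache.
    $html = (string) file_get_contents($url);

    if ($html !== '') {
        file_put_contents($cacheFile, $html, LOCK_EX);
    }

    return $html;
}
```

The win here comes from skipping the remote round trip entirely on most requests, which matches the ~560ms → ~110ms improvement reported.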
We've ascertained that fetching the HTML for the web informer widget causes around 300ms of lag on the initial server response - pushing us into a non-pass on Core Web Vitals. I'd like to reduce that lag if possible but we can't cache the HTML, for reasons.
Currently the code to fetch this HTML is:
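(The original snippet did not survive extraction. Going by the later note that a streaming context is already in use with `file_get_contents()`, it was presumably along these lines; the URL, wrapper options, and function name below are all hypothetical.)

```php
<?php
// Presumed shape of the current fetch (details are assumptions, not the
// project's actual code): file_get_contents() with a stream context.
function fetch_widget_html(string $url): string
{
    $context = stream_context_create([
        'http' => [
            'method'  => 'GET',
            'timeout' => 5, // assumed timeout value
        ],
    ]);

    return (string) file_get_contents($url, false, $context);
}
```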
I'm wondering if there is any performance gain from using `stream_copy_to_stream()` instead of `file_get_contents()`? @bloke what do you think, and if so, can you give me a pointer on the code needed?