RadianttK closed this issue 6 months ago.
I'm aware of this, but I'm not planning on changing it any time soon; I'm not really optimizing for the poor-connection use case at the moment. Again, I want to stress that the search is entirely client-side. The reason it works is that it loads everything at once. There is no server-side search component, and the API's search mechanism is completely different from the one on the site.
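For illustration, client-side search over a fully preloaded list boils down to something like the sketch below; the entry shape, field names, and endpoint path are assumptions made up for the example, not the site's actual data model.

```typescript
// Sketch of client-side search over a preloaded entry list.
// Entry shape and endpoint path are assumptions for illustration only.
interface Entry {
  id: number;
  name: string;
}

let entries: Entry[] = [];

// Everything is fetched once when the page loads...
async function loadEntries(): Promise<void> {
  const response = await fetch('/entries'); // hypothetical endpoint
  entries = await response.json();
}

// ...so searching is just an in-memory filter and never hits the server again.
function search(query: string): Entry[] {
  const needle = query.toLowerCase();
  return entries.filter((entry) => entry.name.toLowerCase().includes(needle));
}
```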
That being said:
For logged-out users there's already caching and compression set up, so it's not a big deal for them. One thing I might explore is server-side on-the-fly compression to reduce the amount of data transferred regardless of whether you're logged in or out, but other than that I'm not sure what else to do in the interim.
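On-the-fly compression here just means compressing each response as it's generated, rather than only serving precompressed cached assets. A rough sketch of the idea, using Express and the compression middleware purely as an example (this isn't the site's actual stack):

```typescript
// Illustrative only: Express + the `compression` middleware standing in for
// whatever the real server uses. Every response, including authenticated ones
// that bypass the static cache, gets compressed as it is sent.
import express from 'express';
import compression from 'compression';

const app = express();

// Negotiates gzip/deflate from the client's Accept-Encoding header and
// compresses the response body on the fly.
app.use(compression());

app.get('/entries', (_req, res) => {
  // Placeholder payload; the real endpoint would return the full entry list.
  res.json([{ id: 1, name: 'Example entry' }]);
});

app.listen(3000);
```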
Yeah, I agree deferred loading would not work with client-side searching. The searching would need to be moved to the server side, although I imagine that would be a fairly large undertaking.
I didn't realise that caching was already being done for logged-out users.
I've enabled on-the-fly compression for all responses as a trial. I'm not sure it's giving me much of a benefit personally, but it might be better for people with subpar connections:
Before: [screenshot of load timings]
After: [screenshot of load timings]
You'll see that for me it's actually slower (in terms of DOM load). I suspect that's because the server has to do more work, but the difference is negligible, so it doesn't matter much.
Hmm, interesting. For me it reduced the load time from ~7 seconds to ~2.5 seconds, the same as when I am not logged in. That's quite a large improvement!
I tried pinging jimaku.cc just in Command Prompt and got an average round-trip time of 237ms, so I assume it's basically as optimised as it can get now.
Thanks!
Yeah, ping is another factor. Right now the server is hosted in US Central, so for me the ping isn't too bad (around 25ms). Anyway, I'm glad this helped your loading speeds.
Context
ctrl + f
Ideas for possible solutions
Deferred loading
Data caching
Incremental loading
Data pagination (see the sketch after this list)
Server-side caching
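For reference, here is a hedged sketch of what the data pagination and incremental loading ideas could look like from the client's side; the endpoint, query parameters, page size, and entry shape are assumptions, not an existing API.

```typescript
// Sketch only: paginated fetching instead of loading every entry at once.
// Endpoint path, parameter names, and entry shape are assumptions.
interface Entry {
  id: number;
  name: string;
}

// Ask the server for one page of results at a time.
async function fetchPage(page: number, limit = 100): Promise<Entry[]> {
  const response = await fetch(`/entries?page=${page}&limit=${limit}`);
  return response.json();
}

// Incremental loading on top of pagination: keep requesting pages until the
// server returns fewer entries than the page size.
async function fetchAllIncrementally(limit = 100): Promise<Entry[]> {
  const all: Entry[] = [];
  for (let page = 0; ; page++) {
    const batch = await fetchPage(page, limit);
    all.push(...batch);
    if (batch.length < limit) {
      break;
    }
  }
  return all;
}
```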