ritstudentgovernment / PawPrints

PawPrints petition application for the RIT community.
https://pawprints.rit.edu
Apache License 2.0

minimize the number of requests for ALL pawprints that are made to the backend #156

Closed: MoralCode closed this 1 year ago

MoralCode commented 1 year ago

This helps improve the performance of search.

It turns out that a request for ALL pawprints was being sent for every character the user typed in the search box.

This might not be the only problem, but having to process that much needless extra data may affect performance.

This fix changes it to only run a search when the user hits Enter.

This is a draft because it needs more work so it doesn't break existing users' workflows; they expect searches to happen automatically without having to hit Enter. We need a way to reliably detect when users stop typing so we can send ONE request instead of one per character. Maybe each keypress resets a 2-second timer, and if the timer hits 0, THEN the request is sent?
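For reference, here is a minimal sketch of that timer idea (a standard debounce). The names are illustrative, not taken from the PawPrints codebase; `runSearch` is a hypothetical stand-in for whatever actually sends the request:

```typescript
// Minimal debounce sketch: each keypress resets a timer, and the request
// only fires once the user has stopped typing for `delayMs` milliseconds.
let debounceHandle: ReturnType<typeof setTimeout> | undefined;

function onSearchInput(query: string, delayMs = 2000): void {
  if (debounceHandle !== undefined) {
    clearTimeout(debounceHandle); // user is still typing, restart the timer
  }
  debounceHandle = setTimeout(() => {
    runSearch(query); // exactly one request per "pause" in typing
  }, delayMs);
}

// Hypothetical request helper; the real app would send a websocket command here.
function runSearch(query: string): void {
  console.log(`searching for: ${query}`);
}
```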

This might be most useful in conjunction with @Sma-Das's branch to fix the search.

Sma-Das commented 1 year ago

Something I'm aiming to do with a caching mechanism is to preserve the "responsiveness" of searching character by character.

If there is excessive overhead from polling the cache, this change can be implemented.

MoralCode commented 1 year ago

are you trying to cache in the frontend or the backend?

it seems to me (given that the frontend already requests every pawprint) that all the actual searching and filtering is already happening on the frontend, so maybe the app can just detect when the user clicks into the search box, fetch all the pawprints once, and then search over that with each character typed.
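As a rough illustration of that "fetch once, filter locally" idea, here is a hedged sketch; the `Petition` shape, `fetchAllPetitions`, and the `/api/petitions` URL are assumptions made for the example, not the actual PawPrints API:

```typescript
// Sketch: fetch the full list once when the user focuses the search box,
// then filter the in-memory copy on every keystroke with no network traffic.
interface Petition {
  title: string;
  description: string;
}

let cachedPetitions: Petition[] | null = null;

// Called when the user clicks into the search box: fetch the full list once.
async function onSearchFocus(): Promise<void> {
  if (cachedPetitions === null) {
    cachedPetitions = await fetchAllPetitions();
  }
}

// Called on every keystroke: filter the in-memory list, no request sent.
function filterPetitions(query: string): Petition[] {
  const q = query.toLowerCase();
  return (cachedPetitions ?? []).filter(
    (p) => p.title.toLowerCase().includes(q) || p.description.toLowerCase().includes(q)
  );
}

// Hypothetical fetch helper; the real app talks to the backend over a websocket.
async function fetchAllPetitions(): Promise<Petition[]> {
  const res = await fetch("/api/petitions");
  return res.json();
}
```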

alternatively, if the goal is to simplify the frontend and maybe allow other clients to perform searches, a more proper approach might be a dedicated search command in either the REST or WebSocket APIs that eventually becomes a call against an indexed database or a caching/search layer like Elasticsearch/OpenSearch (haven't used them personally, just heard of them).

Just some ideas

Sma-Das commented 1 year ago

When I examined petitions.list in the debugger (Chrome/macOS), I observed that every new query (character) would send a WS command and repopulate petitions.list. The goal would be to create a cache with a time expiry of ~5 minutes.
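Here is a minimal sketch of what a ~5-minute time-expiring cache could look like; it is purely illustrative and not taken from the branch in question, and `fetchAll` stands in for the existing call that loads every petition:

```typescript
// Time-expiring cache sketch: entries older than `TTL_MS` trigger a refetch,
// otherwise repeated keystrokes reuse the in-memory list.
interface CacheEntry<T> {
  value: T;
  fetchedAt: number; // epoch milliseconds
}

const TTL_MS = 5 * 60 * 1000; // ~5 minute expiry, as described above
let petitionCache: CacheEntry<unknown[]> | null = null;

async function getPetitions(fetchAll: () => Promise<unknown[]>): Promise<unknown[]> {
  const now = Date.now();
  if (petitionCache !== null && now - petitionCache.fetchedAt < TTL_MS) {
    return petitionCache.value; // cache hit: no WS command sent
  }
  const fresh = await fetchAll(); // cache miss or expired: refetch once
  petitionCache = { value: fresh, fetchedAt: now };
  return fresh;
}
```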

I haven't measured the exact delays yet; the specifics of the search function, and optimizations within it, may potentially yield a decent bump in performance.

With respect to a caching or indexing database: while it's a good idea, I feel it's unneeded complexity. The current volume of petitions doesn't require that level of solution.

MoralCode commented 1 year ago

so the plan is to essentially go with the frontend solution? sounds good!

are you able to summarize what changes you made on your branch? the latest commit I see had a roughly ±800-line diff that seemed to be mostly whitespace/indentation changes, which made it hard to see what was actually modified.

closing because my change is unnecessary and I'll probably just end up building on top of your other branch.

Sma-Das commented 1 year ago

I was just renaming the loading-state variables. I think the formatter had a field day changing whitespace while I was pushing to get the commit done quickly.