When logged in as an instructor and viewing an open rapid response problem in LMS, our app polls the `/responses` endpoint every 3 seconds to get updated information about problem responses. While running load tests on rapid response problems (#16), it was discovered that the server can take a long time to serve these requests. This is a major UX problem: requests can run for a very long time (I observed times from 20s to 1m) and completely lock up the UI in the meantime. We should change these requests to be asynchronous so the UI at least remains interactive and the clock can continue ticking down. We should also set sensible timeouts for those requests.
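A minimal sketch of what the fix could look like, assuming a browser `fetch`-based client. The 10s timeout, the `withTimeout` helper, and the error handling are illustrative assumptions, not the app's actual code; only the `/responses` path and the 3-second interval come from the issue above.

```javascript
const POLL_URL = "/responses";       // endpoint polled by the instructor view
const POLL_INTERVAL_MS = 3000;      // existing 3-second polling cadence
const REQUEST_TIMEOUT_MS = 10000;   // assumed "sensible" timeout, tune as needed

// Race a promise against a timeout so a slow server can't hang a poll forever.
function withTimeout(promise, ms) {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error("timeout")), ms);
    promise.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); }
    );
  });
}

// Each poll is fully async, so the UI (and the countdown clock) stays responsive
// even when the server is slow; a timed-out poll is dropped and retried next tick.
async function pollResponses() {
  try {
    const resp = await withTimeout(fetch(POLL_URL), REQUEST_TIMEOUT_MS);
    const data = await resp.json();
    // ...update the response counts in the UI here...
    return data;
  } catch (e) {
    // Timed out or failed: skip this cycle rather than blocking the interface.
    return null;
  }
}

// setInterval(pollResponses, POLL_INTERVAL_MS);
```

With `setInterval`, overlapping slow requests are still possible; a variant that schedules the next poll via `setTimeout` only after the previous one settles would avoid piling up in-flight requests.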