We've had an instructor with a very large class encounter a Django error: `Request body exceeded settings.DATA_UPLOAD_MAX_MEMORY_SIZE`.

While that setting can be increased, it's deliberately low: an attacker could perform a denial of service by sending extremely large requests.

Instead of sending all the changed data together, the request to save re-marked attempts should be broken up into smaller chunks. The simplest approach would be to make one request per changed attempt, but that would be slow. Would it be feasible to use the value of the `DATA_UPLOAD_MAX_MEMORY_SIZE` setting to decide how much data to send in each request?
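As a rough illustration of the idea, the client could greedily pack changed attempts into batches whose serialised size stays under a limit. This is only a sketch: the function name, the attempt objects, and the assumption that the server exposes its upload limit to the page (with some headroom left for headers and other form fields) are all hypothetical.

```javascript
// Hypothetical sketch: split an array of changed attempts into batches
// whose JSON-serialised size stays under maxBytes. maxBytes would come
// from the server (e.g. derived from DATA_UPLOAD_MAX_MEMORY_SIZE, minus
// headroom for request overhead) - that plumbing is assumed, not shown.
function chunkAttempts(attempts, maxBytes) {
    const chunks = [];
    let current = [];
    let currentSize = 2; // account for the enclosing "[]" of the batch

    for (const attempt of attempts) {
        const size = JSON.stringify(attempt).length + 1; // +1 for the comma
        // Start a new batch if adding this attempt would exceed the limit.
        // A batch always holds at least one attempt, even an oversized one.
        if (current.length > 0 && currentSize + size > maxBytes) {
            chunks.push(current);
            current = [];
            currentSize = 2;
        }
        current.push(attempt);
        currentSize += size;
    }
    if (current.length > 0) {
        chunks.push(current);
    }
    return chunks;
}
```

Each batch could then be sent as its own request, sequentially or with limited concurrency, so no single request body approaches the server's limit.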