SasanLabs / VulnerableApp

OWASP VulnerableApp Project: For Security Enthusiasts by Security Enthusiasts.
https://sasanlabs.github.io/VulnerableApp/
Apache License 2.0
298 stars 395 forks

Adding a new level in Unrestricted File Upload which doesn't have a check on the size of the uploaded file #351

Open preetkaran20 opened 2 years ago

preetkaran20 commented 2 years ago

Is your feature request related to a problem? Please describe.
Currently there is no level in Unrestricted File Upload that skips the check on the size of the uploaded file.

Describe the solution you'd like
Add a new level in Unrestricted File Upload with no size restriction, such that it can cause a denial of service in the vulnerable app. Spring Boot has a default upload size limit, so we need to disable that limit; see https://stackoverflow.com/questions/40271484/spring-upload-file-size-limit. If we disable it via application.properties, the change applies to all the levels, but we only want to lift the limit for this one level.
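
For reference, a minimal sketch of the global approach (roughly what the linked answer describes; class and bean names here are only illustrative, assuming Spring Boot 2.x with javax.servlet) is a MultipartConfigElement bean that removes the limits for the whole application, which is exactly what we want to avoid:

import javax.servlet.MultipartConfigElement;

import org.springframework.boot.web.servlet.MultipartConfigFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.unit.DataSize;

@Configuration
public class GlobalMultipartConfig {

    // -1 means "unlimited" for the servlet multipart config, so this lifts the
    // upload size limits for every endpoint in the application, not just one level.
    @Bean
    public MultipartConfigElement multipartConfigElement() {
        MultipartConfigFactory factory = new MultipartConfigFactory();
        factory.setMaxFileSize(DataSize.ofBytes(-1));
        factory.setMaxRequestSize(DataSize.ofBytes(-1));
        return factory.createMultipartConfig();
    }
}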

Adding a new level is very easy: just copy an existing level and adapt the details. For information on what each annotation/property does, see the javadocs at https://github.com/SasanLabs/VulnerableApp/blob/master/src/main/resources/sampleVulnerability/sampleVulnerability/SampleVulnerability.java#L86 and the design documentation at https://sasanlabs.github.io/VulnerableApp/DesignDocumentation.html#design.

preetkaran20 commented 2 years ago

If it is complex to implement in Spring Boot, then I would suggest adding this in PHP, as it is quite simple there. Have a look at: https://github.com/SasanLabs/VulnerableApp-php

PHP issue: https://github.com/SasanLabs/VulnerableApp-php/issues/14

ehizman commented 2 years ago

I would like to work on this as well.

preetkaran20 commented 2 years ago

@ehizman please look at https://github.com/SasanLabs/VulnerableApp/issues/350 for more information.

ehizman commented 2 years ago

Noted @preetkaran20

tkomlodi commented 1 year ago

@preetkaran20, could you assign this to me? I think I have a relatively clean solution.

In short, I'd override the Spring MultipartFilter bean to provide a custom MultipartResolver implementation for a specific controller path only. All other paths would keep using the default Spring resolver.

Besides this, it would only need a new Apache Commons dependency to provide the basis for the custom implementation: implementation group: 'commons-fileupload', name: 'commons-fileupload', version: '1.5'.

The bean definition would be something like this:

@Bean
@Order(0)
public MultipartFilter multipartFilter() {
    // Local MultipartFilter subclass: for the LEVEL_10 path it swaps in a
    // CommonsMultipartResolver with the size limits removed; every other path
    // falls back to the filter's default resolver lookup.
    class MF extends MultipartFilter {
        @Override
        protected MultipartResolver lookupMultipartResolver(HttpServletRequest request) {
            if ("/UnrestrictedFileUpload/LEVEL_10".equals(request.getServletPath())) {
                CommonsMultipartResolver multipart = new CommonsMultipartResolver();
                multipart.setMaxUploadSize(-1);        // no limit for the whole request
                multipart.setMaxUploadSizePerFile(-1); // no limit per uploaded file
                return multipart;
            } else {
                return lookupMultipartResolver(); // default resolver for all other paths
            }
        }
    }

    return new MF();
}

And of course, we would need a new vulnerable endpoint for "/UnrestrictedFileUpload/LEVEL_10".

I quickly tested this and it applies the default file size limits to all endpoints except the one specified above.

(The above code is only a prototype, not the proposed final solution.)

Thanks!

preetkaran20 commented 1 year ago

@tkomlodi Thanks for finding the solution. Assigning the ticket to you.

tkomlodi commented 1 year ago

@preetkaran20, I have a question about the approach for the vulnerable endpoint. I implemented it so that it accepts files of unlimited size and stores them in an in-memory list. This will cause the heap to run out of memory and start throwing OutOfMemoryErrors once enough large files have been uploaded.

I went this way because saving the files to disk could cause serious issues for the system hosting the application. E.g., I am not running the system in a container and don't want my laptop to run out of disk space; I assume this could cause issues for others too. The reason I'm accumulating the files in memory is to make it easier to cause an out-of-memory situation: if the files are not stored, only the memory used while handling a single request is under pressure, which may require a very large file to make it happen. The downside of accumulating the files is that the application will require a restart to recover.
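
To make the idea concrete, here is a rough sketch of the kind of endpoint I have in mind (plain Spring MVC with placeholder names; the actual level would be wired up through the project's level annotations and may differ in detail):

import java.io.IOException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class UnrestrictedSizeFileUploadSketch {

    // Uploaded payloads are deliberately retained in memory so that repeated
    // large uploads exhaust the heap and eventually trigger OutOfMemoryError;
    // the application needs a restart to recover.
    private static final List<byte[]> UPLOADED_FILES =
            Collections.synchronizedList(new ArrayList<>());

    @PostMapping("/UnrestrictedFileUpload/LEVEL_10")
    public ResponseEntity<String> upload(@RequestParam("file") MultipartFile file)
            throws IOException {
        // No size check here; the custom MultipartResolver above has already
        // removed the limits for this path.
        UPLOADED_FILES.add(file.getBytes());
        return ResponseEntity.ok("Uploaded " + file.getSize() + " bytes");
    }
}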

I'm not sure how scanners decide whether the allowed size of an uploaded file is unlimited or just set to be very large. So it is possible that simply allowing uploads of unlimited size would be sufficient to trigger an alert, and there would be no need to store the files at all.

Let me know what you think. Thanks