luigiurbano / Reinforced-Wavsep

A reinforced version of the Wavsep evaluation platform.
GNU General Public License v3.0

Publish Docker Image #1

Closed: psiinon closed this issue 1 year ago

psiinon commented 1 year ago

Publishing rwavsep to Docker Hub would make it much easier to use. FYI, I published the old wavsep image here: https://hub.docker.com/r/owaspvwad/wavsep/
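For anyone who wants to try that image, here is a minimal sketch of pulling and running it locally; the exposed port and context path are assumptions based on common Tomcat defaults, not taken from this thread:

```sh
# Pull the old wavsep image linked above.
docker pull owaspvwad/wavsep

# Run it; port 8080 and the /wavsep context path are assumptions.
docker run -d --name wavsep -p 8080:8080 owaspvwad/wavsep

# The suite should then be reachable at http://localhost:8080/wavsep/ (assumed path).
```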

FYI we run ZAP against a range of vulnerable apps and publish the results: https://www.zaproxy.org/docs/scans/

We used to run ZAP against wavsep but stopped because it went unmaintained for so long. I'd love to start using it again, but it will take a bit of work on our side. So some questions for you:

Thanks for reanimating this project, and I hope you keep improving it!

giper45 commented 1 year ago

Hi @psiinon, thank you for your attention; I really like the ZAP scanner!
We will publish it on Docker Hub once all the test cases are completed.

I'll try to answer all the questions.

  1. Analyze the OWASP Benchmark and merge the two projects.
    As the OWASP Benchmark offers a lot of useful utilities, such as the crawler (which we also created in the utils folder) and the scorecard, we would like to explore the possibility of integrating the two benchmark platforms.

  2. Create a microservices-based evaluation platform.
    Wavsep and the OWASP Benchmark still have a lot of limitations, some of which are listed on the OWASP Benchmark page:

    • All vulnerability types in the OWASP Top 10
    • Does the tool find flaws in libraries?
    • Does the tool find flaws spanning custom code and libraries?
    • Does the tool handle web services? REST, XML, GWT, etc?
    • Does the tool work with different app servers? Java platforms?
    • JSPs
    • More popular frameworks
    • Inversion of control
    • Reflection
    • Class loading
    • Annotations
    • Popular UI technologies (e.g., JavaScript frameworks)
    • Entirely new languages (C#, Python, etc.)

Our final idea would be to formalize the benchmark framework design and create a multi-platform benchmark that covers several languages, several databases, and eventually both Windows and Linux operating systems.
The idea would be to leverage the flexibility and capabilities of containers to build a multi-service stack that covers all the test cases.
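To make the multi-service idea concrete, here is a minimal sketch using plain Docker commands; every image, network, and credential name below is a hypothetical placeholder, not part of the actual design:

```sh
# A shared network so the benchmark services can reach each other.
docker network create benchmark-net

# A database service for test cases that need one (hypothetical credentials).
docker run -d --name benchmark-db --network benchmark-net \
  -e MYSQL_ROOT_PASSWORD=changeme mysql:8

# One vulnerable-app service per language/platform under test (hypothetical image).
docker run -d --name benchmark-java --network benchmark-net \
  -p 8080:8080 example/benchmark-java-cases

# A management web application generating test suites and scorecards (hypothetical image).
docker run -d --name benchmark-manager --network benchmark-net \
  -p 9090:9090 example/benchmark-manager
```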

All the OWASP Benchmark features, such as the scorecard, parsing, etc., will be integrated into the platform. A management web application will allow the generation of a test suite, and users/researchers will be able to test their scanners and methodologies against it.

These are challenging tasks and will require a lot of effort. If you know a company or person interested in these ideas who could contribute in some way, please let me know.
Regards.

psiinon commented 1 year ago

Thank you for such a detailed reply - and it all sounds great! I love the idea of getting students to contribute test cases - it's a great learning opportunity for them. We'll try to get ZAP to test this daily ASAP, but for that we would need the Docker image on Docker Hub. Is there any reason you wanted to complete the test cases first? If you can publish one now, then we can test it with ZAP and give you feedback. Failing that, we'll have to publish it ourselves, which might take us longer 😉

giper45 commented 1 year ago

Hi @psiinon, I have just pushed it ;-)

I wanted to check the CMDi and XXE tests before publishing, but if you can test it, I will proceed to fix them!

psiinon commented 1 year ago

Many thanks!

giper45 commented 1 year ago

Hi @psiinon, I have implemented a feature for the OWASP BenchmarkUtils project that allows using the scorecards against other benchmarks, such as Wavsep. To use it:

psiinon commented 1 year ago

Hey @giper45, that sounds interesting. Last time I checked, the Benchmark scoring was not suitable for us to use. Uploading a results file and getting a report doesn't work for us. It would be much better to have a file which told us which vulns were present on which URLs - we could then check our own results and update our metrics that way. Happy to explain in more detail if you want...
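Something like the following would be ideal - one row per test case URL with the expected result, so we could diff our findings against it (the paths and columns here are made up purely for illustration):

```
url,category,vulnerable
/rwavsep/sqli/Case01.jsp,sqli,true
/rwavsep/sqli/FalsePositive-Case01.jsp,sqli,false
/rwavsep/xss/Case01.jsp,xss,true
```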

giper45 commented 1 year ago

Hey @psiinon, we use HAR files to implement the crawler and generate a CSV file compliant with the OWASP Benchmark. I start from the URLs and generate test case names, so I think the file you need can be implemented easily. If you want to give me more details, you can reach me on Telegram.
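As a rough illustration of that HAR-to-CSV step (the field paths are standard HAR 1.2, but the file names and the derived test-case naming below are assumptions for the sketch, not the actual implementation):

```sh
#!/bin/sh
# Extract request URLs from a HAR capture and emit a simple test-case CSV.
# HAR 1.2 stores each request URL under .log.entries[].request.url.
HAR_FILE="crawl.har"       # hypothetical input capture
OUT_FILE="testcases.csv"   # hypothetical output file

echo "testcase,url" > "$OUT_FILE"
jq -r '.log.entries[].request.url' "$HAR_FILE" | sort -u |
while read -r url; do
  # Derive a test case name from the last path segment, dropping any
  # query string; the naming scheme itself is an assumption.
  name=$(basename "${url%%\?*}")
  echo "$name,$url" >> "$OUT_FILE"
done
```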