Eagles-DevOps / MiniTwit


Add Sonarqube to the project #211

Closed rasmus-bn closed 5 months ago

Romes8 commented 6 months ago

Managed to run SonarQube with a rough initial local setup:

Screenshot of SonarQube working: (image attached)

TODO:

Setup: SonarQube needs its own database to work as intended.

  1. Added a SonarQube image to the Docker Compose file (local only for now), using the same user as for minitwitdb.

  2. Run `docker-compose up` via `./run-locally.sh`.

  3. When the images start, SonarQube won't run. This is because we don't have a database created for SonarQube yet.

    • An error can also occur here regarding `vm.max_map_count`. To solve it: `sudo sysctl -w vm.max_map_count=262144`
  4. Open DBeaver and manually create a database called "sonar"

  5. Restart the SonarQube container manually. It will pick up the database and work.

  6. Verify that SonarQube is alive by opening localhost:9000.

  7. Create a new custom project in SonarQube and follow the guide there. For the scanner option, choose local development.

  8. Follow the guide for local setup.

  9. Since we don't have SonarQube in production, we need to trigger the scanner locally.

  10. Once configured, you will be given a command similar to this one: `sonar-scanner.bat -D"sonar.projectKey=PROJECT_NAME" -D"sonar.sources=." -D"sonar.host.url=http://localhost:9000" -D"sonar.token=TOKEN"`

  11. Execute this command from CMD in the MiniTwit root folder.

  12. The scan should start and SonarQube will do its thing.

  13. Once finished, you will be able to see the results in the SonarQube UI.
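For reference, steps 1–5 could be sketched as a Compose service roughly like this. The service name `minitwitdb`, the user, and the password are placeholders based on this thread, not our actual file; `SONAR_JDBC_*` are the documented SonarQube image variables:

```yaml
services:
  sonarqube:
    image: sonarqube:community
    depends_on:
      - minitwitdb                        # assumed name of the existing Postgres service
    environment:
      SONAR_JDBC_URL: jdbc:postgresql://minitwitdb:5432/sonar
      SONAR_JDBC_USERNAME: minitwit       # same user as for minitwitdb (step 1)
      SONAR_JDBC_PASSWORD: changeme       # placeholder
    ports:
      - "9000:9000"
```

With a snippet like this, `./run-locally.sh` would bring SonarQube up alongside the existing services, and only the `sonar` database itself still has to exist (steps 3–5).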

This is a rough, initial way to make it work. Once we deploy the SonarQube image to production, we will trigger the scanner from a GitHub Actions pipeline.

Data is not being saved to volumes for now, so SonarQube won't persist the setup.
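If we later want the setup to persist, the volumes SonarQube's image documents could be added to the service in the compose file; a sketch (volume names are placeholders):

```yaml
services:
  sonarqube:
    volumes:
      - sonarqube_data:/opt/sonarqube/data              # analysis results, users, tokens
      - sonarqube_extensions:/opt/sonarqube/extensions  # installed plugins
      - sonarqube_logs:/opt/sonarqube/logs

volumes:
  sonarqube_data:
  sonarqube_extensions:
  sonarqube_logs:
```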

danielgron commented 5 months ago

Whether it actually makes sense to run it locally I'm not sure, but the way to make it work locally without the manual steps in 3, 4, 5 would be to make an init script for the postgres container.

https://hub.docker.com/_/postgres/

This one looks neat, but haven't tried it: https://dev.to/nietzscheson/multiples-postgres-databases-in-one-service-with-docker-compose-4fdf
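For reference, the official postgres image executes any `*.sql` or `*.sh` files mounted into `/docker-entrypoint-initdb.d/` when the data directory is first initialized, so the manual database creation in step 4 could become a one-line init script (filename and mount path on the host are placeholders):

```sql
-- init-sonar.sql
-- Mounted into the postgres container at /docker-entrypoint-initdb.d/init-sonar.sql;
-- runs only on the first start, when the data directory is empty.
CREATE DATABASE sonar;
```

In the compose file that would be a bind mount on the database service, e.g. `- ./init-sonar.sql:/docker-entrypoint-initdb.d/init-sonar.sql`. Note it won't run against an already-initialized volume.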

Romes8 commented 5 months ago

> Whether it actually makes sense to run it locally I'm not sure, but the way to make it work locally without the manual steps in 3, 4, 5 would be to make an init script for the postgres container.
>
> https://hub.docker.com/_/postgres/
>
> This one looks neat, but haven't tried it: https://dev.to/nietzscheson/multiples-postgres-databases-in-one-service-with-docker-compose-4fdf

I was testing it locally just to see how it works and what we need to make it run. When it comes to a new database, I think it's enough if we update our Terraform file for the database by adding a new section:

```hcl
resource "digitalocean_database_db" "database-example" {
  cluster_id = digitalocean_database_cluster.postgres-cluster.id
  name       = "sonar"
}
```

I may be wrong, but that's my idea. It would be nice if we could discuss this in person on Friday and make it work.

Romes8 commented 5 months ago

Solved by using SonarCloud. Everything was done there. No need to run locally and do stuff.