digdir / dialogporten

Dialogporten - common API and metadata state store for digital dialogs
https://docs.altinn.studio/dialogporten
MIT License

Dialogporten

Getting started with local development

Mac

Prerequisites

Installing Podman (Mac)

  1. Install Podman

  2. Install dependencies:

    brew tap cfergeau/crc
    # https://github.com/containers/podman/issues/21064
    brew install vfkit
    brew install docker-compose
  3. Restart your Mac

  4. Finish setup in Podman Desktop

  5. Check that Docker Compatibility mode is enabled (see the bottom-left corner)

  6. Enable privileged mode for testcontainers-dotnet:
    echo "ryuk.container.privileged = true" >> $HOME/.testcontainers.properties

Windows

Prerequisites

Installing Podman (Windows)

  1. Install Podman Desktop.

  2. Start Podman Desktop and follow instructions to install Podman.

  3. Follow instructions in Podman Desktop to create and start a Podman machine.

  4. In Podman Desktop, go to Settings → Resources and run setup for the Compose Extension. This will install docker-compose.

Running the project

You can run the entire project locally using podman compose. (This uses docker-compose behind the scenes.)

podman compose up

The following GUI services should now be available:

The WebAPI and GraphQl services are behind an nginx proxy, and you can change the number of replicas by setting the scale property in the docker-compose.yml file.
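
As an alternative to editing the compose file, the scale can also be set on the command line. The service name below is an assumption; use the name defined in docker-compose.yml:

```bash
# Run three replicas of the WebAPI service behind the nginx proxy
# ("webapi" is a placeholder -- check docker-compose.yml for the actual service name)
podman compose up --scale webapi=3
```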

Running the WebApi/GraphQl in an IDE

If you need to debug the WebApi/GraphQl projects in an IDE, you can alternatively run podman compose without the WebAPI/GraphQl projects.
First, create a dotnet user secret for the DB connection string.

dotnet user-secrets set -p .\src\Digdir.Domain.Dialogporten.WebApi\ "Infrastructure:DialogDbConnectionString" "Server=localhost;Port=5432;Database=Dialogporten;User ID=postgres;Password=supersecret;Include Error Detail=True;"

Then run podman compose without the WebAPI/GraphQl projects.

podman compose -f docker-compose-no-webapi.yml up 
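
With the supporting services running, the WebAPI can be started from your IDE, or from the command line as a rough equivalent (it will use the user secret configured above for the connection string):

```bash
# Start the WebAPI against the locally running Postgres
dotnet run --project ./src/Digdir.Domain.Dialogporten.WebApi/
```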

DB development

This project uses Entity Framework Core to manage DB migrations. DB development can be done either through Visual Studio's Package Manager Console (PMC) or through the CLI.

DB development through PMC

Set Digdir.Domain.Dialogporten.Infrastructure as the startup project in Visual Studio's solution explorer, and as the default project in PMC. You are now ready to use EF Core tools through PMC. Run the following command for more information:

Get-Help about_EntityFrameworkCore

DB development through CLI

Install the CLI tool with the following command:

dotnet tool install --global dotnet-ef

You are now ready to use EF Core tools through the CLI. Run the following command for more information:

dotnet ef --help

Remember to target the Digdir.Domain.Dialogporten.Infrastructure project when running the CLI commands. Either target it with the -p option, e.g.

dotnet ef migrations add -p .\src\Digdir.Domain.Dialogporten.Infrastructure\ TestMigration

Or change your directory to the infrastructure project and then run the command.

cd .\src\Digdir.Domain.Dialogporten.Infrastructure\
dotnet ef migrations add TestMigration
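
To apply pending migrations to the local database, the standard EF Core command can be used. This assumes the Postgres container from the compose file is running and that a connection string is available to the design-time tooling:

```bash
# Apply all pending migrations, targeting the Infrastructure project
dotnet ef database update -p ./src/Digdir.Domain.Dialogporten.Infrastructure/
```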

Testing

Besides ordinary unit and integration tests, there are test suites for both functional and non-functional end-to-end tests implemented with K6.

See tests/k6/README.md for more information.
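
If you have the k6 CLI installed, a suite is typically started with k6 run; the script path below is only a placeholder, see tests/k6/README.md for the actual suites and required parameters:

```bash
# Placeholder path -- see tests/k6/README.md for the real suites and options
k6 run tests/k6/path/to/suite.js
```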

Updating the SDK in global.json

When RenovateBot updates global.json or the base image versions in the Dockerfiles, make sure they match: global.json should always specify the same SDK version as the base images in the Dockerfiles. This ensures that the SDK version used in the local development environment matches the one used in the CI/CD pipeline, where global.json is used when building the solution.
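
A quick way to eyeball that the two stay in sync is shown below; the image name assumes the standard .NET SDK base image, and the Dockerfile locations are not spelled out here:

```bash
# Compare the pinned SDK version with the Dockerfile base images
grep '"version"' global.json
grep -R --include=Dockerfile 'FROM mcr.microsoft.com/dotnet/sdk' .
```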

Development in local and test environments

To generate test tokens, see https://github.com/Altinn/AltinnTestTools. There is a request in the Postman collection for this.

Local development settings

We are able to toggle some external resources in local development. This is done through the appsettings.Development.json file. The following settings are available:

"LocalDevelopment": {
  "UseLocalDevelopmentUser": true,
  "UseLocalDevelopmentResourceRegister": true,
  "UseLocalDevelopmentOrganizationRegister": true,
  "UseLocalDevelopmentNameRegister": true,
  "UseLocalDevelopmentAltinnAuthorization": true,
  "UseLocalDevelopmentCloudEventBus": true,
  "UseLocalDevelopmentCompactJwsGenerator": true,
  "DisableShortCircuitOutboxDispatcher": true,
  "DisableCache": false,
  "DisableAuth": true
}

Toggling these flags will enable/disable the external resources. The DisableAuth flag, for example, will disable authentication in the WebAPI project. This is useful when debugging the WebAPI project in an IDE. These settings will only be respected in the Development environment.

Using appsettings.local.json

During local development, it is natural to tweak configurations. Some of these configurations are meant to be shared through git, such as the endpoint for a new integration that may be used during local development. Other configurations are only meant for a specific debug session or a developer's personal preferences, which should not be shared through git, such as lowering the log level below warning.

The configuration in the appsettings.local.json file takes precedence over all other configurations and is only loaded in the Development environment. Additionally, it is ignored by git through the .gitignore file.

If developers need to add configuration that should be shared, they should use appsettings.Development.json. If the configuration is not meant to be shared, they can create an appsettings.local.json file to override the desired settings.

Here is an example of enabling debug logging only locally:

// appsettings.local.json
{
    "Serilog": {
        "WriteTo": [
            {
                "Name": "Console",
                "Args": {
                    "outputTemplate": "[{Timestamp:HH:mm:ss.fff} {Level:u3}] {Message:lj}{NewLine}{Exception}"
                }
            }
        ],
        "MinimumLevel": {
            "Default": "Debug"
        }
    }
}

Adding appsettings.local.json to new projects

Add the following to the Program.cs file to load the appsettings.local.json file. Example usage:

```csharp
var builder = WebApplication.CreateBuilder(args);
// or var builder = CoconaApp.CreateBuilder(args);
// or var builder = Host.CreateApplicationBuilder(args);
// or some other builder implementing IHostApplicationBuilder

// Left out for brevity
builder.Configuration
    // Add local configuration as the last configuration source to override other configurations
    //.AddSomeOtherConfiguration()
    .AddLocalConfiguration(builder.Environment);

// Left out for brevity
```

Pull requests

For pull requests, the title must follow Conventional Commits. The title of the PR will be used as the commit message when squashing/merging the pull request, and the body of the PR will be used as the description.

This title is used to generate the changelog (using Release Please): fix will add an entry under "Bug Fixes", feat under "Features". All the others (chore, ci, etc.) will be ignored. (Example release)
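
For example, a PR titled feat: add dialog search filter would end up under "Features", while fix: handle missing end-user id would end up under "Bug Fixes"; both titles are made-up illustrations, not real changes in this repository.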

Deployment

This repository contains code for both infrastructure and applications. Configurations for infrastructure are located in .azure/infrastructure. Application configuration is in .azure/applications.

Deployment process

Deployments are done using GitHub Actions with the following steps:

1. Create and Merge Pull Request

2. Build and Deploy to Test

3. Prepare Release for Staging

4. Deploy to Staging (Bump Version and Create Tag)

5. Prepare deployment to Production

6. Deploy to Production

Visual Workflow

Deployment process

Release Please is used to create releases, generate the changelog, and bump version numbers.

CHANGELOG.md and version.txt are automatically updated and should not be changed manually.

Manual deployment (⚠️ handle with care)

This project uses two GitHub dispatch workflows to manage manual deployments: dispatch-apps.yml and dispatch-infrastructure.yml. These workflows allow for manual triggers of deployments through GitHub Actions, providing flexibility for deploying specific versions to designated environments.

Using dispatch-apps.yml

The dispatch-apps.yml workflow is responsible for deploying applications. To trigger this workflow:

  1. Navigate to the Actions tab in the GitHub repository.
  2. Select the Dispatch Apps workflow.
  3. Click on "Run workflow".
  4. Fill in the required inputs:
    • environment: Choose the target environment (test, staging, or prod).
    • version: Specify the version to deploy. This can be a git tag or a Docker tag published in packages.
    • runMigration (optional): Indicate whether to run database migrations (true or false).

This workflow will handle the deployment of applications based on the specified parameters, ensuring that the correct version is deployed to the chosen environment.
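
If you prefer the GitHub CLI over the web UI, the same inputs can be supplied from the command line. This is a sketch, not a documented procedure for this repository; the version value is a placeholder.

```bash
# Trigger the Dispatch Apps workflow with the gh CLI (version value is a placeholder)
gh workflow run dispatch-apps.yml \
  -f environment=test \
  -f version=1.23.4 \
  -f runMigration=true
```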

Using dispatch-infrastructure.yml

The dispatch-infrastructure.yml workflow is used for deploying infrastructure components. To use this workflow:

  1. Go to the Actions tab in the GitHub repository.
  2. Select the Dispatch Infrastructure workflow.
  3. Click on "Run workflow".
  4. Provide the necessary inputs:
    • environment: Select the environment you wish to deploy to (test, staging, or prod).
    • version: Enter the version to deploy, which should correspond to a git tag.

This workflow facilitates the deployment of infrastructure to the specified environment, using the version details provided.
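
The same GitHub CLI approach sketched above works here as well, using the inputs this workflow expects (the version value is again a placeholder):

```bash
# Trigger the Dispatch Infrastructure workflow with the gh CLI
gh workflow run dispatch-infrastructure.yml \
  -f environment=staging \
  -f version=1.23.4
```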

GitHub Actions

Naming conventions for GitHub Actions:

The action-check-for-changes.yml workflow uses the tj-actions/changed-files action to check which files have been altered since the last commit or tag. We use this filter to ensure that backend code or infrastructure is only deployed when the respective files have changed.

Infrastructure

Infrastructure definitions for the project are located in the .azure/infrastructure folder. To add new infrastructure components, follow the existing pattern found within this directory. This involves creating new Bicep files or modifying existing ones to define the necessary infrastructure resources.

For example, to add a new storage account, you would:

Refer to the existing infrastructure definitions as templates for creating new components.

Deploying a new infrastructure environment

A few resources need to be created before we can apply the Bicep to create the main resources.

The resources refer to a source key vault in order to fetch the necessary secrets and store them in the key vault for the environment. An SSH key is also necessary for the ssh-jumper used to access resources within the vnet in Azure.
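
If you need to generate the SSH key up front, a standard key pair works; the file name below is only a suggestion, not a project convention:

```bash
# Generate an SSH key pair for the ssh-jumper (file name is a suggestion)
ssh-keygen -t ed25519 -f ~/.ssh/dialogporten-ssh-jumper -C "dialogporten ssh-jumper"
```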

Use the following steps:

Connecting to resources in Azure

There is an ssh-jumper virtual machine deployed with the infrastructure. It can be used to create an SSH tunnel into the vnet. Use one of the following methods to gain access to resources within the vnet:

Ensure you log in to the Azure CLI with the relevant user and subscription using az login.
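
As an illustration of one such method, a plain SSH tunnel through the ssh-jumper could look like the sketch below. The key path, user name, host names and port are placeholders, not values from this repository:

```bash
# Forward local port 5432 to a Postgres server inside the vnet via the ssh-jumper
# (key path, user, host names and port are placeholders)
ssh -i ~/.ssh/dialogporten-ssh-jumper \
    -L 5432:<postgres-host-in-vnet>:5432 \
    <user>@<ssh-jumper-public-ip>
```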

Applications

All application Bicep definitions are located in the .azure/applications folder. To add a new application, follow the existing pattern found within this directory. This involves creating a new folder for your application under .azure/applications and adding the necessary Bicep files (main.bicep and environment-specific parameter files, e.g., test.bicepparam, staging.bicepparam).

For example, to add a new application named web-api-new, you would:

Refer to the existing applications like web-api-so and web-api-eu as templates.
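
Based on the description above, the resulting layout for the hypothetical web-api-new application would look roughly like this (the names follow the pattern described, not an exact listing from the repository):

```
.azure/applications/
  web-api-new/
    main.bicep
    test.bicepparam
    staging.bicepparam
```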

Deploying applications in a new infrastructure environment

Ensure you have followed the steps in Deploying a new infrastructure environment to have the resources required for the applications.

Use the following steps: