This is the code used to run the 2024 SaTML LLM CTF. The code was developed from scratch by:
The app is a FastAPI web server with a MongoDB database and a Redis cache. The web server is served by Uvicorn. Everything runs in Docker via `docker compose`.
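To make the stack concrete, a `docker compose` file for these services typically looks something like the following. This is a hypothetical sketch only: the service names, images, and ports are assumptions for illustration, not the contents of the repo's actual compose files.

```yaml
# Hypothetical sketch of a compose file for this stack (FastAPI + MongoDB +
# Redis). The repo's real compose.yml / compose.prod.yml differ.
services:
  web:
    build: .                # FastAPI app, served by Uvicorn inside the image
    ports:
      - "8000:8000"         # assumed port; Uvicorn's default
    depends_on:
      - mongo
      - redis
  mongo:
    image: mongo:7          # MongoDB database
  redis:
    image: redis:7          # Redis cache
```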
Note that the platform was developed while the competition was running and the exact requirements were still being finalized, so not all design decisions are optimal.

We ran the application on a single Google Cloud VM with 64 GB of RAM and 32 vCPUs. This was enough for most of the competition, but during the busiest phases the app ran a bit too slow.
Some potential improvements that could be made to the platform (PRs welcome!) are:

- Refactor `app/schemas`: currently, there is some redundancy in the schema classes.
- Move from `docker compose` to Kubernetes or something similar for better scalability and reliability.
- Use `rye` to manage the Python project.

To set up the environment:

- Create a `.env` file with the same content as `.env.example`, and change the values as needed.
- Create a `.env.prod` file with the same content as `.env.example`, and change the values as needed.
- Create a `.secrets` folder with the same content as `secrets.example`, and change the values as instructed in each file.

Then run

```bash
docker compose --env-file .env.prod -f compose.prod.yml up --build -d
```
or

```bash
docker compose --env-file .env.prod -f compose.prod.yml up --build -d web
```

to start only the web service container. If the container(s) are already running, they will be rebuilt and restarted.
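The schema-redundancy improvement mentioned above can be sketched as a base-class refactor. The class and field names below are invented for illustration, and the pattern is shown with stdlib dataclasses to keep the sketch self-contained; the app's actual schemas are Pydantic models, where the same inheritance pattern applies.

```python
from dataclasses import dataclass, fields

# Before such a refactor, schemas like "DefenseCreate" and "DefenseRead"
# (hypothetical names) would each repeat the shared fields. Hoisting them
# into a base class removes the duplication:

@dataclass
class DefenseBase:
    name: str
    prompt: str

@dataclass
class DefenseCreate(DefenseBase):
    pass  # identical to the base; nothing repeated

@dataclass
class DefenseRead(DefenseBase):
    id: str  # only the extra field is declared here

# DefenseRead still exposes all three fields, in base-first order:
print([f.name for f in fields(DefenseRead)])  # → ['name', 'prompt', 'id']
```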
Useful commands:

- `docker compose up --build -d web` (development)
- `docker compose --env-file .env.prod -f compose.prod.yml up --build -d web` (production)
- `docker compose down`

Use `web` if you want to start only the app; otherwise, don't specify a service to start everything.
To follow the logs:

```bash
docker compose logs -f
```

The `-f` flag behaves like in `tail`.
- Lint with `ruff check --fix .`
- Format with `ruff format .`
- Check types with `mypy .`