MIT License

PocketBase benchmarks

This is a test PocketBase application with various benchmarks, intended to give a general idea of what you can expect from a basic PocketBase instance without extra performance tuning or deployment on an expensive server.

Results

Keep in mind that the tests are performed on the same machine where the PocketBase instance is deployed, so the app's performance can be slightly affected by the benchmark execution itself (most hosting providers have several protections in place, and at the moment I don't have the time to create a proper setup for running the tests from more than one IP).

There are several optimizations that can still be made, and the benchmarks will change in the future so that the tests can run as part of the development workflow to track regressions, but for now improving the overall PocketBase developer experience remains the higher priority.

Takeaways and things we'll have to improve

About the benchmarks

The application uses the develop branch of PocketBase.

The test database is ~180MB and has the following collections structure:

(db_erd: entity-relationship diagram of the test collections; see the image in the repo)

To emulate real usage scenarios as closely as possible, the tests are grouped in several categories:

Run the benchmarks

To run the benchmarks locally or on your server, you can:

  1. Install Go 1.18+ (if you haven't already)
  2. Clone/download the repo
  3. Run `GOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build` (see https://go.dev/doc/install/source#environment)
  4. Start the created executable by running `./app serve`
  5. Navigate to http://localhost:8090/benchmarks
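For a Linux amd64 host, the steps above could be combined as follows. This is a sketch: the repository URL is inferred from the project name, and `-o app` is assumed to match the `./app serve` invocation above (a plain `go build` names the binary after the module).

```shell
# 1-2. Install Go 1.18+, then fetch the repo (URL assumed from the project name)
git clone https://github.com/pocketbase/benchmarks.git
cd benchmarks

# 3. Build a static Linux binary; adjust GOOS/GOARCH for your target platform
GOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build -o app

# 4. Start the server (PocketBase serves on 127.0.0.1:8090 by default)
./app serve

# 5. Open http://localhost:8090/benchmarks in a browser or with curl
```

When building on the same machine you deploy to, the `GOOS`/`GOARCH` variables can usually be omitted.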

By default, all tests are executed in the previously mentioned order of the test categories, but you can also specify exactly which tests to run via the `run` query parameter, e.g. /benchmarks?run=custom,delete.
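Triggering the benchmarks from the command line against a locally running instance might look like this (the `custom` and `delete` category names are just the example values from above):

```shell
# Run all test categories in the default order
curl "http://localhost:8090/benchmarks"

# Run only the "custom" and "delete" test groups
curl "http://localhost:8090/benchmarks?run=custom,delete"
```

Quoting the URL matters here: without quotes, some shells would treat everything after `?` or `&` specially.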

The above will start the benchmarks in a new goroutine; once completed, the results are printed to the terminal and written to the benchmarks collection. Note that some of the tests are slow and may take a while to finish. We write the test results as collection records to work around various host providers' DDoS protections and restrictions (persistent connection limits, read/write timeouts, etc.).