ClearcodeHQ / pytest-postgresql

This is a pytest plugin that enables you to test code that relies on a running PostgreSQL database. It allows you to specify fixtures for the PostgreSQL process and client.
https://pypi.python.org/pypi/pytest-postgresql/
GNU Lesser General Public License v3.0

`postgresql_proc` is not working as expected. #895

Open Modjular opened 8 months ago

Modjular commented 8 months ago

### What action do you want to perform

I want to:

  1. Start up my engine once per session
  2. Create, then roll back, an ORM session once per function
    
```python
import pytest
from sqlmodel import Session, create_engine


@pytest.fixture(scope="session")
def engine(postgresql_proc):
    """Create a test engine for the session."""
    URL = (
        f"postgresql+psycopg://{postgresql_proc.user}:@"
        f"{postgresql_proc.host}:{postgresql_proc.port}/{postgresql_proc.dbname}"
    )
    engine = create_engine(URL)
    yield engine
    engine.dispose()


@pytest.fixture
def session(engine):
    """Create a test session for each function."""
    with Session(bind=engine) as session:
        yield session
        session.rollback()
```

### What are the results

```
sqlalchemy.exc.OperationalError: (psycopg.OperationalError) connection failed: FATAL: database "tests" does not exist
```

### What are the expected results
No error
fizyk commented 8 months ago

@Modjular the test database isn't created directly by postgresql_proc, but by the postgresql fixture.

Instead of the rollback-after-every-test approach, pytest-postgresql uses a database-template approach.

Try following these steps: https://github.com/ClearcodeHQ/pytest-postgresql?tab=readme-ov-file#using-sqlalchemy-to-initialise-basic-database-state I'll eventually add a working example that runs on CI here; it's already tested as a proof of concept on my other project, see https://github.com/fizyk/pyramid_fullauth/pull/721/files
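For reference, a minimal sketch of what that README section describes, adapted to the SQLModel setup from this issue. Assumptions beyond the source: `SQLModel.metadata` holds your table definitions, and the `load` callable receives `host`/`port`/`user`/`dbname`/`password` keyword arguments as in the README example:

```python
import pytest
from pytest_postgresql import factories
from sqlmodel import Session, SQLModel, create_engine


def load_database(**kwargs):
    """Initialise the template database once; every test gets a fresh copy of it."""
    url = (
        f"postgresql+psycopg://{kwargs['user']}:@"
        f"{kwargs['host']}:{kwargs['port']}/{kwargs['dbname']}"
    )
    engine = create_engine(url)
    SQLModel.metadata.create_all(engine)  # create tables in the template database
    engine.dispose()


# The process fixture loads the template; the client fixture clones it per test.
postgresql_proc = factories.postgresql_proc(load=[load_database])
postgresql = factories.postgresql("postgresql_proc")


@pytest.fixture
def session(postgresql):
    """A session bound to the per-test database created from the template."""
    url = (
        f"postgresql+psycopg://{postgresql.info.user}:@"
        f"{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}"
    )
    engine = create_engine(url)
    with Session(bind=engine) as session:
        yield session
    engine.dispose()
```

Because each test receives a fresh copy of the template database, there is no need to roll anything back at the end of the fixture.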

Modjular commented 8 months ago

@fizyk Thanks for the response.

Do you have any insight on the performance impact of running it this way? My test suite (30 small unit tests) used to run in under 500 ms, but after some changes (fixtures, scoping, psycopg2 to psycopg3) I've noticed my tests take around 4000 ms now.

It's definitely possible that any of the other factors are to blame, but am I crazy for wondering if it should be faster?

phillipuniverse commented 8 months ago

@Modjular I ran into the same surprise as you and wanted the db initialization/teardown to happen at the session level not the function level. It was pretty easy to make it work with the DatabaseJanitor:

```python
from typing import Iterator

import pytest
from pytest_postgresql.executor import PostgreSQLExecutor
from pytest_postgresql.janitor import DatabaseJanitor


@pytest.fixture(scope="session")
def postgres_db(postgresql_proc: PostgreSQLExecutor) -> Iterator[PostgreSQLExecutor]:
    # DatabaseJanitor creates the test database on entry and drops it on exit,
    # so one database serves the whole test session.
    with DatabaseJanitor(
        postgresql_proc.user,
        postgresql_proc.host,
        postgresql_proc.port,
        postgresql_proc.dbname,
        postgresql_proc.version,
        postgresql_proc.password,
    ):
        yield postgresql_proc
```

Then inject postgres_db instead of postgresql_proc where you need it.
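For example, the session-scoped engine fixture from the original post could then look like this (a sketch, reusing `create_engine` from the earlier snippet and injecting `postgres_db` in place of `postgresql_proc`):

```python
@pytest.fixture(scope="session")
def engine(postgres_db):
    """Engine pointing at the janitor-managed, session-scoped database."""
    URL = (
        f"postgresql+psycopg://{postgres_db.user}:@"
        f"{postgres_db.host}:{postgres_db.port}/{postgres_db.dbname}"
    )
    engine = create_engine(URL)
    yield engine
    engine.dispose()
```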

IMO a really nice enhancement would be the ability to specify the scope of the postgresql fixture via the factory, defaulting to function, like:

```python
postgresql_proc = factories.postgresql_proc(load=[load_database])

postgresql = factories.postgresql('postgresql_proc', scope="session")  # scope defaults to "function"
```
phillipuniverse commented 8 months ago

FYI I added https://github.com/ClearcodeHQ/pytest-postgresql/pull/897 to add support for a session-scoped postgresql fixture, removing the need for the explicit DatabaseJanitor context manager.

fizyk commented 8 months ago

@Modjular I did some performance checks when I introduced it, and the template-database approach should be on par with transaction rollback, with an additional plus: for the rollback approach to work, your code has to use the same session/connection as the test, and you cannot test the full transaction/rollback flow. With a template database, you can have separate connections/sessions and test the full transaction/rollback flow.
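To illustrate, here is a hypothetical test that only works with separate connections: `save_user` (a made-up function standing in for application code) commits on its own connection, and the test verifies the result through the independent `postgresql` fixture connection (psycopg 3 assumed):

```python
import psycopg


def save_user(url: str, name: str) -> None:
    """Hypothetical code under test: opens its own connection and commits."""
    with psycopg.connect(url) as conn:
        conn.execute("INSERT INTO users (name) VALUES (%s)", (name,))
        # leaving the with-block commits the transaction


def test_save_user_commits(postgresql):
    url = (
        f"postgresql://{postgresql.info.user}:@"
        f"{postgresql.info.host}:{postgresql.info.port}/{postgresql.info.dbname}"
    )
    postgresql.execute("CREATE TABLE users (id serial PRIMARY KEY, name text)")
    postgresql.commit()

    save_user(url, "alice")

    # Verify through the test's own, separate connection.
    row = postgresql.execute("SELECT name FROM users").fetchone()
    assert row == ("alice",)
```

Under the rollback approach, `save_user`'s commit would escape the test transaction (or be impossible to exercise at all); with the template database, the commit is real and the whole database is simply discarded after the test.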