fastapi / sqlmodel

SQL databases in Python, designed for simplicity, compatibility, and robustness.
https://sqlmodel.tiangolo.com/
MIT License

How to use two different metadata properly to connect to two databases? #264

Open murata100 opened 2 years ago

murata100 commented 2 years ago

Example Code

from typing import Optional

from sqlmodel import Field, SQLModel
import sqlalchemy

metadata1 = sqlalchemy.MetaData()
metadata2 = sqlalchemy.MetaData()

# all_metadata = sqlmodel.SQLModel.metadata

class Hero(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    secret_name: str
    age: Optional[int] = None
    __table_args__ = (
        metadata1,  # This setting has no effect !! :(
    )

class Boss(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    age: Optional[int] = None
    __table_args__ = (
        metadata2,  # This setting has no effect !! :(
    )

engine1 = sqlalchemy.create_engine("database 1")
engine2 = sqlalchemy.create_engine("database 2")

metadata1.create_all(engine1)
metadata2.create_all(engine2)

## in alembic's env.py
# target_metadata = {
#    'engine1': mymodel.metadata1,
#    'engine2': mymodel.metadata2
#}

Description

Operating System

Linux

Operating System Details

No response

SQLModel Version

0.0.6

Python Version

3.10.0

Additional Context

No response

byrman commented 2 years ago

This seems to work, although I did not test it thoroughly:

from sqlalchemy import MetaData
from sqlmodel import SQLModel

class SQLModel1(SQLModel):
    metadata = MetaData()

class SQLModel2(SQLModel):
    metadata = MetaData()

...

SQLModel1.metadata.create_all(engine1)
SQLModel2.metadata.create_all(engine2)
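
The elided part would be the model definitions, each inheriting from one of the bases, for example (untested; Optional and Field imported as in the question):

class Hero(SQLModel1, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    secret_name: str
    age: Optional[int] = None

class Boss(SQLModel2, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    age: Optional[int] = None

# Hero is registered on SQLModel1.metadata and Boss on SQLModel2.metadata,
# so each create_all call only creates the tables for its own database.
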
murata100 commented 2 years ago

Thank you @byrman

I looked at the definition of the SQLModel class.

metadata is a class variable: metadata: ClassVar[MetaData]

Is the only way to handle multiple metadata objects to extend the SQLModel class?

class SQLModel(BaseModel, metaclass=SQLModelMetaclass, registry=default_registry):
    # SQLAlchemy needs to set weakref(s), Pydantic will set the other slots values
    __slots__ = ("__weakref__",)
    __tablename__: ClassVar[Union[str, Callable[..., str]]]
    __sqlmodel_relationships__: ClassVar[Dict[str, RelationshipProperty]]  # type: ignore
    __name__: ClassVar[str]
    metadata: ClassVar[MetaData]
...

https://github.com/tiangolo/sqlmodel/blob/8d1b6f079adad47cc242710f6cb1790a8ad8fbd2/sqlmodel/main.py#L478

byrman commented 2 years ago

Is the only way to handle multiple metadata objects to extend the SQLModel class?

In my experience there is often more than one solution to a problem. I don't even know if extending the class is the right way to do it: I only tested table creation and select. Let us know how it works out for you.

byrman commented 2 years ago

By the way, create_all accepts a list of tables, so you can control which tables are created where by passing different lists to this method. That might also be something for you to try.
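
For example (untested sketch; Hero and Boss are the models from the question, and the database URLs are placeholders):

from sqlmodel import SQLModel, create_engine

engine1 = create_engine("sqlite:///database1.db")
engine2 = create_engine("sqlite:///database2.db")

# everything still lives in the single SQLModel.metadata, but only the
# listed tables are created on each engine
SQLModel.metadata.create_all(engine1, tables=[Hero.__table__])
SQLModel.metadata.create_all(engine2, tables=[Boss.__table__])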

murata100 commented 2 years ago

For now, I use it with alembic. I don't use create_all in production.

env.py

from myapp import mymodel
target_metadata = {
      'engine1': mymodel.metadata1,
      'engine2': mymodel.metadata2
}
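
For reference, a rough, untested sketch of how Alembic's "multidb" template would consume such a dict; the [engine1] and [engine2] sections in alembic.ini are assumptions, and offline mode and error handling are omitted:

# env.py (condensed along the lines of alembic's "multidb" template)
from alembic import context
from sqlalchemy import engine_from_config, pool

from myapp import mymodel

target_metadata = {
    "engine1": mymodel.metadata1,
    "engine2": mymodel.metadata2,
}

def run_migrations_online():
    for name, metadata in target_metadata.items():
        # each database gets its own [engine1] / [engine2] section in alembic.ini
        engine = engine_from_config(
            context.config.get_section(name),
            prefix="sqlalchemy.",
            poolclass=pool.NullPool,
        )
        with engine.connect() as connection:
            context.configure(
                connection=connection,
                target_metadata=metadata,
                upgrade_token=f"{name}_upgrades",
                downgrade_token=f"{name}_downgrades",
            )
            with context.begin_transaction():
                context.run_migrations(engine_name=name)

run_migrations_online()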

There is probably no problem using it with Alembic.

But I'm a little hesitant to use SQLModel in production...

gsouveton commented 2 years ago

I'm facing the same problem, but additionally I would like to work with transactions. SQLAlchemy allows you to do this:

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# SomeMappedClass, SomeDeclarativeBase, some_mapper and some_table are
# placeholders for mapped classes, declarative bases, mappers and tables
Session = sessionmaker(binds={
    SomeMappedClass: create_engine('postgresql://engine1'),
    SomeDeclarativeBase: create_engine('postgresql://engine2'),
    some_mapper: create_engine('postgresql://engine3'),
    some_table: create_engine('postgresql://engine4'),
})

This allows you to use a single Session to process transactions. I saw that the SQLModel Session class basically wraps SQLAlchemy's. Is it possible to do something like this, with only one session for several engines and data models split among those engines, as SQLAlchemy allows?


engine1 = create_engine('postgresql://db1')
engine2 = create_engine('postgresql://db2')

# bind User operations to engine 1, Account operations to engine 2
# (User and Account are placeholder mapped classes; Session is the
# sessionmaker configured above)
Session.configure(binds={User: engine1, Account: engine2})
session = Session()
# .... work with accounts and users
session.commit()
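
In SQLModel terms, I imagine something like this (untested; Hero and Boss are the models from the original post, and the URLs are placeholders):

from sqlmodel import Session, create_engine

engine1 = create_engine("postgresql://db1")
engine2 = create_engine("postgresql://db2")

# sqlmodel.Session subclasses sqlalchemy.orm.Session, so it should accept
# the same per-class `binds` mapping
with Session(binds={Hero: engine1, Boss: engine2}) as session:
    session.add(Hero(name="Deadpond", secret_name="Dive Wilson"))
    session.add(Boss(name="Rusty-Man"))
    session.commit()
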
cctdev commented 2 years ago

Did anyone else ever figure this out? I'm looking to do something similar (two different FastAPI endpoints using SQLModel, each tied to a different database) but am struggling to find any documentation that would support doing this.

vn7n24fzkq commented 1 year ago

I finally found a simple way to create tables in different databases.

import uuid as uuid_pkg
from typing import Optional

from sqlalchemy.orm import registry
from sqlmodel import Field, SQLModel, create_engine

class DB1Base(SQLModel, registry=registry()):
    pass

class DB2Base(SQLModel, registry=registry()):
    pass

class DB1Table(DB1Base, table=True):
    uuid: Optional[uuid_pkg.UUID] = Field(default_factory=uuid_pkg.uuid4, primary_key=True)
    name: str = Field(nullable=False, index=True, unique=True)

class DB2Table(DB2Base, table=True):
    uuid: Optional[uuid_pkg.UUID] = Field(default_factory=uuid_pkg.uuid4, primary_key=True)
    name: str = Field(nullable=False, index=True, unique=True)

db1_engine = create_engine("postgresql://postgres@localhost:5432/db1")
db2_engine = create_engine("postgresql://postgres@localhost:5432/db2")

DB1Base.metadata.create_all(db1_engine)
DB2Base.metadata.create_all(db2_engine)
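
To wire this into FastAPI (re the question above), one session dependency per engine seems to work; an untested sketch, with route paths as placeholders:

from fastapi import Depends, FastAPI
from sqlmodel import Session, select

app = FastAPI()

def get_db1_session():
    with Session(db1_engine) as session:
        yield session

def get_db2_session():
    with Session(db2_engine) as session:
        yield session

@app.get("/db1-items")
def read_db1_items(session: Session = Depends(get_db1_session)):
    return session.exec(select(DB1Table)).all()

@app.get("/db2-items")
def read_db2_items(session: Session = Depends(get_db2_session)):
    return session.exec(select(DB2Table)).all()
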
gsouveton commented 1 year ago

What timing, @vn7n24fzkq! I spent the last few days on this and just came to the same solution!

Here is the explanation: by default, SQLModel uses a single registry (mapper), which means all class names are gathered in a single namespace, causing name collisions if several databases contain classes with the same name. That produces errors like this one:

"sqlalchemy.exc.InvalidRequestError: Multiple classes found for path "xxx" in the registry of this declarative base. Please use a fully module-qualified path."

Specifying a new registry instance when creating an SQLModel subclass seems to be the right way to solve this issue.
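
A quick, untested way to see what the separate registries buy you (DB1Base and DB2Base are the bases from the previous comment):

from sqlalchemy.orm import registry
from sqlmodel import SQLModel

class DB1Base(SQLModel, registry=registry()):
    pass

class DB2Base(SQLModel, registry=registry()):
    pass

# each base gets its own registry and therefore its own MetaData,
# so class names and tables no longer share one namespace
assert DB1Base.metadata is not DB2Base.metadata
assert DB1Base.metadata is not SQLModel.metadata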

jooh commented 1 year ago

I finally found a simple way to create tables in different databases.

This works!

If, like me, you were struggling to combine this with inheritance to extend models, the solution is to use mixins to patch in columns, as in #70.
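
A rough, untested sketch of that mixin idea (class and field names are placeholders):

import uuid as uuid_pkg
from typing import Optional

from sqlalchemy.orm import registry
from sqlmodel import Field, SQLModel

class DB1Base(SQLModel, registry=registry()):
    pass

class DB2Base(SQLModel, registry=registry()):
    pass

class UUIDMixin(SQLModel):
    # not a table itself; just contributes shared fields
    uuid: Optional[uuid_pkg.UUID] = Field(default_factory=uuid_pkg.uuid4, primary_key=True)

class DB1Item(UUIDMixin, DB1Base, table=True):
    name: str = Field(nullable=False, index=True, unique=True)

class DB2Item(UUIDMixin, DB2Base, table=True):
    name: str = Field(nullable=False, index=True, unique=True)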

tanaga9 commented 1 year ago

I would like to see the recommended approach described in the official documentation.