Open WilliamDEdwards opened 8 months ago
If my understanding is correct, you want a way to convert pydantic models into sqla models? Could you show an example of what it is that you're trying to achieve?
I would like to use `PydanticFactory`, then save the corresponding SQLAlchemy model to the database. According to the documentation, this can be achieved by adding a persistence handler to `PydanticFactory`. I would then do the conversion and save in the `save` method.
Aah gotcha. And how is that translation from pydantic model to SQL model happening?
For SQLAlchemy to Pydantic: https://docs.pydantic.dev/latest/concepts/models/#arbitrary-class-instances
The other way around, which is what I'm looking for:
```python
from pydantic import BaseModel, ConfigDict
from sqlalchemy import Integer, create_engine
from sqlalchemy.orm import Mapped, declarative_base, mapped_column, sessionmaker

engine = create_engine("mysql+pymysql://username:password@host/database")
database_session = sessionmaker(bind=engine)()

Base = declarative_base()


class TableOrm(Base):
    __tablename__ = "table"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)


class TablePydantic(BaseModel):
    model_config = ConfigDict(from_attributes=True)

    id: int


pydantic_schema = TablePydantic(id=1)
pydantic_schema_as_dict = pydantic_schema.model_dump(mode="json")  # mode ensures only JSON-serializable types
sqlalchemy_object = TableOrm(**pydantic_schema_as_dict)

database_session.add(sqlalchemy_object)
database_session.commit()
database_session.refresh(sqlalchemy_object)
```
Hm... I'm not 100% sure whether we should add this to the documentation. That being said, I do see the benefit of it, and I agree that it may be a relatively common use case (especially for those using FastAPI, like you said). So I'm kind of 50/50 on this. Anyone have any opinions, @litestar-org/members?
Summary
Hi,
I use FastAPI with a 'typical' structure: Pydantic schemas are converted to SQLAlchemy models.
For tests, SQLAlchemy models are currently created in fixtures. Obviously, as the number of models and validations grows, this doesn't scale. Therefore, I want to use per-test objects created by factories.
polyfactory looks ideal, but one thing is missing: I would like to pass a Pydantic schema (`PydanticFactory`), then convert it to an SQLAlchemy model and create it using a custom (CRUD layer) implementation. Reason: Pydantic schemas validate business logic that should not be violated in tests (which happens when SQLAlchemy models are created with only data validation).
This looks relatively easy to implement by passing a custom handler to `__sync_persistence__` on a factory (https://polyfactory.litestar.dev/latest/usage/configuration.html#persistence-handlers). (I still need to figure out how to subclass my factories from a base class so I don't have to repeat that...)

Instead of writing the implementation myself, would the maintainers be interested in adding a first-party supported example to the documentation? I could imagine this use case is quite common, as it is relevant for basically all larger FastAPI/Pydantic/SQLAlchemy projects.