Closed joamatab closed 2 years ago
@joamatab thanks for opening an issue here for these questions.
`DataBaseModel`s are a subclass of pydantic `BaseModel` and thus share the same constraints you might face when creating any model. Fortunately, custom classes, and consequently pandas `DataFrame` type objects, can be used in combination with the `Config` class attribute `arbitrary_types_allowed = True`, as documented here:
```python
from pandas import DataFrame
from pydbantic import DataBaseModel, PrimaryKey


class MyModel(DataBaseModel):
    id: str = PrimaryKey()
    frame: DataFrame

    class Config:
        arbitrary_types_allowed = True
```
In future releases I am considering defaulting `arbitrary_types_allowed` to `True`, to avoid the need to set this yourself. Since `DataFrame` type objects are serializable via `pickle`, they are stored as bytes in a binary field in the database.
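As a minimal sketch of the round-trip described above (the column names and values here are made up for illustration), a `DataFrame` with arbitrary columns can be pickled to `bytes` and restored without loss, which is what makes a binary database field sufficient:

```python
import pickle

import pandas as pd

# A DataFrame with arbitrary columns round-trips through pickle;
# the resulting bytes are what end up in the binary database field.
frame = pd.DataFrame({"wavelength": [1.50, 1.55], "power": [0.9, 0.8]})

blob = pickle.dumps(frame)     # bytes, suitable for a binary column
restored = pickle.loads(blob)  # back to an identical DataFrame

assert isinstance(blob, bytes)
assert restored.equals(frame)
```

The serialization and de-serialization happen for you when reading and writing the model; the sketch only shows why arbitrary columns are not a problem.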
- Q: How about metadata where the metadata has some JSON fields?
- A: If you expect the metadata fields to change or vary, set the field type to `dict`; if the metadata will always follow a set schema, consider creating a `BaseModel` for it. Both will be stored in binary database fields (serialized / de-serialized for you, of course).
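The two options above can be sketched with plain pydantic models (in practice the record classes would subclass `DataBaseModel`; the field names here are assumptions for illustration):

```python
from pydantic import BaseModel


# Option 1: fixed-schema metadata as a nested BaseModel.
class ExperimentMetadata(BaseModel):
    operator: str
    temperature_c: float


class TypedRecord(BaseModel):
    metadata: ExperimentMetadata


# Option 2: free-form, JSON-like metadata as a plain dict.
class FlexibleRecord(BaseModel):
    metadata: dict


typed = TypedRecord(
    metadata=ExperimentMetadata(operator="jo", temperature_c=23.5)
)
flexible = FlexibleRecord(
    metadata={"any": "shape", "nested": {"ok": True}}
)

assert typed.metadata.temperature_c == 23.5
assert flexible.metadata["nested"]["ok"] is True
```

The typed variant gives you validation and attribute access; the `dict` variant accepts whatever JSON shape arrives.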
@joamatab - anything else on this issue?
Thank you, Josh, for your work on pydbantic.
I want to store both my data and its metadata.
How do you recommend creating tables that can store pandas dataframes with arbitrary columns?
How about metadata where the metadata has some JSON fields?