Closed: franz101 closed this issue 4 years ago.
My bad, I had connected to Postgres using the MySQL dialect...
@franz101 No problem! Feel free to contact me whenever you have questions :nerd_face: Thank you for closing the issue.
Thanks for your kind reply @koxudaxi. This time I get a casting error from Postgres when inserting into a table with a JSONB column. I attached a possible solution at the bottom.
```python
import datetime

from sqlalchemy import Column, Date, MetaData, PrimaryKeyConstraint, String, Table
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.dialects.postgresql import insert as pg_insert

# player (the source dict), PJsonObj, and engine come from my application code.
player = PJsonObj(
    PK=player['id'],
    SK=player['SK'],
    data=player,
    updated=str(datetime.datetime.now()),
)

table = Table(
    "test_table",
    MetaData(),
    Column("PK", String, nullable=False),
    Column("SK", String, nullable=False),
    Column("data", JSONB),
    Column("updated", Date, nullable=True),
    PrimaryKeyConstraint("PK", "SK"),
)

engine.execute(pg_insert(table, player.as_dict()))
```
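For illustration only (not necessarily the right fix, and assuming the driver simply fails to adapt a plain dict for the JSONB column), one workaround sketch is to serialize the value yourself and cast the bind parameter explicitly; `player`, `table`, and `engine` are the objects from the snippet above:

```python
import json

from sqlalchemy import cast
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.dialects.postgresql import insert as pg_insert

values = player.as_dict()

# Pre-serialize the dict and add an explicit JSONB cast, so the parameter is
# sent as text and Postgres performs the conversion server-side.
values["data"] = cast(json.dumps(values["data"]), JSONB)

engine.execute(pg_insert(table).values(**values))
```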
This may be related to: https://github.com/chanzuckerberg/aurora-data-api/issues/3
@franz101 Thank you for posting the problem and workaround. I have created a new issue for this problem. I will resolve it.
Creating a table through SQLAlchemy with the following code:

```python
from sqlalchemy import Column, Integer, MetaData, String, Table

meta = MetaData()

Table(
    "pets",
    meta,
    Column("id", Integer, primary_key=True, autoincrement=True),
    Column("name", String(255), default=None),
)

meta.create_all(engine)
```
I get the error given above...
Also with:
`Pets.__table__.create(engine)`
Is this not implemented yet?
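(For reference, the `Pets` class is not shown above; a minimal declarative mapping consistent with the `pets` table definition would look something like the sketch below, with names assumed.)

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Pets(Base):
    __tablename__ = "pets"

    # Mirrors the Table() definition above.
    id = Column(Integer, primary_key=True, autoincrement=True)
    name = Column(String(255), default=None)
```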