Closed: sheraff0 closed this issue 1 year ago.
So the question is: how do I make result_processor, defined on a TypeDecorator, work?
From column_expression, i.e. (ST_AsGeoJSON(ST_FlipCoordinates({col}))::jsonb->>'coordinates')::json, I'm getting a raw tuple (array):
[48, 25]
while I'm expecting the processed result from result_processor:
{"lat": 48, "lng": 25}
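The conversion being asked for can be sketched in plain Python (with a hypothetical helper name, `to_coords`; in the real setup this logic would live in the type's result hook):

```python
# Hypothetical helper illustrating the expected conversion: a
# GeoJSON-style [lat, lng] pair (already flipped by ST_FlipCoordinates
# on the SQL side) becomes a {"lat": ..., "lng": ...} mapping.
def to_coords(pair):
    lat, lng = pair
    return {"lat": lat, "lng": lng}

print(to_coords([48, 25]))  # {'lat': 48, 'lng': 25}
```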
Hi @sheraff0
As far as I can see, there are several issues in your example:
- The column_expression method should return an SQL expression, not a text clause.
- You should not override bind_processor; define a process_bind_param method instead.
- You should not override result_processor; define a process_result_value method instead.
All this put together would transform your example into something like this (it might need a few adjustments):
import json
from collections import namedtuple

from sqlalchemy import Column
from sqlalchemy import create_engine
from sqlalchemy import Integer
from sqlalchemy import MetaData
from sqlalchemy import func
from sqlalchemy.orm import declarative_base
from sqlalchemy.orm import sessionmaker
from sqlalchemy.types import TypeDecorator

from geoalchemy2 import Geometry

# Simple container for the processed coordinates
# (Coords was not defined in the original snippet).
Coords = namedtuple("Coords", ["lat", "lng"])

db_connection_url = "postgresql://gis:gis@localhost/gis"
engine = create_engine(db_connection_url, echo=True)
engine.update_execution_options(search_path=["gis", "public"])
metadata = MetaData()
session = sessionmaker(bind=engine)()
Base = declarative_base(metadata=metadata)
conn = session.connection()


class PointFieldType(TypeDecorator):
    impl = Geometry
    cache_ok = True

    def column_expression(self, col):
        return func.ST_AsGeoJSON(
            func.ST_FlipCoordinates(col, type_=self),
            type_=self,
        )

    def process_bind_param(self, value, dialect):
        coords_tuple = [*value.values()][::-1]
        return (
            f"SRID={self.impl.srid};"
            f"POINT({' '.join(map(str, coords_tuple))})"
        )

    def process_result_value(self, value, dialect):
        val = json.loads(value.data)
        lat, lng = val["coordinates"]
        return Coords(lat=lat, lng=lng)


class PointField(Base):
    __tablename__ = "pointfield"
    id = Column(Integer, primary_key=True)
    raw_geom = Column(Geometry(srid=4326, geometry_type="POINT"))
    geom = Column(PointFieldType(srid=4326, geometry_type="POINT"))


metadata.drop_all(conn, checkfirst=True)
metadata.create_all(conn)

# Create a new point instance
p = PointField()
p.raw_geom = "SRID=4326;POINT(5 45)"
p.geom = {"lat": 5, "lon": 45}

# Insert the point
session.add(p)
session.flush()
session.expire(p)

# Query the point
pt = session.query(PointField).one()
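The bind-side conversion in process_bind_param above can be checked in isolation with a plain-Python sketch (the helper name and the hard-coded srid parameter are illustrative; the real method reads self.impl.srid):

```python
# Sketch of the bind-side conversion: a {"lat": ..., "lon": ...} mapping
# becomes an EWKT string. The values are reversed so that longitude comes
# first, since PostGIS expects POINT(x y), i.e. POINT(lon lat).
def coords_to_ewkt(value, srid=4326):  # srid hard-coded for illustration
    coords_tuple = [*value.values()][::-1]
    return f"SRID={srid};POINT({' '.join(map(str, coords_tuple))})"

print(coords_to_ewkt({"lat": 5, "lon": 45}))  # SRID=4326;POINT(45 5)
```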
Right, process_bind_param works as it should.
process_result_value is invoked ONLY if column_expression is overridden in some way, and is ignored otherwise.
With column_expression hidden, i.e. renamed to
    def _column_expression(self, col):
the app gets raw SQL results:
    ... 'raw_geom': <WKBElement>, 'geom': <WKBElement> ...
no matter what I put into process_result_value. That's something I can't explain from reading the source code.
Thanks for the answer, it's been helpful!
Hmmm ok, I didn't know these details :) But now it works, right?
I mean, it's not an easy guess that process_result_value doesn't work without column_expression.
Yeah, I'm ok with it!
Yeah, indeed, SQLAlchemy's documentation is not very clear about this. Anyway, glad it helped :)
Hi geoalchemists!
I'm using TypeDecorator as a DRY approach to mapping Google coordinates to PostGIS WKT, and back.
- Applying bind_processor to incoming data, as it's cleaner and better done with Python comprehensions; the SQL-side bind_expression, as suggested in the example, is supposedly for more complicated cases.
- Querying the database with column_expression works nicely.
- Trying to put a final touch on the resulting values with result_processor seems to be ignored (I get raw data from column_expression).
- process_result_value somehow seems to be unavailable. Browsing through the source code unfortunately didn't give a clue how to activate this method for Geometry-derived types.
The example is for a single Point; the API also deals with Polygons and Multilines (nested tuples, without keys). It looks smarter to extend the ORM's type definitions rather than the API's (pydantic) schema for each model with geo fields.
Retrieving data with .scalars().all() (SQLAlchemy 2.0, asyncpg).
Thank you for the wonderful work!
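Since Polygon and Multiline coordinates are nested sequences of two-number pairs, the lat/lng flip discussed in this thread can be generalized with a small recursive helper (a sketch with a hypothetical name, independent of GeoAlchemy2; it mirrors on the Python side what ST_FlipCoordinates does in SQL):

```python
# Recursively flip every innermost [a, b] coordinate pair in a
# GeoJSON-style nested "coordinates" structure. The same function then
# covers Point, LineString/Multiline, and Polygon nesting depths.
def flip_coordinates(coords):
    # Innermost pair: two plain numbers -> swap them.
    if len(coords) == 2 and all(
        isinstance(c, (int, float)) for c in coords
    ):
        return [coords[1], coords[0]]
    # Otherwise recurse into the nested sequences.
    return [flip_coordinates(c) for c in coords]

print(flip_coordinates([48, 25]))  # [25, 48]
print(flip_coordinates([[[0, 1], [2, 3], [4, 5], [0, 1]]]))
# [[[1, 0], [3, 2], [5, 4], [1, 0]]]
```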