simonw / sqlite-utils

Python CLI utility and library for manipulating SQLite databases
https://sqlite-utils.datasette.io
Apache License 2.0

Advanced class-based `conversions=` mechanism #402

Open simonw opened 2 years ago

simonw commented 2 years ago

The conversions= parameter works like this at the moment: https://sqlite-utils.datasette.io/en/3.23/python-api.html#converting-column-values-using-sql-functions

db["places"].insert(
    {"name": "Wales", "geometry": wkt},
    conversions={"geometry": "GeomFromText(?, 4326)"},
)
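
The fragment is substituted into the generated INSERT in place of the usual ? placeholder for that column, so the statement ends up roughly like this:

INSERT INTO [places] ([name], [geometry]) VALUES (?, GeomFromText(?, 4326))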

This proposal is to support values in that dictionary that are objects, not strings, which can represent more complex conversions - spun out from #399.

New proposed mechanism:

from sqlite_utils.utils import LongitudeLatitude

db["places"].insert(
    {
        "name": "London",
        "point": (-0.118092, 51.509865)
    },
    conversions={"point": LongitudeLatitude},
)

Here LongitudeLatitude is a magical value which does TWO things: it sets up the GeomFromText(?, 4326) SQL function, and it handles converting the (-0.118092, 51.509865) tuple into a POINT({} {}) string.

This would involve a change to the conversions= contract: it would still accept a SQL string fragment, but it could also take an object which combines a SQL string fragment with a Python conversion function.

Best of all... this resolves the lat, lon vs. lon, lat dilemma because you can use from sqlite_utils.utils import LongitudeLatitude OR from sqlite_utils.utils import LatitudeLongitude depending on which you prefer!

Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030739566
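
A rough sketch of how such a magical value might be shaped (entirely hypothetical, nothing here is existing sqlite-utils API):

class LongitudeLatitude:
    # SQL fragment to use in place of the plain ? placeholder for this column
    sql = "GeomFromText(?, 4326)"

    @staticmethod
    def convert(value):
        # (longitude, latitude) tuple -> WKT text that GeomFromText() can parse
        longitude, latitude = value
        return "POINT({} {})".format(longitude, latitude)

insert() would then use .sql when building the statement and .convert() when preparing the bound parameter.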

simonw commented 2 years ago

From that thread, two extra ideas which it may be possible to support in a single implementation:

from sqlite_utils.conversions import LongitudeLatitude

db["places"].insert(
    {
        "name": "London",
        "lng": -0.118092,
        "lat": 51.509865,
    },
    conversions={"point": LongitudeLatitude("lng", "lat")},
)

And

db["places"].insert(
    {
        "name": "London",
        "point": LongitudeLatitude(-0.118092, 51.509865)
    }
)

simonw commented 2 years ago

So the key idea here is to introduce a new abstract base class, Conversion. The following comments sketch out what it needs to be able to do.

simonw commented 2 years ago

I like the idea that the contract for Conversion (or rather for its subclasses) is that it can wrap a Python value and then return both the SQL fragment - e.g. GeomFromText(?, 4326) - and the values that should be used as the SQL parameters.
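
A minimal sketch of that contract, with hypothetical method names:

from abc import ABC, abstractmethod

class Conversion(ABC):
    "Wraps a Python value and describes how to embed it in the generated SQL."

    @abstractmethod
    def sql_fragment(self):
        "SQL to use in place of the usual ? placeholder, e.g. GeomFromText(?, 4326)"

    @abstractmethod
    def params(self):
        "Values to bind to the ? placeholders inside that fragment"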

simonw commented 2 years ago

I think this is the code that needs to become aware of this system: https://github.com/simonw/sqlite-utils/blob/fea8c9bcc509bcae75e99ae8870f520103b9aa58/sqlite_utils/db.py#L2453-L2469

There's an earlier branch that runs for upserts which needs to be modified too: https://github.com/simonw/sqlite-utils/blob/fea8c9bcc509bcae75e99ae8870f520103b9aa58/sqlite_utils/db.py#L2417-L2440
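
Very roughly, that code builds the VALUES clause by substituting any conversion fragment in place of the default placeholder; a simplified approximation (not the actual sqlite-utils code):

columns = ["name", "point"]
conversions = {"point": "GeomFromText(?, 4326)"}

values_clause = ", ".join(conversions.get(column, "?") for column in columns)
sql = "INSERT INTO [places] ([{}]) VALUES ({})".format(
    "], [".join(columns), values_clause
)
# INSERT INTO [places] ([name], [point]) VALUES (?, GeomFromText(?, 4326))

A class-based conversion would need this code to transform the bound value as well as the placeholder text.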

eyeseast commented 2 years ago

I wonder if there's any overlap with the goals here and the sqlite3 module's concept of adapters and converters: https://docs.python.org/3/library/sqlite3.html#sqlite-and-python-types

I'm not sure that's exactly what we're talking about here, but it might be a parallel with some useful ideas to borrow.

simonw commented 2 years ago

Hah, that's interesting - I've never used that mechanism before so it wasn't something that came to mind.

They seem to be using a pretty surprising trick there that takes advantage of SQLite allowing you to define a column "type" using a made-up type name, which you can then introspect later.
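
For reference, the standard library mechanism looks something like this (standard sqlite3 API, simplified example):

import sqlite3

class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

# adapter: turns a Python value into something SQLite can store
sqlite3.register_adapter(Point, lambda p: "{};{}".format(p.x, p.y))

# converter: keyed on the made-up declared column type "point"
sqlite3.register_converter("point", lambda b: Point(*map(float, b.split(b";"))))

db = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_DECLTYPES)
db.execute("create table places (name text, location point)")
db.execute(
    "insert into places values (?, ?)", ("London", Point(-0.118092, 51.509865))
)
point = db.execute("select location from places").fetchone()[0]  # a Point again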

eyeseast commented 2 years ago

I've never used it either, but it's interesting, right? Feel like I should try it for something.

I'm trying to get my head around how this conversions feature might work, because I really like the idea of it.

simonw commented 2 years ago

I have an idea for how that third option could work - the one that creates a new column using values from the existing ones:

db["places"].insert(
    {
        "name": "London",
        "lng": -0.118092,
        "lat": 51.509865,
    },
    conversions={"point": LongitudeLatitude("lng", "lat")},
)

How about specifying that the values in that conversions= dictionary can be either a SQL fragment string (as now), a Conversion subclass or instance, or a callable that takes the row and returns a value or a Conversion instance?

Then you could do this:

db["places"].insert(
    {
        "name": "London",
        "lng": -0.118092,
        "lat": 51.509865,
    },
    conversions={
        "point": lambda row: LongitudeLatitude(
            row["lng"], row["lat"]
        )
    }
)

Something I really like about this is that it expands the abilities of conversions= beyond the slightly obscure need to customize the SQL fragment into something that can solve other data insertion cleanup problems too.
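
A hedged sketch of how insert() might tell the different kinds of values apart (hypothetical helper, reusing the sql_fragment()/params() names from the contract sketched above):

def prepare_value(conversion, row, column):
    # conversion is one value from the conversions= dictionary
    if isinstance(conversion, str):
        # plain SQL fragment, exactly as conversions= works today
        return conversion, row[column]
    if callable(conversion) and not isinstance(conversion, Conversion):
        # e.g. lambda row: LongitudeLatitude(row["lng"], row["lat"])
        conversion = conversion(row)
    # by now we expect a Conversion instance wrapping concrete values
    return conversion.sql_fragment(), conversion.params()

Variants like LongitudeLatitude("lng", "lat"), which store column names rather than values, would also need access to the row when producing their parameters.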

simonw commented 2 years ago

I'm going to write the documentation for this first, before the implementation, so I can see if it explains cleanly enough that the design appears to be sound.

eyeseast commented 2 years ago

What if you did something like this:


class Conversion:
    def __init__(self, *args, **kwargs):
        "Put whatever settings you need here"

    def python(self, row, column, value): # not sure on args here
        "Python step to transform value"
        return value

    def sql(self, row, column, value):
        "Return the actual sql that goes in the insert/update step, and maybe params"
        # value is the return of self.python()
        return value, []

This way, you're always passing an instance, which has methods that do the conversion. (Or you're passing a SQL string, as you would now.) The __init__ could take column names, or SRID, or whatever other setup state you need per row, but the row is getting processed with the python and sql methods (or whatever you want to call them). This is pretty rough, so do what you will with names and args and such.

You'd then use it like this:

# subclass might be unneeded here, if methods are present
class LngLatConversion(Conversion):
    def __init__(self, x="longitude", y="latitude"):
        self.x = x
        self.y = y

    def python(self, row, column, value):
        x = row[self.x]
        y = row[self.y]
        return x, y

    def sql(self, row, column, value):
        # value is now a tuple, returned above
        s = "GeomFromText(POINT(? ?))"
        return s, value

table.insert_all(rows, conversions={"point": LngLatConversion("lng", "lat")})

I haven't thought through all the implementation details here, and it'll probably break in ways I haven't foreseen, but wanted to get this idea out of my head. Hope it helps.

simonw commented 2 years ago

My hunch is that the case where you want to consider input from more than one column will actually be pretty rare - the only case I can think of where I would want to do that is for latitude/longitude columns - everything else that I'd want to use it for (which admittedly is still mostly SpatiaLite stuff) works against a single value.

The reason I'm leaning towards using the constructor for the values is that I really like the look of this variant for common conversions:

db["places"].insert(
    {
        "name": "London",
        "boundary": GeometryFromGeoJSON({...})
    }
)

simonw commented 2 years ago

The CLI version of this could perhaps look like this:

sqlite-utils insert a.db places places.json \
  --conversion boundary GeometryGeoJSON

This will treat the boundary key as GeoJSON. It's equivalent to passing conversions={"boundary": GeometryGeoJSON}

The combined latitude/longitude case here can be handled by combining this with the existing --convert mechanism.

Any Conversion subclass will be available to the CLI in this way.

eyeseast commented 2 years ago

Yeah, the CLI experience is probably where any kind of multi-column, configured setup is going to fall apart. Sticking with GIS examples, one way I might think about this is using the fiona CLI:

# assuming a database is already created and has SpatiaLite
fio cat boundary.shp | sqlite-utils insert spatial.db boundaries --conversion geometry GeometryGeoJSON -

Anyway, very interested to see where you land here.

psychemedia commented 2 years ago

My hunch is that the case where you want to consider input from more than one column will actually be pretty rare - the only case I can think of where I would want to do that is for latitude/longitude columns

Other possible pairs: unconventional date/datetime and timezone pairs, e.g. 2022-02-16::17.00, London; or more generally, numerical value and unit of measurement pairs (e.g. if you want to cast into and out of different measurement units using packages like pint), or currencies etc. Actually, in that case, I guess you may be presenting things that are unit typed already, and so a conversion would need to parse things into an appropriate, possibly two-column (value, unit) format.
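
As a rough illustration of the units case, pint can handle the parsing side (sketch only, assuming pint is installed; nothing here is sqlite-utils API):

import pint

ureg = pint.UnitRegistry()

def split_quantity(text):
    # parse e.g. "17.3 km" into a (value, unit) pair for two-column storage
    quantity = ureg(text)
    return quantity.magnitude, str(quantity.units)

split_quantity("17.3 km")  # (17.3, 'kilometer')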

rsyring commented 8 months ago

I wonder if there's any overlap with the goals here and the sqlite3 module's concept of adapters and converters: https://docs.python.org/3/library/sqlite3.html#sqlite-and-python-types

For some discussion of converters, see: https://github.com/simonw/sqlite-utils/issues/612