Closed moea closed 9 years ago
Oh, also - not using java.sql.Array was intentional; it didn't seem that relevant.
Thanks for the PR! I'll take a look, array support is definitely something I want to add.
Is there anything I could do to make it easier for you to take a look at including this?
Sorry for the delay, I really am interested in getting this merged.
Here are some ideas:
- <T> T[] getArray(String column, Class<T> typeClass)
- int[][] ids = row.get("ids", int[][].class)
- DataConverter could maybe look something like this:
```java
// DataConverter.java
public <T> T toArray(Optional<Class<T>> dimensionalArrayType, Oid oid, byte[] value) {
    switch (oid) {
        case INT2_ARRAY:
        case INT4_ARRAY:
        case INT8_ARRAY:
            return ArrayConversions.toArray(dimensionalArrayType, oid, value, NumericConversions::toLong);
        // ... clauses for the other array oids ...
    }
}
```
ArrayConversions would then handle parsing of the (nested) '{' and '}' and call the supplied function to create an object from each string value. The dimensionalArrayType parameter (e.g. int[][].class) can be absent, since the Clojure version doesn't care about types.
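A minimal sketch of that recursion might look as follows. All names here are hypothetical (the real ArrayConversions would also have to handle quoted strings, escapes and the per-OID delimiters); it only shows the '{'/'}' walk and the delegation of each scalar token to the supplied converter:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Hypothetical sketch: recursively parse a Postgres array literal such as
// "{{1,2},{3,NULL}}" into a nested Object[], delegating each scalar token
// to the supplied converter (e.g. Long::valueOf for integer arrays).
public class ArrayLiteralParser {

    public static Object[] parse(String literal, Function<String, Object> convert) {
        return parseLevel(literal, new int[]{0}, convert);
    }

    private static Object[] parseLevel(String s, int[] pos, Function<String, Object> convert) {
        List<Object> elements = new ArrayList<>();
        pos[0]++; // consume the opening '{'
        StringBuilder token = new StringBuilder();
        while (pos[0] < s.length()) {
            char c = s.charAt(pos[0]);
            if (c == '{') {
                elements.add(parseLevel(s, pos, convert)); // recurse into a nested dimension
            } else if (c == ',' || c == '}') {
                if (token.length() > 0) {
                    String t = token.toString();
                    elements.add(t.equals("NULL") ? null : convert.apply(t));
                    token.setLength(0);
                }
                pos[0]++;
                if (c == '}') {
                    return elements.toArray(); // this dimension is complete
                }
            } else {
                token.append(c);
                pos[0]++;
            }
        }
        return elements.toArray();
    }
}
```

The converter argument is what makes the missing-type case workable: the caller gets Object[] leaves back regardless of whether a dimensionalArrayType was supplied.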
How does this sound to you?
Hey - I'll need to re-familiarize myself with the code, but it sounds good to me. I'll take a look at making the changes this week.
I updated this to be in line with your comments. Thanks a bunch for the feedback.
Some notes/clarifications:

- The parsing still follows pgjdbc's fundamental implementation strategy.
- I changed DataConverter to do as you suggested, and added clauses handling arrays of all of the currently-supported types - with the exception of bytea[]. As the converter for that type returns a byte[], and buildArray gets its depth cues only from the database's values, I believe bytea[] would have to be handled with some pretty ugly special-casing.
- The array class passed to DataConverter.toArray, e.g. Long[][], amounts basically to a runtime assertion about the return values of the converter delegated to for individual elements. I implemented your example above of delegating INT2, INT4 and INT8 elements to NumericConversions::toLong; however, this means that users must call getArray(..., Long[].class), even though it may not be immediately obvious that an INT2 would map to a long (because it doesn't for single values). Maybe the types should be symmetric?
- The parsing entry point (in ArrayConversions) is typed as returning Object. I think this could only be made more specific if toArray were parameterised over the type of the elements as well as the type of the array, but I don't think it matters.
- Class.isArray is the clause in fromObject which determines whether to use ArrayConversions - accordingly, primitive arrays can be inserted.
- The least obvious piece is probably buildArrayList. If/when everything else makes sense, I can have a look at simplifying it.

I'm far from comfortable with Java's type system when things get complicated, so feel free to tell me how weird this is.
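The "runtime assertion" point can be illustrated with a small hypothetical sketch (the names here are mine, not the PR's): the requested array class only constrains the result reflectively, so a mismatch between it and the element converter's return type surfaces at the call site at runtime, not at compile time.

```java
import java.lang.reflect.Array;
import java.util.function.Function;

// Hypothetical sketch: build an array of the requested class reflectively,
// filling it with the element converter's results. If the converter returns
// e.g. Long but the caller asked for Integer[].class, Array.set throws
// IllegalArgumentException - the "runtime assertion" discussed above.
public class TypedArrayFetch {

    @SuppressWarnings("unchecked")
    public static <T> T toTypedArray(Class<T> arrayType, String[] rawElements,
                                     Function<String, Object> convert) {
        Object result = Array.newInstance(arrayType.getComponentType(), rawElements.length);
        for (int i = 0; i < rawElements.length; i++) {
            Array.set(result, i, convert.apply(rawElements[i])); // throws on type mismatch
        }
        return (T) result;
    }
}
```

For example, toTypedArray(Long[].class, ..., Long::valueOf) succeeds, while toTypedArray(Integer[].class, ..., Long::valueOf) fails at runtime - which is why an INT2 column has to be fetched as Long[].class under this scheme.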
P.S. isArray() as the fromObject guard is too specific if we want to be able to insert regular collections. There could be another clause which looks at interfaces.
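The broader guard suggested in the P.S. could be sketched like this (hypothetical helper, not code from the PR): dispatch on Collection as well as Class.isArray, so that List<Long> and friends are routed to the array encoder too.

```java
import java.util.Collection;

// Hypothetical sketch: a fromObject-style guard that accepts both Java
// arrays (primitive or reference) and java.util.Collection values.
public class ArrayGuard {

    public static boolean needsArrayEncoding(Object value) {
        return value != null
            && (value.getClass().isArray() || value instanceof Collection);
    }
}
```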
This looks good.
Commit the fixes and this is ready for merge.
I will probably at some point revisit the decision of using byte[] as the internal data format; the getBytes/fromBytes shuffling is pretty ugly, but there is no way around it in the current API.
Done. I added a few more tests for the changes.
Merged, thanks!
Hey - thanks again for this library.
This branch implements conversion of n-dimensional arrays going into and out of the database, with the help of some modified parsing code ripped out of pgjdbc.
I don't know whether this is a feature you're interested in incorporating, however either way I'd very much appreciate some feedback on how best to integrate it, either as a feature or a separate library.
High level stuff:

- There's an object used for both insertion and retrieval, PgArray, which is given either an array (for insertion), which it converts to a string using a pretty simple and predictable couple of methods, or a string (for parsing) and an OID, from which it returns an Object[] (arbitrarily nested), where the leaves are the result of delegating the element parsing to DataConverter.
- That's the main sticking point as far as integration goes - the array converter's constructor accepts a DataConverter (which is constructed vanilla in the tests, just to get the thing working), but ideally we want it to have access to the pool's converter, so it can parse arrays of types which are themselves converted by user code.
- I wrote this for use from Clojure/postgres.async, but touching DataConverter seemed like the wrong thing to do, and I think I may be missing something as far as how, in Java, it would be practical to have the .get which happens during the CLJ map conversion correctly invoke ArrayConverter. Obviously, if this feature ends up in your library, it would be possible to add more clauses to the switch in toObject.

Anyway, I'm super open-minded about making changes, etc. - so please let me know if you can imagine any problems.
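The insertion direction described above (array in, string out) can be sketched roughly like this. This is a simplified, hypothetical version - it skips the quoting/escaping that string elements would need - but it shows the "simple and predictable" recursive encoding:

```java
import java.lang.reflect.Array;

// Hypothetical sketch: encode an arbitrarily nested Java array (primitive
// or reference) into a Postgres array literal such as "{{1,2},{3,4}}".
public class ArrayLiteralEncoder {

    public static String encode(Object array) {
        StringBuilder sb = new StringBuilder("{");
        int length = Array.getLength(array);
        for (int i = 0; i < length; i++) {
            if (i > 0) sb.append(',');
            Object element = Array.get(array, i); // boxes primitives automatically
            if (element == null) {
                sb.append("NULL");
            } else if (element.getClass().isArray()) {
                sb.append(encode(element)); // recurse into the nested dimension
            } else {
                sb.append(element);
            }
        }
        return sb.append('}').toString();
    }
}
```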
Todo: It'll be a quick change, but I think strings are constructed in a few places without passing in a codec, which could cause problems. Will revisit tomorrow.
More notes:

The largest changes I made to the pgjdbc utilities are as follows:

- Parsed arrays are returned as Object[]. Without having a mapping from OIDs to converter return types (including types constructed by custom converters), I think the only other approach would be to defer array creation until we can reflect the return type. I'm open-minded about this.
- pgjdbc version-checks the server before emitting NULL as an array element. I removed the version checking, so the NULL will go to the DB regardless of its version. It works fine both in and out post-8.3.
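The "defer array creation until we can reflect the return type" alternative would presumably start from something like this hypothetical helper, which recovers the dimension count and element type from a requested array class:

```java
// Hypothetical sketch: reflect over a requested array class such as
// Long[][].class to recover its number of dimensions and its leaf element
// type, as a starting point for building a typed result up front instead
// of a generic Object[].
public class ArrayTypeInfo {

    public static int dimensions(Class<?> arrayType) {
        int depth = 0;
        while (arrayType.isArray()) {
            depth++;
            arrayType = arrayType.getComponentType(); // peel one dimension
        }
        return depth;
    }

    public static Class<?> elementType(Class<?> arrayType) {
        while (arrayType.isArray()) {
            arrayType = arrayType.getComponentType();
        }
        return arrayType;
    }
}
```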