spartanz / schemaz

A purely-functional library for defining type-safe schemas for algebraic data types, providing free generators, SQL queries, JSON codecs, binary codecs, and migrations from this schema definition
https://spartanz.github.io/schemaz
Apache License 2.0

Added polymorphic definition for Schema #16

Closed · insdami closed this pull request 5 years ago

insdami commented 6 years ago

closes #7

vil1 commented 6 years ago

That looks correct. I'd like this to have a test/example showing how it is used, though, which means waiting for #15 to be merged.

vil1 commented 6 years ago

#15 has been merged. I suppose it's enough to modify the existing example to showcase this new way of creating schema values.

insdami commented 6 years ago

When you say schema values, do you mean personSchema? So I can use anything from any given SchemaModule to create a schema, using record for instance, right?

My understanding is that a SchemaModule defines three things: Prim[A], SumTermId and ProductTermId. If I were to replace the existing essentialField call, how could I infer, using only the given SchemaModule, that its ProductTermId is String in this case?
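
Roughly, the shape being discussed (a sketch only, not the library's actual code): a SchemaModule fixes the primitive algebra and the identifier types, and exposes a Schema type built on top of them.

trait SchemaModule {
  type Prim[A]        // the primitive schemas this module knows about
  type SumTermId      // identifiers for the branches of a sum
  type ProductTermId  // identifiers for the fields of a record
  type Schema[A]      // schema values built from the pieces above
}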

vil1 commented 6 years ago

In the existing test there is a jsonModule that implements SchemaModule with JSON primitives and String for product and sum identifiers.

So I think you can indeed define personSchema with something along the lines of


object PersonSchema extends Schema[Person] {
  def apply(m: SchemaModule): m.Schema[Person] = {
    import m._
    record[Person](...)
  }
}

// inside the test
val personSchema = PersonSchema(jsonModule) 
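
The point of this encoding is that the same definition can then be reused against any other SchemaModule. A minimal sketch, assuming some other module exists (the xmlModule name below is hypothetical):

// the same schema definition, interpreted by a different module
val personSchemaXml = PersonSchema(xmlModule)
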
vil1 commented 6 years ago

So @jdegoes, I think we've reached a blocking point here.

The problem is basically that from within the apply method's body, we don't know anything about Prim[A], ProductTermId, and SumTermId, and we don't have any means of creating values of these types.

Maybe we can partially solve this problem using the Aux trick and typeclasses:

object SchemaModule {
  type Aux[P[_], R, U] = SchemaModule { type Prim[A] = P[A]; type ProductTermId = R; type SumTermId = U }
}

trait Schema[A] {
  def apply[P[_], R, U](module: SchemaModule.Aux[P, R, U])(implicit P: TC1[P[A]], R: TC2[R], U: TC3[U]): module.Schema[A] 
}

Where TC1, TC2 and TC3 would provide the necessary functionality to build values of types P[A], R and U respectively. But honestly, I'm not sure this is doable (those TCs would need an additional type parameter to say which type these values can be built from), nor convenient, nor the path we want to explore.
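
Purely as an illustration of that "additional type parameter" point (CanBuild and buildStringId below are hypothetical names, not part of the library), such a typeclass could look like:

trait CanBuild[From, To] {
  def build(from: From): To
}

// e.g. for the jsonModule, where ProductTermId = String, building an
// identifier from a plain String is just the identity
val buildStringId: CanBuild[String, String] =
  new CanBuild[String, String] { def build(from: String): String = from }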

Therefore, I think we need a little help ^^

jdegoes commented 6 years ago

In the original ticket I wrote:

If now we had a way to abstract over the primitives required by a schema, it would then be possible to have polymorphic schemas (across, e.g., Scala, JSON, XML, etc.).

However, I didn't have a good idea how to do it.

It's possible we could parameterize SchemaModule by constructors and deconstructors, something like:

trait SchemaModule[PrimConstruct[_[_]], PrimDeconstruct[_[_]]] {
  type Prim[A]
  implicit def primConstruct: PrimConstruct[Prim]
  implicit def primDeconstruct: PrimDeconstruct[Prim]
}

Now you can define polymorphic schemas while still requiring some ability to construct and deconstruct values.

In the base case, maybe something like:

trait ScalaConstruct[+Prim[_]] {
  def int(v: Int): Prim[Int]
  def bool(v: Boolean): Prim[Boolean]
  ...
}
trait ScalaDeconstruct[-Prim[_]] {
  def fold[Z, A](prim: Prim[A])(int: Int => Z, bool: Boolean => Z, ...): Z
}

Then you modify Schema to pass along the constraints required for construction and deconstruction.
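
As a sketch only (the exact encoding was not settled in this thread, and it assumes SchemaModule still exposes its Schema[A] member), passing the constraints along could look like:

trait Schema[A] {
  // only modules providing the Scala construct/deconstruct capabilities
  // sketched above can interpret this schema
  def apply(m: SchemaModule[ScalaConstruct, ScalaDeconstruct]): m.Schema[A]
  // inside an implementation, m.primConstruct.int / m.primConstruct.bool
  // build Prim values without knowing the concrete Prim[_]
}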

The Deconstruct type classes could also include a fallback case, like "is not one of the above", to allow more compatibility. Sometimes you can deal with failure.

The "loosest coupling" deconstructors would look like this:

trait ScalaDeconstruct[-Prim[_]] {
  def toInt[A](prim: Prim[A]): Option[Int]
  def toBoolean[A](prim: Prim[A]): Option[Boolean]
  ...
}

This way you can do things like ScalaDeconstruct[Prim] with JsonDeconstruct[Prim] to compose over multiple deconstructor types. Unfortunately it does mean deconstruction can always fail, which will propagate up into whatever code uses these things. Yet it will be hard to avoid failure anyway, so maybe it's not a big problem.
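
A hedged sketch of that composition (JsonDeconstruct is only mentioned by name in this thread, so its toJsonString method below is hypothetical), showing how the Option-based API lets failure propagate to the caller:

// hypothetical shape for JsonDeconstruct, mirroring ScalaDeconstruct above
trait JsonDeconstruct[-Prim[_]] {
  def toJsonString[A](prim: Prim[A]): Option[String]
}

// requiring both capabilities at once
def render[Prim[_], A](prim: Prim[A])(
  implicit D: ScalaDeconstruct[Prim] with JsonDeconstruct[Prim]
): Option[String] =
  D.toInt(prim).map(_.toString)                // try the Int view first
    .orElse(D.toBoolean(prim).map(_.toString)) // then the Boolean view
    .orElse(D.toJsonString(prim))              // finally fall back to JSON
// a None result simply means none of the known views applies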

vil1 commented 5 years ago

Recent changes made this PR obsolete.

@insdami thank you for your time and energy, we hope to see you again soon :)