minedeljkovic opened 8 years ago
Hi @minedeljkovic, thanks for chiming in.
tcomb-validation - this seems already pretty much covered by this repo
Yep, given that a type in flow-runtime is defined by a validate function, we already have pretty much the core of tcomb + tcomb-validation.
tcomb's fromJSON - this doesn't seem to be much work to just port to flow-runtime
I'm working on it right now (also suggested by @ivan-kleshnin on gitter). The progress is tracked by this PR: https://github.com/gcanti/flow-runtime/pull/9 It's not so easy because of the types; I'll post a proposal for feedback when I get something that barely works.
tcomb-form - this one is more serious beast :)
I agree, I'm afraid it will be a pain :|
Speaking of runtime type introspection, I'd love to hear your feedback on how to model the runtime types. Now a type is a simple POJO owning a name and a validate field. But then we have several subtypes with additional fields: for example an ArrayType owns an additional type field. The problem comes when I need to refine a generic type in order to do runtime type introspection. Say you have a generic function
function f<T>(type: Type<T>): void {
  // here I want to refine type
}
It's not clear how I can refine type without going mad or, even worse, requiring an unsafe cast:
function f<T>(type: Type<T>): void {
  if (type.kind === 'object') {
    // ObjectType
  }
}
throws "property
kind. Property not found in object type
". Then
function f<T>(type: Type<T>): void {
  if (typeof type.kind !== 'undefined') {
    const kind = type.kind
    if (kind === 'object') {
      const props = type.props // throws `property `props`. Property not found in object type`, etc...
      // even if I add here typeof type.props !== 'undefined'
      // it's unsafe to state that type is an ObjectType...
    }
  }
}
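For reference, the unsafe-cast escape hatch mentioned above would look roughly like this (an illustrative sketch only; `ObjectType` is the refined subtype named in the comments above):

// forcing the refinement through `any`: nothing actually guarantees this is an ObjectType
function f<T>(type: Type<T>): void {
  const anyType: any = type
  if (anyType.kind === 'object') {
    const objectType: ObjectType<T> = anyType
    const props = objectType.props
  }
}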
Would it be a bad idea to convert the POJOs into classes in order to use instanceof? Any better ideas?
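For concreteness, this is roughly what the instanceof-based refinement would buy (a sketch only: it assumes Type itself becomes a class, as shown a few comments below, and invents an ObjectType subclass with a props field):

// hypothetical class-based subtype, for illustration only
class ObjectType<T> extends Type<T> {
  props: { [key: string]: Type<any> };
  constructor(name: string, validate: Validation<T>, props: { [key: string]: Type<any> }) {
    super(name, validate)
    this.props = props
  }
}

function f<T>(type: Type<T>): void {
  if (type instanceof ObjectType) {
    // Flow refines `type` to ObjectType<T> inside this branch, so `props` is accessible
    const props = type.props
  }
}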
Hacking quickly :) Would something like this (flow try) help?
Edit: I obviously incorrectly used CommonType
Would it be a bad idea to convert the POJOs into classes in order to use instanceof?
It will simplify some things for you as a library author but complicate many things for library users. For example, a value wrapped in a class can no longer be passed to a (NoSQL) database driver freely (in the general case). Every write operation would then have to be preceded by a manual cast.
I would prefer to stay with structural typing if it's possible. To my regret, I'm only starting with Flow so I don't really have enough tech competence for implementation-specific advice.
Still, I believe nothing is exactly safe in JS (instanceof does not guarantee props weren't removed or replaced with something different, weren't made write-only, etc., etc.), so duck typing is an okayish solution.
@minedeljkovic adding CommonType to the union seems to confuse Flow (try)
@ivan-kleshnin I'm talking about how to represent the types; values will remain untouched. I mean
export class Type<T> {
  name: string;
  validate: Validation<T>;
  constructor(name: string, validate: Validation<T>) {
    this.name = name
    this.validate = validate
  }
}
instead of
export type Type<T> = {
  name: string;
  validate: Validation<T>;
};
validate is a function, so runtime types seem already hard to serialise
@gcanti I understood CommonType (previously Type) was just a common interface all concrete types (Literal, Irreducible, ObjectType...) should implement, and not a type itself. Given that, it should not be part of the union. Did I get it wrong?
Yes, it's a possibility. But I'm concerned by that error; Flow is quite buggy when mixing unions and intersections.
I'll try to convert the types to classes in another branch and see if there are benefits.
Anyway I would prefer to keep the current representation as well, I'll try to find a way to fix the problem with the union.
I believe this concrete union would be discriminated over the kind property, and in my experience disjoint unions are pretty stable in Flow. Also, the new empty type works nicely with this, so any matching on kind would be easily checked for exhaustiveness. I think that is a bonus.
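A rough sketch of what that kind-discriminated union could look like (the type names and fields here are illustrative, not the library's actual definitions):

// hypothetical disjoint union discriminated on `kind`, for illustration only
type IrreducibleType<T> = { kind: 'irreducible', name: string, validate: Validation<T> };
type ObjectType<T> = { kind: 'object', name: string, validate: Validation<T>, props: { [key: string]: RuntimeType<any> } };
type RuntimeType<T> = IrreducibleType<T> | ObjectType<T>;

function f<T>(type: RuntimeType<T>): void {
  switch (type.kind) {
    case 'irreducible':
      return
    case 'object':
      // Flow narrows `type` to ObjectType<T> here, so `type.props` type-checks
      return
    default:
      // exhaustiveness check: adding a new kind to the union turns this into a Flow error
      (type: empty)
  }
}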
To give some context, I'm trying to model the new fromJSON as a concatenation of a validation and a transformation that could fail:
import * as t from '../src/index'

// a transformation can fail
export type Transformation<A, B> = (a: A, c: Context) => ValidationResult<B>;

// helper
export function transformWithContext<A, B>(value: mixed, context: Context, type: Type<A>, transformation: Transformation<A, B>): ValidationResult<B> {
  return t.chain(t.validate(value, type), a => transformation(a, context))
}

// real API
export function transform<A, B>(value: mixed, type: Type<A>, transformation: Transformation<A, B>): ValidationResult<B> {
  return transformWithContext(value, t.getDefaultContext(type), type, transformation)
}

interface Transformer<A, B> extends Type<A> {
  transformation: Transformation<A, B>
}

// this function is the problem
declare function getTransformation<A, B>(type: Type<A>): Transformation<A, B>;
The rationale is this: say you want to deserialise a date from a string. First you must ensure that the value that comes from the server is a string
// return right(jsonValue) if jsonValue is, for example, '2016-10-16T08:42:54.305Z'
t.validate(jsonValue, t.string)
if successful, you pass the Right to the transformation, which in turn can fail, for example if jsonValue = 'blah'
// now you know that s is a string but the parsing can still fail
(s: string, c) => {
  const d = new Date(s)
  if (isNaN(d.getTime())) {
    return t.failure(s, c)
  }
  return t.success(d)
}
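Put together, the usage would look roughly like this (a sketch based on the transform signature above; parseDate and jsonValue are just illustrative names):

// assembling the pieces above; names are illustrative
const parseDate: Transformation<string, Date> = (s, c) => {
  const d = new Date(s)
  return isNaN(d.getTime()) ? t.failure(s, c) : t.success(d)
}
// first validates that jsonValue is a string, then tries to parse it as a Date
const result: ValidationResult<Date> = transform(jsonValue, t.string, parseDate)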
It's worth noting that the signature of validate, (value: mixed, context: Context) => ValidationResult<T>, could theoretically allow for transformations while validating, but yesterday after a spike I found the following problems
Sorry for spamming but on my way home I had a clear vision: my model is plain wrong. The "failure part" must be completely absorbed by the first step, the validation. Using the example above
// bad
// t.validate(v, t.string)
// good
const jsonDate = t.refinement(t.string, s => !isNaN(new Date(s).getTime()), 'jsonDate')
t.validate(v, jsonDate)
and then apply a simple map
const fromJson = v => t.map(t.validate(v, jsonDate), s => new Date(s))
console.log(fromJson(1)) // Left
console.log(fromJson('blah')) // Left
console.log(fromJson('2016-10-16T08:42:54.305Z')) // Right
This is a simple case, but we want to handle arrays, tuples, objects, etc. So, given a type T = Type<A>, the real problem here is that somehow we want to build a generic mapping from A to B while keeping the structure of A and leveraging the runtime type information contained in T.
I think that I'll take a pause from the keyboard and go learn about lenses (and optics in general), perhaps it can be helpful.
I'll try to convert the types to classes in another branch and see if there are benefits.
I put up a branch (types-as-classes), no crucial benefits so far.
It's worth noting that the signature of validate, (value: mixed, context: Context) => ValidationResult<T>, could theoretically allow for transformations while validating
What if we allow for such transformations and the "merger" is controlled by the user?
In order to handle an intersection of n types, one option would be to feed the i-th validate function with the Right of the (i-1)-th validation.
Does it make sense?
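A rough sketch of that chaining (hedged: intersectionValidate is an invented helper name, and the t.chain argument order follows the validate-branch examples below):

// thread each Right into the next type's validate, for illustration only
function intersectionValidate(types) {
  return (v, c) => types.reduce(
    (result, type) => t.chain(a => type.validate(a, c), result),
    t.success(v)
  )
}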
Ok, here you can find an implementation of the idea above: branch validate
Seems to work pretty well, you can do
import type { Type } from 'flow-runtime'
import * as t from 'flow-runtime'

// define a custom date type
const jsonDate: Type<Date> = {
  name: 'jsonDate',
  validate: (v, c) => t.chain(s => {
    const d = new Date(s)
    if (isNaN(d.getTime())) {
      return t.failure(s, c)
    }
    return t.success(d)
  }, t.string.validate(v, c))
}
console.log(t.validate(1, jsonDate)) // Left
console.log(t.validate('bad', jsonDate)) // Left
console.log(t.validate('1973-11-30T08:42:54.305Z', jsonDate)) // Right<Date>
// use the custom type in a product type
const Person = t.object({
  name: t.string,
  birthDate: jsonDate
})
const validation = t.validate({
  name: 'Giulio',
  birthDate: '1973-11-30T08:42:54.305Z'
}, Person)
t.map(person => console.log(person.birthDate instanceof Date), validation) // => true
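And a failure in any field propagates through the product type, following the same pattern as the jsonDate examples above:

// birthDate fails the jsonDate check, so the whole validation is a Left
console.log(t.validate({ name: 'Giulio', birthDate: 'blah' }, Person)) // Left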
@gcanti is it still possible in this scheme to accept 1 and 0 as valid (unparsed) booleans? – To validate and convert, say, a query string to the properly typed values true and false.
@ivan-kleshnin not sure, do you mean something like this?
import qs from 'querystring'

const querystringBoolean: Type<boolean> = {
  name: 'querystringBoolean',
  validate: (v, c) => t.chain(s => {
    if (s === '0' || s === '1') {
      return t.success(s === '1')
    }
    return t.failure(s, c)
  }, t.string.validate(v, c))
}

const Qs = t.object({
  a: querystringBoolean,
  b: t.string,
  c: querystringBoolean
})

const obj = qs.parse('a=1&b=hello&c=0')
t.map(qs => console.log(qs), t.validate(obj, Qs)) // => {a: true, b: "hello", c: false}
@gcanti yes, exactly.
The thing I like the most about this implementation is that it really honours the signature of validate (no more weird "validate MUST return value if validation succeeded" comment); you are free to define your types and (optionally) do a transformation.
...and in the above, parse is a user-defined Qs.prototype.parse, I guess?
@ivan-kleshnin in my example it comes from the 'querystring' package
Right. I reread the entire thread and I think I get the whole picture now. I'm going to practice with this branch on weekends.
or should it be usable as is with some sort of tcomb <-> flow-runtime integration
@minedeljkovic something like this should work (and the additional computation should be negligible)
import t from 'tcomb-form'
import * as io from 'flow-io'

function toTcomb(ty) {
  if (ty instanceof io.ObjectType) {
    const props = {}
    for (const k in ty.props) {
      props[k] = toTcomb(ty.props[k])
    }
    return t.struct(props)
  }
  if (ty instanceof io.ArrayType) {
    return t.list(toTcomb(ty.type))
  }
  if (ty instanceof io.RefinementType) {
    return t.refinement(toTcomb(ty.type), ty.predicate)
  }
  if (ty === io.string) {
    return t.String
  }
  if (ty === io.number) {
    return t.Number
  }
  if (ty === io.boolean) {
    return t.Boolean
  }
  if (ty instanceof io.UnionType) {
    if (ty.types.every(t => t instanceof io.LiteralType)) {
      return t.enums.of(ty.types.map((t: any) => t.value))
    }
    return t.union(ty.types.map(toTcomb))
  }
  throw new Error(`Unsupported type ${ty.name}`)
}
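For example (a usage sketch: it assumes io.object, io.string and io.number constructors analogous to the t.object / t.string used earlier in this thread, plus tcomb-form's t.form.Form component):

// illustrative only: build a flow-io type, convert it, feed it to tcomb-form
const IoPerson = io.object({
  name: io.string,
  age: io.number
})
const TcombPerson = toTcomb(IoPerson) // roughly t.struct({ name: t.String, age: t.Number })
// <t.form.Form type={TcombPerson} /> can then render a form for it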
Thanks, @gcanti! That's exactly what I was thinking of when talking about tcomb <-> flow-runtime integration.
In the meantime, I discovered what a great job you did with the tcomb flow defs [here](https://github.com/gcanti/pantarei/blob/master/tcomb/3.x.x-0.33.x/tcomb.js) (I must confess I am still wrapping my head around the $ObjMap trickery, but I am catching up slowly :) ). It provides some great things for the project I am using tcomb on! Two examples:
fromJSON over my reified api type. This is where I was losing type checking, but since you improved the flow defs, not anymore :)
I would be very grateful if you could quickly summarize the core advantages of flow-io (ex flow-runtime) over tcomb with these awesome flow defs (aside from the much cleaner validation API, of course). As far as I can follow, the Flow techniques you are using in these two are pretty much the same.
In the meantime, I discovered what a great job you did with tcomb flow defs
@minedeljkovic Ah, thanks for the feedback! I'm glad it's helpful, I've "lost" several hours on that libdef! :)
what are the core advantages of flow-io over tcomb
tcomb is a 2-year-old project and was first conceived 4-5 years ago. It's stable and mature (I don't plan to add anything else). During these 2 years, though, I accumulated some remarks on its design decisions which I'd like to fix, namely
If you are thinking: "but you are always iterating on the same things!" well.. you are right :) it's a kind of "obsession" in order to get the ultimate goal: having a single source of truth, the types
I must confess, I have never accomplished this in any other technology!
^ for instance this makes me really happy
Then I tried with babel-plugin-tcomb, but it is too hacky.
My intention with flow-io was to consolidate all my experiments in a single library with an ambitious goal: to build a bridge between runtime and static types through "static type extraction" (the TypeOf operator).
Alas, this issue https://github.com/gcanti/flow-io/issues/11 seems to be a show stopper, though I still think that flow-io is a good library.
Another difference is that tcomb is free to implement whatever we think is useful in a dynamic world, while flow-io will be strictly compatible with Flow, even if this means to reject some useful (but not typeable) features.
it's a kind of "obsession" in order to get the ultimate goal: having a single source of truth, the types
This is also the reason why I'm studying full time functional programming, category theory, PureScript and now Scala. I'd love to write programs in a functional programming language with an advanced type system.
First, I have to say I really like this approach to bringing runtime and static types closer!
I can only agree with @gcanti on what he stated here: converting a runtime type to a static type is more reliable than the other way around. Runtime type construction is more flexible and rich with options (runtime refinement is one obvious example), so it seems that converting from that source to more rigid static types is the only way for the conversion to be lossless.
On our project we are using babel-plugin-tcomb for runtime introspection, and we already have to use workarounds for some scenarios that I believe would not be needed with flow-runtime.
@gcanti, I would greatly appreciate hearing from you what your plans or vision are regarding rich runtime introspection implementations like the ones in the tcomb ecosystem, i.e.: