Closed. yrashk closed this issue 8 years ago.
Hi @yrashk,
I'm not fond of default values:
1) I prefer explicit rather than implicit. I want tcomb to be inflexible and to throw at me as much as it can
2) default values can be handled at the value level, i.e. before passing to struct constructors:
console.log(new Person(Object.assign({}, Person.defaultValue)))
It should be quite easy to write a general function handling default values, see this and this
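Such a helper is not part of tcomb; a minimal sketch of the value-level approach (names assumed, plain function standing in for a struct constructor) might be:

```javascript
// Hypothetical helper (not tcomb API): fill in defaults at the value level,
// before the value reaches the struct constructor.
function withDefaults(Type, defaults) {
  return (value) => Type(Object.assign({}, defaults, value));
}

// Usage sketch, with an identity function standing in for a tcomb struct:
const Person = (v) => v;
const makePerson = withDefaults(Person, { tags: [] });
```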
So at the moment my position on the subject is "I won't add default values to the core", but I'm open to discussions.
I'm not acquainted with reactuate so forgive me if the following considerations are not accurate. AFAIK you can bypass the (unfortunate) redux plain object check nesting the tcomb type
import t from 'tcomb'
import { createStore } from 'redux'
const FSAError = t.maybe(t.refinement(t.Boolean, (n) => n === true, 'FSAError'))
const actions = {} // store the union of actions
function getActionCreator(payloadType, metaType = t.Any) {
const type = payloadType.meta.name // using the payloadType name as type constant
actions[type] = payloadType // store the type in the union
return (payload, error, meta) => {
return {
type: type,
payload: payloadType(payload), // type-check payload
error: FSAError(error), // type-check error
meta: metaType(meta) // type-check meta
}
}
}
// automatically create the reducer
function getReducer(initialState) {
return (state = initialState, action) => {
if (action.payload && actions.hasOwnProperty(action.type)) {
return action.payload.update(state)
}
return state
}
}
const store = createStore(getReducer({ counter: 0 }))
// define actions and actions creators
const Increment = t.struct({ increment: t.Number }, 'Increment')
Increment.prototype.update = function(state) {
return t.update(state, { counter: { $set: state.counter + this.increment } })
}
const increment = getActionCreator(Increment)
// go
console.log(store.getState())
store.dispatch(increment({ increment: 2 })) // no type required!
console.log(store.getState())
This is along the lines of redux-tcomb
I understand your point. I am just trying to reduce the amount of ceremony and confusion, while retaining the awesome functionality offered by tcomb.
In the above example, you're doing this:
const Increment = t.struct({ increment: t.Number }, 'Increment')
const increment = getActionCreator(Increment)
Which creates two entities for one action. This nearly doubles the amount of code one needs to write, and the more code, the more error-prone any application is. For example, I write my reducers using tcomb's match, and I always need to remember to use Increment in the match pattern instead of increment (I'd rather have no ambiguity). The reason why I am matching instead of adding an update function like in your example is that I believe the action/action creator should not know anything about the state.
As a matter of fact, while writing this response, I just decided to build my own ActionCreator type in order to avoid hijacking t.struct. Perhaps this is a much better solution!
https://github.com/reactuate/reactuate/commit/05a431a4de514c9a5b065d625d348d2820fd206a
I see. Writing a custom combinator is certainly a good thing when you are facing a specific problem like yours.
A couple of observations:
// this assert seems redundant due to `value = payload(value)` a few lines above
t(payload.is(value), function () {
return `Invalid value ${t.stringify(value)} supplied to ${actionString}(${ path.join('/')}) (expected ${payload.displayName})`;
})
...
if (typeof metaValue !== 'undefined') {
console.log(metaValue); // <= there's a leftover `console.log`
this.meta = metaValue
}
Thanks for the observations!
Hi @yrashk,
Not sure if this is still relevant for you, but I'm changing my mind. This is not a valid argument:
I prefer explicit rather than implicit. I want tcomb to be inflexible and to throw at me as much as it can
Being an optional and additional feature, nothing prevents me from simply not using default values, and they can be useful to other users like you.
So this is my analysis so far:
Description
Currently you can't define default props for structs:
const Person = t.struct({
name: t.String,
surname: t.maybe(t.String),
tags: t.list(t.String)
}, 'Person')
Person({ name: 'Giulio' }) // => throws [tcomb] Invalid value undefined supplied to Person/tags: Array<String> (expected an array of String)
As you suggested, we could define a defaultProps
setting:
const Person = t.struct({
name: t.String,
surname: t.maybe(t.String),
tags: t.list(t.String)
}, 'Person')
Person.meta.defaultProps = { tags: [] }
Person({ name: 'Giulio' }) // => { name: 'Giulio', tags: [] }
Requirements
1) the default props should be stored in the meta object
2) the consequences on other libraries must be taken into account
How to set the new setting?
Not sure. There are at least 2 options:
1) manually set Person.meta.defaultProps = {...}
const Person = t.struct({
name: t.String,
surname: t.String
}, 'Person')
Person.meta.defaultProps = { tags: [] }
2) (my preference) change the struct signature from (props, [name]) to (props, [name | options]), where options = { name: maybe(String), defaultProps: maybe(Object) }.
const Person = t.struct({
name: t.String,
surname: t.String
}, { name: 'Person', defaultProps: { tags: [] } })
3) other?
/cc @ahdinosaur (being author of https://github.com/ahdinosaur/tcomb-defaults)
keen! tcomb-defaults is a complete hack using a func type as a wrapper around the struct; it has issues when trying to subtype or use types within types, and then gives cryptic error messages, so having first-class support would be so much better.
regarding the options, will 2 also set meta.defaultProps as in 1, so that they both have the same effect? option 2 sounds good; i'm not sure of any better way, as adding an additional arity argument is a popular recipe for :spaghetti:.
extending core types meta seems like another possible approach... would any of the options below be acceptable?
const Person = t.struct({
  name: t.String('Tom'), // nice, but breaking
  lastName: t.StringWithDefault('Simpson'),
  age: t.withDefault(t.Number, 10)
}, 'Person')
or
const Name = t.refinement(String)
  .with((s) => s.length > 0 && s.length < 20)
  .with('defaultValue', "Tom");
const Person = t.struct({
name: Name,
age: t.Number
}, 'Person')
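Either shape could be approximated in userland today; a hypothetical sketch of the `withDefault` idea (names assumed, not tcomb API):

```javascript
// Hypothetical: wrap a type so that undefined input is replaced by a default.
function withDefault(Type, defaultValue) {
  const Wrapped = (v) => Type(v === undefined ? defaultValue : v);
  Wrapped.defaultValue = defaultValue; // expose for a struct constructor to read
  return Wrapped;
}

// Usage sketch, with an identity function standing in for t.Number:
const Age = withDefault((v) => v, 10);
```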
2) (my preference) change struct signature from (props, [name]) to (props, [name | options])
note it's already [name | options] due to the strict option:
const Point = t.struct({
x: t.Number,
y: t.Number
}, { name: 'Point', strict: true });
The meta.defaultProps idea is great, really looking forward to this.
What about defaults for non-structs? e.g. dicts or even irreducibles?
Regarding the proposed tcomb API for defaults, suppose we are starting with this Point type:
const Point = t.struct({
x: t.Number,
y: t.Number
}, 'Point');
With the options.defaultProps proposal, configuring defaults converts the tidy few lines above into the rather verbose chunk below:
const Point = t.struct({
x: t.Number,
y: t.Number
}, {
name: 'Point',
defaultProps: {
x: 0,
y: 0
}
})
An alternative API that reduces the verbosity a little would be to supply the defaults as a third argument. Let's call it option 3:
const Point = t.struct({
x: t.Number,
y: t.Number
}, 'Point', {
x: 0,
y: 0
})
The little 'Point' in the middle, between the definition and the defaults in option 3, is a little weird, I must admit.
Perhaps the type name could come last:
const Point = t.struct({
x: t.Number,
y: t.Number
}, {
x: 0,
y: 0
}, 'Point')
Alternatively it could support both of the above with "smart" argument handling logic like:
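The snippet itself is missing here; a hypothetical sketch of such "smart" argument sniffing (names assumed, not real tcomb code) could be:

```javascript
// Hypothetical: accept the type name and the defaults object in either order.
function smartStruct(props, a, b) {
  const name = typeof a === 'string' ? a : b;
  const defaultProps = (a !== null && typeof a === 'object') ? a : b;
  // A real implementation would delegate to t.struct(props, { name, defaultProps });
  // here we just return the parsed arguments to show the idea.
  return { props, name, defaultProps };
}
```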
Where the defaults and the type name string could be in either position. This may seem a little too "jquery" though.
Another alternative (call it option 4) is to take a similar approach to tcomb-defaults and use a wrapper function that takes a type and the default values to set, and returns a new type, e.g.
const Point = t.defaults(t.struct({
x: t.Number,
y: t.Number
}, 'Point'), {
x: 0,
y: 0
})
Option 4 seems simplest to me and steps around all the above API questions. In addition it grants added flexibility allowing users to compose types with the same structure yet with different default values.
Hi @Srdjan @timoxley, thanks for chiming in.
What about defaults for non-structs
Actually I'm not even sure it's doable for structs :)
I mean, I'm willing to deal with this problem, since my previous position on the subject was admittedly weak, and I have thought about this issue for a few days, but I'm not yet sure we can come up with a clean solution.
Here are my doubts / findings:
isMissing
If we speak about default props, we are also speaking about a function, let's call it isMissing, which tells us whether a default prop must be applied:
isMissing: (Object, String, Type) -> Boolean
but how is it implemented?
isMissing({}, 'name', t.String) = true
isMissing({ name: undefined }, 'name', t.String) = ?
isMissing({ name: null }, 'name', t.String) = ?
isMissing({ name: undefined }, 'name', t.maybe(t.String)) = ?
isMissing({ name: null }, 'name', t.maybe(t.String)) = ?
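One consistent set of answers, under the === undefined convention that comes up later in the thread, would be (a sketch, not tcomb code):

```javascript
// Sketch: a prop is "missing" only when it is absent or explicitly undefined;
// null counts as present, and maybe types do not change the answer.
function isMissing(obj, prop /*, type */) {
  return obj[prop] === undefined;
}
```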
API
I think that it's safer to define the default props when you define the type. Given that the signature is already:
struct(props, options)
I'd stick with that, even if slightly verbose:
const Point = t.struct({
x: t.Number,
y: t.Number
}, {
name: 'Point',
strict: true,
defaultProps: {
x: 0,
y: 0
}
})
tcomb-validation
tcomb-validation should account for this change, so:
const Point = t.struct({
x: t.Number,
y: t.Number
})
validate({}, Point) // => KO
const Point = t.struct({
x: t.Number,
y: t.Number
}, {
defaultProps: {
x: 0,
y: 0
}
})
validate({}, Point) // => ok!
That means that tcomb should export at least a new API that lets a user get the defaults applied (the isMissing function above, or an applyDefaultProps(value, type)?).
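A sketch of what such an applyDefaultProps(value, type) could look like (assumed semantics: a prop is filled in only when it is undefined):

```javascript
// Hypothetical applyDefaultProps: fill in undefined props from meta.defaultProps.
function applyDefaultProps(value, type) {
  const defaults = (type.meta && type.meta.defaultProps) || {};
  const out = Object.assign({}, value);
  for (const key of Object.keys(defaults)) {
    if (out[key] === undefined) out[key] = defaults[key];
  }
  return out;
}
```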
Other types
It's not clear to me what the benefits of extending default values to other types would be.
Moreover, these days @R3D4C73D and I are working on a new module, isSubsetOf Spec, useful to determine whether it's valid to assign a value of type A to a variable of type B. Extending default values to other types seems a nightmare for that module...
Defaults do create a bunch of questions, it's true.
how is isMissing implemented?
It might make sense for tcomb to just use tcomb.isNil, but IMO isMissing should probably just be === undefined. This is how defaults are applied in ES6+. Another alternative would be to set the default any time a property is invalid. Perhaps it makes most sense for isMissing to be user-supplied, with either isNil or === undefined as the default. Now, does this apply globally or on a per-type basis? It should probably work like strict handling… where you can override a global value tcomb.struct.isMissing or override per-type.
Thinking about this… makes me realise that the current global overridability of strict is kind of broken… changing the global tcomb.struct.strict handling will maybe affect your dependencies' global strict handling; it totally depends on whether npm install (and/or your frontend build tool) decided the package gets its own tcomb or a tcomb shared with other dependencies…
Anyway…
I'm not sure if tcomb-validation taking defaults into account (e.g. validate({}, Point) // => ok!) is a good idea. If tcomb-validation determines an object to be a valid struct, then it would imply that the object has all of the struct's fields set and valid; otherwise, what is the point of validating the object?
e.g.
const a = {}
validate(a, Point) // ok
// we know Point has an x field, however:
a.x // undefined i.e. wtf
A simple model for defaults would be to have them apply only when invoking the constructor. That simplifies the validate concern: it should only look at the current state of the object. If a user wants defaults applied they should validate(new Point(point)).
That means that tcomb should export at least a new API that allows a user getting the defaults applied
This could just be the constructor.
However, there's also a question of how defaults work with the immutable helpers. I'd expect defaults to apply.
const point2 = Point.update(point, {
  x: { $set: undefined }
});
point2.x // 0
Actually, thinking about the t.maybe type, I'd like to reverse my position on isMissing being configurable or === undefined: isMissing should use the same logic as t.maybe.
Thinking about this… makes me realise that the current global overridability of strict is kind of broken
@timoxley Good point. The global strict setting should not be used.
@timoxley The problem with tcomb-validation is its semantics: when you call validate(value, type) it answers the question "can value become an instance of type?" or, in other words, "can I safely call the constructor type(value) without getting type errors?". It even returns an instance of type if the validation succeeded (ValidationResult). Keeping that semantics means it must account for default values.
If a user wants defaults applied they should validate(new Point(point))
It would throw if point is not valid, defeating the point of validating.
However, there's also a question of how defaults work with the immutable helpers. I'd expect defaults to apply
update internally calls the constructor, so this should be ok.
IMO isMissing should probably just be === undefined. This is how defaults are applied in ES6+
I agree, this seems the more sensible choice.
Last topic (hopefully): the extend function should handle the new defaultProps option:
var A1 = t.struct({ a: t.String }, { defaultProps: { a: 'default-a' } });
var B1 = t.struct({ b: t.String }, { defaultProps: { b: 'default-b' } });
var C1 = t.struct.extend([A1, B1, { z: t.Number }], { defaultProps: { z: 1 } });
assert.deepEqual(C1.meta.defaultProps, {
a: 'default-a',
b: 'default-b',
z: 1
});
assert.deepEqual(C1({}), {
a: 'default-a',
b: 'default-b',
z: 1
});
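The defaultProps merge performed by extend could be sketched as follows (assumed semantics, matching the example above: mixins merged left to right, with the options' defaultProps winning):

```javascript
// Sketch: collect defaultProps from each mixin's meta, then from options.
function mergeDefaultProps(mixins, options) {
  const merged = {};
  for (const mixin of mixins) {
    Object.assign(merged, (mixin.meta && mixin.meta.defaultProps) || {});
  }
  return Object.assign(merged, (options && options.defaultProps) || {});
}
```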
Candidate implementation: https://github.com/gcanti/tcomb/pull/211
I am wondering if you would consider adding a default value to struct constructors, something like this:
My primary motivation at this moment is that I use tcomb to define redux actions and action creators, and I have to do some type-copying black magic to be able to create
{type: 'NAME', ...}
actions when invoking actionCreator() as opposed to actionCreator({type: 'actionCreator'}), and I am really not fond of that black magic approach.
P.S. I handle the issue of tcomb types' serializability in a middleware and my own reducer wrapper (just in case you're wondering)