Closed: mb21 closed this issue 1 month ago
We shallow merge objects passed as arguments, so not all properties need to be present. It seems odder these days, but it was inspired by React class components' `setState`.
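To illustrate the shallow-merge semantics, here is a minimal plain-object sketch (a simulation for illustration only, not the actual solid-js implementation):

```typescript
// Top-level keys of the update overwrite the store's keys; keys absent
// from the update are left untouched. Nested objects are replaced
// wholesale rather than merged recursively.
type State = { user: { name: string; age: number }; theme: string };

function shallowMerge(state: State, update: Partial<State>): State {
  return { ...state, ...update };
}

const state: State = { user: { name: "Ada", age: 36 }, theme: "dark" };
const next = shallowMerge(state, { theme: "light" });

console.log(next.theme);     // "light"
console.log(next.user.name); // "Ada" (untouched)
```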
Okay... but why do the docs say:

> Objects are always shallowly merged. Set values to `undefined` to delete them from the Store. In TypeScript, you can delete a value by using a non-null assertion, like `undefined!`.

(emphasis added)
Are you sure the TypeScript definitions are intended to be this way? I can easily write a program this way that crashes at runtime even though it type-checks.
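For instance, a minimal reconstruction of the failure mode (simulated with a hypothetical `setAtIndex` helper rather than the real solid-js setter):

```typescript
type Message = { role: string; content: string };

// Mimics the store's forgiving behavior: the partial update is shallowly
// merged into whatever sits at index `i`; if nothing is there, the
// incomplete object is inserted as-is.
function setAtIndex(arr: Message[], i: number, v: Partial<Message>): void {
  arr[i] = { ...arr[i], ...v } as Message;
}

const messages: Message[] = [];
setAtIndex(messages, 0, { content: "hi" }); // type-checks fine
// messages[0] is typed Message, yet `role` is actually missing:
console.log(messages[0].role); // undefined
// messages[0].role.toUpperCase() would throw a TypeError at runtime
```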
It's one of the reasons I want to move away from this API as the default over time, but I don't see how TS could check this properly. What is written there says to add content to the object at index 0. When I created the API 6 years ago it was more customary for JS to be forgiving than strict, so on failing to find an object at 0 it adds one. That's the issue here, but it isn't behavior we can change without breaking things.
I would just change the type, not the JS implementation. The people that have `setStore({ content: 'hi' })` and mean it would need to change it to either `setStore('content', 'hi')` or `setStore(st => ({ ...st, content: 'hi' }))`, but yes it would be a breaking change.
But if you have a new API to replace stores in the works, I can understand it might not be worth the trouble. Is this new API something we could start using now?
Right now we add modifiers for different APIs. The approach we are looking to embrace more in the future is the produce pattern. https://docs.solidjs.com/concepts/stores#store-updates-with-produce
Thanks. To be honest, I don't have a preference between the two styles (I would, however, prefer the API to be type-safe). But either way, does this advice about not using `produce` on the entire array still apply then?
My comment was more that `shift`/`unshift` can be heavier than just cloning the array and setting it. It is possible now that the most performant update might need a little manual tweaking, but I haven't benchmarked it in a while.
FWIW this is more of an index-signature issue that's exhibited in a different way by the existing store implementation, and even moving to `produce` won't fully solve it. You can never be sure if a specific index contains a value, and TypeScript always assumes it does, i.e.

```ts
const messages: Message[] = [];
messages[0].content; // runtime error
```
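A runnable sketch of that mismatch (plain TypeScript, no store involved):

```typescript
type Message = { content: string };

const messages: Message[] = [];

// By default, TypeScript types this read as Message, but the slot is
// empty at runtime:
const first = messages[0];
console.log(typeof first); // "undefined"

// A defensive read recovers safety at the cost of some boilerplate:
const content = messages[0]?.content ?? "(no message)";
console.log(content); // "(no message)"
```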
@ryansolid haha I'm confused. So how would you append to an array in a store nowadays most idiomatically? (Or my actual use-case would be to replace the last element.)
@otonashixav that can be disabled with `noUncheckedIndexedAccess` in tsconfig.json, and it's not actually what this is about. Here I can put an element in the array that doesn't have all the required fields.
Guess I just learnt about it. I know it's not what it's about, but the reason it's typed as such is related so my mind went there.
> I would just change the type, not the JS implementation. The people that have `setStore({ content: 'hi' })` and mean it would need to change it to either `setStore('content', 'hi')` or `setStore(st => ({ ...st, content: 'hi' }))`, but yes it would be a breaking change.
This is, while technically possible, unwieldy. Any union containing two different types with different properties would make the store very difficult to use. While the store is `[]` (with `noUncheckedIndexedAccess`, or with `store: (Message | undefined)[]`), both methods in your examples wouldn't work: neither `setStore(0, "content", "hi")` nor `setStore(0, m => ({ ...m, content: "hi" }))`.
Another example: `setStore("a", "b", "c", "value")` would not be allowed if any of `store.a` or `store.a.b` were unions, e.g. `{ a: undefined | { b: string | { c: string, d: string } } }`. You'd have to use the callback form from the top of the store and typecheck at every union, and also initialize any new objects with some default values if they had any (in this case just `d`).
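To make the union case concrete, here is a sketch of what that top-level callback-style update would look like (the `update` function and the `"default"` filler value are hypothetical, added for illustration):

```typescript
type Store = { a: undefined | { b: string | { c: string; d: string } } };

// An immutable update that sets a.b.c to "value", narrowing at every
// union member and inventing a default for `d` whenever the object has
// to be created from scratch:
function update(st: Store): Store {
  if (st.a === undefined) {
    return { a: { b: { c: "value", d: "default" } } };
  }
  if (typeof st.a.b === "string") {
    return { a: { ...st.a, b: { c: "value", d: "default" } } };
  }
  return { a: { ...st.a, b: { ...st.a.b, c: "value" } } };
}

console.log(update({ a: undefined }));
// { a: { b: { c: "value", d: "default" } } }
console.log(update({ a: { b: { c: "old", d: "kept" } } }));
// { a: { b: { c: "value", d: "kept" } } }
```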
tl;dr it is the way it is for maybe the same reason `noUncheckedIndexedAccess` is `false` by default today: because it made lives easier initially.
It is possible to change this behaviour based on the value of `noUncheckedIndexedAccess`, though. Currently it's done for `exactOptionalPropertyTypes` for some types.
Also you're correct in thinking that the types probably match the implementation too closely in this and some other cases.
Yes, I agree that the two cases are somewhat related. However, most TypeScript programmers are by now familiar with the default being that the type of `array[0]` can be wrong, while the type of `object.field` is basically always right; that's what threw me off.
Anyway, considering how things are currently implemented: is the following usage of `produce` performant?

```ts
const [store, setStore] = createStore({ messages: [] })

setStore(produce(st => {
  st.messages[st.messages.length - 1] = newMessage
}))
```
Or is it more idiomatic to first use `setStore` to drill down further before applying `produce`?
Is this the complete implementation of `produce`? If so, the above code seems to be equivalent to:

```ts
setStore(st => {
  st.messages[st.messages.length - 1] = newMessage
  return st
})
```
(I think `producers.set` wouldn't be executed on an update?), which seems very performant. But I assumed you are supposed to return a new array (but perhaps that's just an assumption from my old React days). Or is the drawback that this will produce too many unnecessary reactive updates?
> Anyway, considering how things are currently implemented: is the following usage of `produce` performant?
>
> ```ts
> const [store, setStore] = createStore({ messages: [] })
>
> setStore(produce(st => {
>   st.messages[st.messages.length - 1] = newMessage
> }))
> ```
That would be performant. If I'm not mistaken, `produce` uses a proxy to capture mutations, then mutates the raw store object and triggers the appropriate signals.
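A toy sketch of that mechanism (a hypothetical `tinyProduce`, just to illustrate the proxy idea; the real solid-js `produce` additionally integrates with the store's signal graph):

```typescript
// A recursive Proxy that applies mutations in place and records the
// paths that were written, standing in for the part where solid-js
// would notify the affected signals. Illustration only.
function tinyProduce<T extends object>(fn: (draft: T) => void) {
  return (state: T): { state: T; changed: string[] } => {
    const changed: string[] = [];
    const wrap = (obj: any, path: string): any =>
      new Proxy(obj, {
        get(target, key) {
          const value = target[key];
          return value !== null && typeof value === "object"
            ? wrap(value, `${path}${String(key)}.`) // wrap nested objects
            : value;
        },
        set(target, key, value) {
          changed.push(`${path}${String(key)}`); // record the written path
          target[key] = value; // mutate the raw object
          return true;
        },
      });
    fn(wrap(state, ""));
    return { state, changed };
  };
}

const applyUpdate = tinyProduce<{ messages: string[] }>((st) => {
  st.messages[st.messages.length - 1] = "edited";
});
const result = applyUpdate({ messages: ["a", "b"] });
console.log(result.state.messages); // [ 'a', 'edited' ]
console.log(result.changed);        // [ 'messages.1' ]
```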
> Is this the complete implementation of `produce`? If so, the above code seems to be equivalent to:
>
> ```ts
> setStore(st => {
>   st.messages[st.messages.length - 1] = newMessage
>   return st
> })
> ```
Keep in mind the parameter from the callback form of `setStore` is immutable[^1], and the return value is, like the normal form, shallowly merged if it is a plain object. In this specific case, I might do `setStore("messages", st.messages.length - 1, newMessage)`. You can also append to the array this way.
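As a sketch of why writing at index `length` appends (plain JS array semantics, using a hypothetical `setIndex` helper rather than the real path-based `setStore`):

```typescript
type State = { messages: string[] };

// Hypothetical helper mirroring the shape of setStore("messages", i, v):
function setIndex(state: State, i: number, value: string): void {
  state.messages[i] = value; // i < length replaces; i === length appends
}

const state: State = { messages: ["a", "b"] };
setIndex(state, state.messages.length - 1, "B"); // replace last element
setIndex(state, state.messages.length, "c");     // append
console.log(state.messages); // [ 'a', 'B', 'c' ]
```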
[^1]: It's no longer typed that way because applying recursive readonly was a headache, especially because of self references, generics, and classes (which had to not be made readonly, except you can't differentiate classes from plain objects).
> (I think `producers.set` wouldn't be executed on an update?), which seems very performant.
If I'm reading it right, `producers.set` is called the first time you use `produce` on a particular store or part of a store, then cached until the store is gc'd. In any case, neither it nor the creation of a proxy should be a concern for performance imo. If performance was a problem, I'd suggest manually creating a data structure with signals where necessary, rather than using stores, which turn everything into a signal; but this shouldn't be something to worry about till you have absolutely gargantuan amounts of data.
> But I assumed you are supposed to return a new array (but perhaps that's just an assumption from my old React days). Or is the drawback that this will produce too many unnecessary reactive updates?
`produce` is designed for mutation. The normal setter-function callback is where you want to use immutable operations. I don't know exactly how it works, so I don't know if a new array would result in unnecessary updates, but usually I try to apply updates as far down as possible.
**Describe the bug**
In the following snippet, there is no TypeScript error although there should be:
**Your Example Website or App**
https://playground.solidjs.com/anonymous/63d9a4f5-bad3-4bf1-b577-ae395f765047
**Steps to Reproduce the Bug or Issue**
Run `tsc` on the provided code.

**Expected behavior**
There should be a type error.
**Platform**
TypeScript 5.4.5
**Notes**
The type definition seems to be from this line. More specifically, why can the `StoreSetter` be a `CustomPartial<T>`?