Why would one use Proxy to implement mobx? Because mobx is great, but Object.defineProperty makes it uncomfortable to write. So dob was born. Below I will list only the changes unique to dob, along with a summary of my experience with state management frameworks.
1 Introduction
dob not only reimplements mobx on top of Proxy, it also comes with dependency-injection-based store management.
2 Different from Mobx
The API is similar to Mobx. What benefits does Proxy bring? Several Mobx wrappers are simply not needed: native arrays and native Map & Set can be observed directly. Here's an example:
// Dob
import { observable, observe } from 'dob'

@observable
class Store {
  user = {
    articles: new Map()
    // `name` no longer needs to be initialized
  }
  addArticle() {
    this.user.articles.set('Harry Potter', { price: 59 })
  }
  changeName() {
    this.user.name = 'Harry Potter'
  }
}

const store = new Store()

observe(() => {
  // re-runs whenever the dereferenced observables change,
  // including the native Map entry and the lazily added `name`
  const article = store.user.articles.get('Harry Potter')
  console.log(article && article.price, store.user.name)
})
3 Store management issues
When using redux, much of the time it is hard to decide whether structured data should be flattened and subscribed to piece by piece, or whether the processing of subscribed data belongs in the component or at the global level.
This is because redux undermines React's fractal design, as discussed in "Recent Discussion Record."
Many redux-based fractal solutions are "pseudo" fractal: they quietly use replaceReducer to register reducers dynamically and then bind them to the global store.
Frameworks like Mobx and dob, however, are truly fractal. Let's start with the question of how to manage the store.
How to manage the store
So-called best practices come down to conventions and constraints that keep code readable and maintainable. Conventions are flexible: breaking them is allowed, and the code still runs. Constraints are mandatory: the code cannot run without them. Most constraints are provided by the framework, for example enabling strict mode to forbid modifying variables outside an Action. The most entangled part, however, is still the conventions, so I came up with a set of usage conventions for this kind of reactive store manager.
When using a store manager, the first thing to sort out is the data itself: where should the data be stored, and is a store manager necessary at all?
Whether to use a store
First of all, the simplest components certainly do not need a store manager. When a component becomes complex, a store manager can be used, provided the data flow itself supports fractals. A store manager with fractal support lets you combine React and the store manager into a new component that still has fractal capability:
import { combineStores, inject, observable } from 'dob'
import { Connect } from 'dob-react'
import * as React from 'react'
import * as ReactDOM from 'react-dom'

@observable
class Store { name = 123 }

class Action {
  @inject(Store) store: Store
  changeName = () => { this.store.name = 456 }
}

const stores = combineStores({ Store, Action })

@Connect(stores)
class App extends React.Component<typeof stores, any> {
  render() {
    return <div onClick={this.props.Action.changeName}>{this.props.Store.name}</div>
  }
}

ReactDOM.render(<App />, document.getElementById('react-dom'))
Dob is such a framework. In the example above, clicking the text triggers a re-render even though no Provider wraps the root DOM node. This means the component does not depend on any particular environment and can run as part of any project. Although it uses a store manager, it is no different from an ordinary React component, so it can be used with confidence.
With a pseudo-fractal data flow, ReactDOM.render may require a specific Provider to work, so the component loses the ability to migrate. If someone is unlucky enough to install such a component, they have to install a whole family bucket of dependencies at the root of their project.
Q: Although components with a store manager have full fractal capability, isn't there an invisible dependency if the component reacts to observable props?
A: Yes. If a component requires the props it receives to be observable in order to re-render automatically when they change, that part of its functionality breaks whenever an environment passes in plain props. Props are React's universal bridge between components, so a component should only rely on plain-object props; internally it can wrap them with observable and thereby keep full portability.
How to use the store
Although React can be fully componentized, in real projects modules inevitably divide into non-business components and business components, and page modules can also be treated as business components. Complex websites are better off data-driven, and since they are data-driven, the connection between business components and data can be moved to the top level for centralized management, usually by wrapping the top of the page in a Provider:
import { combineStores, inject, observable } from 'dob'
import { Connect, Provider } from 'dob-react'
import * as React from 'react'
import * as ReactDOM from 'react-dom'

@observable
class Store { name = 123 }

class Action {
  @inject(Store) store: Store
  changeName = () => { this.store.name = 456 }
}

const stores = combineStores({ Store, Action })

// <App /> is the same component as before, now decorated with a parameterless @Connect (see below)
ReactDOM.render(
  <Provider {...stores}>
    <App />
  </Provider>,
  document.getElementById('react-dom')
)
Only the location where the store is defined has changed; the way the component is used stays the same.
The one difference is that @Connect no longer needs a parameter: when a Provider is registered at the root, its stores are passed through to Connect by default, as sketched below. Contrary to the fractal approach, this design means the component cannot be migrated to other projects, but in exchange it can be moved anywhere within this project.
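A sketch of the same App as before, assuming dob-react accepts a parameterless @Connect as just described, with its props injected from the globally registered stores:
@Connect
class App extends React.Component<typeof stores, any> {
  render() {
    return <div onClick={this.props.Action.changeName}>{this.props.Store.name}</div>
  }
}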
Fractal components are strongly tied to their own file structure: as long as they are given the props they expect, they work anywhere. Components bound to the global store manager are almost independent of file structure, since all of their props come from the global store.
At this point you can see that the two approaches are hard to merge into one. We can divide components in advance into business and non-business components: business components rely on the global store manager, while non-business components keep their fractal ability.
If you have a better way to manage your store, find me on [github](https://github.com/ascoders) for an in-depth chat.
Should every component be Connected?
For an Mvvm-style library, the Connect concept goes beyond merely injecting data (unlike redux): it also listens for data changes and triggers re-renders. So does every component need Connect?
Components that do not use the store manager obviously do not need Connect, but business components stay uncertain about the future (business requirements change), so keeping Connect on every business component helps maintainability.
Connect can also do other optimization work. For example, dob's Connect not only injects data and re-renders the component automatically, it also guarantees the component's PureRender behavior.
Admittedly this is a very small issue, but ironically it is exactly this kind of small question we tend to agonize over, hence this brief discussion.
Whether the store should be flattened
Store flattening exists largely because js lacks native immutable support, which makes modifying deeply nested data very troublesome. Libraries like immutable.js let you manipulate deep data quickly through string paths, but that style is only a stopgap: we will never see the js standard recommend accessing object properties through strings.
Accessing object properties via strings is similar to lodash's _.get, yet the [proposal-optional-chaining](https://github.com/tc39/proposal-optional-chaining) proposal already solves that need at the syntax level, and the same convenience for immutable data deserves a standard answer as well.
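A small illustration of the two access styles (the data shape is made up):
import * as _ from 'lodash'

const state = { user: { articles: [{ price: 59 }] } }

// string-based deep access, the style immutable.js and lodash encourage today
const priceByString = _.get(state, 'user.articles[0].price')

// the optional chaining proposal expresses the same intent at the syntax level
const priceByChaining = state?.user?.articles?.[0]?.price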
You do not actually have to wait for another proposal: the existing capabilities of js can already simulate the effect of native immutable support. [dob-redux](https://github.com/dobjs/dob-redux), for example, connects dob with react-redux so that you can write mutably, generate immutable data, and dock with redux.
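A conceptual sketch of that idea (this is not dob-redux's actual API): a Proxy records mutable writes and emits a frozen snapshot after each change, which could then be handed to a redux store.
type Listener<T> = (snapshot: Readonly<T>) => void

// Wrap plain state in a Proxy; every mutable write produces an immutable snapshot.
function trackable<T extends object>(state: T, onChange: Listener<T>): T {
  return new Proxy(state, {
    set(target, key, value) {
      Reflect.set(target, key, value)
      const snapshot = Object.freeze(JSON.parse(JSON.stringify(target)))
      onChange(snapshot as Readonly<T>)
      return true
    }
  })
}

// Mutable writing style, immutable data flowing out (for example into store.dispatch).
const user = trackable({ name: 'dob' }, snapshot => console.log(snapshot))
user.name = 'Harry Potter'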
That was a bit of a digression. The essence of store flattening is really a question of data format standards. [normalizr](https://github.com/paularmstrong/normalizr), for example, is a standard data specification, and much of the time the real problem is that we put redundant or badly classified data into the store.
Keeping the front-end data flow thin is also not just a matter of massaging data after it arrives. There is more that can be done, such as using node microservices to standardize back-end data, packaging standard format-processing components, and thinning the data down to zero thickness so that business code can consume a simple data flow without being aware of any of this.
Asynchronous and side effects
Redux naturally uses actions to isolate side effects and asynchrony. In an Mvvm-style development model where there are only Actions, how should asynchrony be isolated? Does Mvvm perfectly solve the asynchrony problem that Redux sidesteps?
When using a framework like dob, assignments made after an asynchronous step need special care:
@Action async getUserInfo() {
  const userInfo = await fetchUser()
  // Throws in strict mode: after the await, we are no longer inside the Action scope.
  this.store.user.data = userInfo
}
The reason is that await only looks synchronous; it is still asynchronous. By the time the code after an await runs, the original function's stack frame has already exited, so that code is no longer inside an Action. The general solution is to open a new Action for it:
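A sketch of that fix, assuming dob's Action can also be invoked as a function around a synchronous block (much like mobx's runInAction); if it cannot, a separate decorated method works the same way, as in the next sketch:
@Action async getUserInfo() {
  const userInfo = await fetchUser()
  Action(() => {
    // back inside an Action scope, so the write is allowed in strict mode
    this.store.user.data = userInfo
  })
}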
This shows that asynchrony demands care. Redux is right to isolate asynchrony away from the Reducer: as long as every operation that changes the data flow is synchronous, the Reducer can sit back and relax no matter how strange things get outside of it.
In fact, redux's way of isolating asynchrony corresponds to code like the following:
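A sketch of that redux-like isolation using only the decorator form shown above: the async method plays the role of an async action creator, and a separate synchronous @Action plays the role of the reducer (UserStore, fetchUser, and the user field follow the earlier snippet and are illustrative):
class UserAction {
  @inject(UserStore) store: UserStore

  // asynchronous orchestration stays outside of any Action
  async getUserInfo() {
    const userInfo = await fetchUser()
    this.setUser(userInfo)
  }

  // the only place that mutates the store is synchronous, like a reducer
  @Action setUser(userInfo) {
    this.store.user.data = userInfo
  }
}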
If you do not want to write an extra Action wrapper every time, this isolation approach is also a good choice.
Resend the request automatically
Another benefit of a reactive framework is that things can be triggered automatically: requests can be resent automatically, actions can be triggered automatically, and so on.
For example, we may want a request to be resent automatically whenever its parameters change. In React this generally has to be written by hand:
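A sketch of both versions; the component, fetchArticles, and the userId parameter are made up for illustration.
class ArticleList extends React.Component<{ userId: string }, any> {
  componentDidMount() {
    fetchArticles({ userId: this.props.userId })
  }

  componentWillReceiveProps(nextProps: { userId: string }) {
    // manually detect the parameter change, then resend the request
    if (nextProps.userId !== this.props.userId) {
      fetchArticles({ userId: nextProps.userId })
    }
  }

  render() {
    return null
  }
}
In a framework like dob, the following is functionally equivalent:
class ArticleList extends React.Component<{}, any> {
  private signal: { unobserve: () => void }

  componentDidMount() {
    // store is an observable store holding the request parameters;
    // the callback re-runs whenever an observable it dereferences changes
    this.signal = observe(() => {
      fetchArticles({ userId: store.userId })
    })
  }

  componentWillUnmount() {
    this.signal.unobserve()
  }

  render() {
    return null
  }
}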
The magic is that the observe callback is re-executed whenever any variable it uses changes. The componentWillReceiveProps comparison is really just using React's lifecycle to watch for variable changes by hand and fire the request when they occur, and that whole series of operations can be handled by the observe function.
observe is something like a more automated addEventListener: you subscribe once, and it fires whenever the data it observes changes.
So do not forget to unsubscribe when the component is destroyed:
this.signal.unobserve()
Recently our team has been exploring how to make more of this feature and is considering implementing an automatic request library. Good suggestions and discussion are very welcome.
Type derivation
Type derivation is easier with a framework like dob or mobx. For example:
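Reusing the combineStores pattern from earlier, typeof stores lets TypeScript derive the props type, so both the injected Store and Action are fully typed inside the component:
const stores = combineStores({ Store, Action })

@Connect(stores)
class App extends React.Component<typeof stores, any> {
  render() {
    // this.props.Store.name and this.props.Action.changeName are type checked
    return <div onClick={this.props.Action.changeName}>{this.props.Store.name}</div>
  }
}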
How stores refer to each other
In a complex data flow, Stores and Actions inevitably reference each other. The recommended approach is dependency injection, which is also one of the good practices dob advocates.
Of course, dependency injection must not be abused; for example, circular dependencies must not exist. Although dependency injection is flexible, you need a fairly complete plan of the data flow before writing code. For a simple users / articles / comments scenario, we can design the data flow as follows.
Create UserStore, ArticleStore, and ReplyStore:
import { inject } from 'dob'

class UserStore {
  users // the list of users
}

class ReplyStore {
  @inject(UserStore) userStore: UserStore
  replys // each reply references a user
}

class ArticleStore {
  @inject(UserStore) userStore: UserStore
  @inject(ReplyStore) replyStore: ReplyStore
  articles // each article references its replys and its author (a user)
}
Each comment relates to user information, so ReplyStore injects UserStore; each article contains author and comment information, so ArticleStore injects UserStore and ReplyStore. As you can see, the dependencies between stores should form a tree, not a ring.
Finally, Actions operate on Stores through injection as well. Because the stores are already injected into one another where needed, each Action only operates on its corresponding Store, injecting additional Stores when necessary, and no circular dependencies arise:
class UserAction {
@inject(UserStore) userStore: UserStore
}
class ReplyAction {
@inject(ReplyStore) replyStore: ReplyStore
}
class ArticleAction {
@inject(ArticleStore) articleStore: ArticleStore
}
Finally, it is not advisable to inject a global Store into a local store, or a local Action into the global store: doing so destroys the fractal nature of the local data flow. Keep non-business components independent, and bind the global data flow only to business components.
Action's error handling
A more elegant way is to write a class-level decorator that catches exceptions thrown by Actions, reports them, and rethrows:
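A sketch of such a decorator; the errorCatch name and the reportToMonitor stand-in for the monitoring platform are hypothetical, and for simplicity it wraps every prototype method as async:
// stand-in for reporting to the front-end monitoring platform
const reportToMonitor = (error: any) => console.error(error)

// class-level decorator: wrap every prototype method in try/catch, report and rethrow
function errorCatch<T extends { new (...args: any[]): {} }>(target: T): T {
  Object.getOwnPropertyNames(target.prototype).forEach(name => {
    const method = target.prototype[name]
    if (name === 'constructor' || typeof method !== 'function') {
      return
    }
    target.prototype[name] = async function(...args: any[]) {
      try {
        return await method.apply(this, args)
      } catch (error) {
        reportToMonitor(error)
        throw error
      }
    }
  })
  return target
}

@errorCatch
class UserAction {
  @inject(UserStore) userStore: UserStore

  @Action async getUserInfo() {
    const userInfo = await fetchUser()
    // any exception thrown above or below is caught, reported, and rethrown
    this.userStore.users = userInfo
  }
}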
When any step throws, the code after the await stops executing and the exception is reported to the front-end monitoring platform.
4 Conclusion
To cover most development scenarios, accurately distinguish business components from non-business components, design the data-flow dependencies before writing code, and take care to isolate asynchronous operations. In special cases, observe can be used to monitor data changes, which extends to features such as automatically resending requests.
Although the data flow is only a small part of a project, keeping the whole project maintainable requires attention to every aspect mentioned above.
Happy hacking with dob.