greenkeeper[bot] opened 7 years ago
Update to this version instead
The new version differs by 49 commits.
`f12608e` travis deploy on tags
`68fd7cf` 1.4.0
`c6baae4` Comments
`c184195` Upgrade to latest dependencies (#136)
`a95673c` Replace ES6 export with node export (#135)
`cc0c62b` Export flow types
`d7e918c` Add example for Google Datastore (#131)
`49e2659` Add support for browser environments (#134)
`7404ab2` add elixir implementation of dataloader (#129)
`f047061` Add dataloader in golang. (#123)
`1a08f2a` Use travis stages (#133)
`21b8416` Add Code of Conduct (#119)
`330264a` Grammatical fixes (#120)
`dc3f0cd` Validate shape of cacheMap on initialization (#86)
`a98c896` add java-dataloder to readme (#110)
There are 49 commits in total.
See the full diff
The dependency dataloader was updated from 1.2.0 to 2.0.0.
This is the first release since becoming part of the GraphQL Foundation and the most significant since the initial release over four years ago. Read more about the history of the project and this release in the blog post.
Breaking:
`.loadMany()` now returns an array which may contain `Error` instances if one of the requested keys failed.

Previously, `.loadMany()` was exactly the same as calling `Promise.all()` on multiple `.load()` calls. While a minor syntactic convenience, this wasn't particularly useful over what could be done with `Promise.all` directly, and if one key failed, the entire call to `.loadMany()` would fail. As of this version, `.loadMany()` can return a mix of values and `Error` instances when some keys fail, but the Promise it returns will never be rejected. This is similar to the behavior of the new `Promise.allSettled` method in the upcoming version of JavaScript.

This will break any code which relied on `.loadMany()` rejecting. To support this change, either ensure that each item in the result of `.loadMany()` is checked with `instanceof Error`, or replace calls like `loader.loadMany([k1, k2])` with `Promise.all([loader.load(k1), loader.load(k2)])`.
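The migration advice above can be sketched without the library itself. The helper below mimics the new `.loadMany()` semantics; `loadManyLike` and the toy `load` function are hypothetical stand-ins for illustration, not the dataloader implementation:

```javascript
// Sketch of the new .loadMany() semantics: every key is loaded,
// individual failures become Error values in the result array,
// and the returned Promise never rejects. Toy stand-in only.
function loadManyLike(load, keys) {
  return Promise.all(keys.map(key => load(key).catch(error => error)));
}

// A toy per-key load function that fails for one key.
const load = key =>
  key === 'bad'
    ? Promise.reject(new Error(`not found: ${key}`))
    : Promise.resolve(key.toUpperCase());

loadManyLike(load, ['a', 'bad', 'c']).then(results => {
  for (const result of results) {
    if (result instanceof Error) {
      console.log('failed:', result.message);
    } else {
      console.log('loaded:', result);
    }
  }
});
```

Checking each result item with `instanceof Error` is exactly the per-item check the release notes recommend when migrating to v2.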
The calling of `batchLoadFn` when `{ batch: false }` has changed to the end of the run-loop tick.
Previously, when batching was disabled, `batchLoadFn` would be called immediately when `.load()` was called. This differed from the batching-enabled case, where `batchLoadFn` is called at the end of the tick of the run-loop. This timing difference could lead to subtle race conditions for code which dynamically toggled batching on or off. As a simplification, `batchLoadFn` is now always called at the end of the run-loop tick, regardless of whether batching is disabled.

Hopefully this will not break your code. It could cause issues for code which relied on the synchronous call to `batchLoadFn` for loaders where batching was disabled.
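A minimal toy model of the v2 timing (not the library's code): even with batching disabled, the batch function's side effects are no longer observable synchronously after `.load()` returns:

```javascript
// Toy model of v2 timing: the "batch function" runs at the end of
// the current run-loop tick, even for a single key, so its side
// effects are not visible synchronously after load() returns.
const calls = [];
function load(key) {
  return new Promise(resolve => {
    process.nextTick(() => {
      calls.push([key]); // batch function side effect
      resolve(key);
    });
  });
}

load('a');
// In v1 with { batch: false }, calls would already contain [['a']]
// at this point; with v2-style timing it is still empty.
console.log(calls.length);
```

Code that inspected state set by a synchronous `batchLoadFn` immediately after `.load()` is the pattern this change breaks.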
Previously, when `.load()` encountered a cached value it would return an already resolved (or rejected) Promise. However, when additional dependent loads happened after these, the difference in time between the cache-hit value resolving and the cache-miss value resolving could result in additional unnecessary network requests. As of this version, when `.load()` encounters a cached value it returns a Promise which waits to resolve until the call to `batchLoadFn` also resolves. This should result in better whole-program performance, and it is the most significant conceptual change and improvement in this release. It is actually not a new innovation but a correction to match the behavior of Facebook's original "Loader" from 2010, which this library is inspired by.

This changes the timing of when Promises are resolved and thus could introduce subtle behavioral changes in your code, especially if your code is prone to race conditions. Please test carefully.
This also means each return of `.load()` is a new Promise instance. Where prior versions returned the same Promise instance for cached results, this version does not. This may break code which uses the returned Promise as a memoization key or otherwise assumes reference equality.
This really shouldn't break your code because you definitely don't reach into class private variables, right? I just figured it would be something you'd like to know, you know... just in case.
New:
`this` in `batchLoadFn`
The dirty secret of DataLoader is that most of it is quite boring. The interesting bit is the batch scheduling function, which takes advantage of Node.js's unique run-loop scheduler to achieve automatic batching without any additional latency. However, since its release, ports to other languages have found this bit not easily replicated and have either replaced it with something conceptually simpler (like manual dispatch) or with a scheduler custom-fit to a GraphQL execution engine. These are interesting innovations which deserve ground for experimentation in this original library as well.
Via `batchScheduleFn`, you can now provide a custom batch scheduling function and experiment with manual dispatch, added-latency dispatch, or any other behavior which might work best for your application.
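For instance, an added-latency scheduler lets more keys accumulate into one batch. The option name `batchScheduleFn` comes from the release notes; the tiny loader below is a toy stand-in for illustration, not the dataloader implementation:

```javascript
// Toy loader exposing a batchScheduleFn-style hook: the loader hands
// the scheduler a dispatch callback, and the scheduler decides when
// to invoke it. Not the real dataloader implementation.
function makeLoader(batchLoadFn, batchScheduleFn) {
  let queue = [];
  return key =>
    new Promise((resolve, reject) => {
      if (queue.length === 0) {
        batchScheduleFn(() => {
          const batch = queue;
          queue = [];
          batchLoadFn(batch.map(item => item.key)).then(
            values => batch.forEach((item, i) => item.resolve(values[i])),
            error => batch.forEach(item => item.reject(error))
          );
        });
      }
      queue.push({ key, resolve, reject });
    });
}

const batches = [];
// Added-latency scheduling: wait 10ms so more keys can accumulate
// into a single batch before dispatching.
const load = makeLoader(
  keys => {
    batches.push(keys);
    return Promise.resolve(keys.map(key => `value:${key}`));
  },
  dispatch => setTimeout(dispatch, 10)
);

Promise.all([load('a'), load('b')]).then(values => {
  console.log(values);  // both values served from one dispatched batch
  console.log(batches); // a single batch containing both keys
});
```

Swapping the `setTimeout` scheduler for one that only dispatches when explicitly told to would give the manual-dispatch behavior the notes mention.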
Types:
- Improved types for `cacheKeyFn` and `cacheMap`.
- Allow `batchLoadFn` to return a `PromiseLike`, supporting use of bluebird.
- Allow `batchLoadFn` to return an `ArrayLike`, supporting returning read-only arrays.
- Allow passing an `Error` to `.prime()`.
Fixes:
- Priming an `Error` with `.prime()` could incorrectly cause an unhandled promise rejection warning.

Documentation:
- Documentation for custom `cacheMap`, along with an LRU example.
- Improved documentation and examples for `batchLoadFn`.

The new version differs by 48 commits.
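The fixed priming behavior can be sketched with a toy cache (not the library's code): the rejected Promise is only created when `.load()` is actually called, so a primed `Error` that is never loaded never produces an unhandled rejection:

```javascript
// Toy sketch of priming an Error: the Error sits inertly in the
// cache and only becomes a rejected Promise when .load() is called,
// so there is no unhandled rejection if the key is never loaded.
// Not the real dataloader implementation.
function makePrimedLoader() {
  const cache = new Map();
  return {
    prime(key, value) { cache.set(key, value); },
    load(key) {
      const cached = cache.get(key);
      return cached instanceof Error
        ? Promise.reject(cached)
        : Promise.resolve(cached);
    },
  };
}

const loader = makePrimedLoader();
loader.prime(4, new Error('User not found'));
loader.load(4).catch(error => console.log('rejected:', error.message));
```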
`0c05d28` 2.0.0
`6592183` Support custom schedulers (#228)
`3b2192c` Add disclaimer for video walkthrough
`01c829d` Resolve all options during construction (#226)
`cde0cd8` Update issue templates (#227)
`06c403b` [BREAKING] Resolve cached values after batch dispatch (#222)
`b5d7bf5` Refactor batching logic (#220)
`a3dd591` [BREAKING] loadMany() returns individual Error instead of rejecting promise. (#216)
`2f7af56` Add example in documentation for converting object results to Array results.
`923a75f` Add example for freezing results with a HOF
`200b522` Merge both custom cache documentations together
`23bd362` Add documentation for custom cacheMap field.
`29811db` [FIX] Fix case where priming a cache with an Error results in UnhandledPromiseRejection (#223)
`4212c9e` Add tests for behavior when process.nextTick does not exist (#221)
`44977c0` [FIX] Returns the keys type of the batch fn to ReadonlyArray (#219)
There are 48 commits in total.
See the full diff
Version 1.3.0 of dataloader just got published.
The version 1.3.0 is not covered by your current version range.
Without accepting this pull request your project will work just like it did before. There might be a bunch of new features, fixes and perf improvements that the maintainers worked on for you though.
I recommend you look into these changes and try to get onto the latest version of dataloader. Given that you have a decent test suite, a passing build is a strong indicator that you can take advantage of these changes by merging the proposed change into your project. Otherwise this branch is a great starting point for you to work on the update.
Release Notes
v1.3.0

New:

Thanks to contributions from many, the documentation for DataLoader is now significantly better, with portions of `README.md` reworked and improved and more information in `examples/`.

DataLoader installs now come with both TypeScript and Flow type definitions. (#43, #45)
Batch loads can now be limited to a certain number of keys (#42)
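A sketch of how such a limit works: queued keys are split into chunks before being handed to the batch function. The option in DataLoader is named `maxBatchSize`; the chunking helper below is illustrative, not the library's code:

```javascript
// Split queued keys into chunks of at most maxBatchSize, so each
// chunk can be dispatched as a separate batch. Illustrative only.
function chunkKeys(keys, maxBatchSize) {
  const chunks = [];
  for (let i = 0; i < keys.length; i += maxBatchSize) {
    chunks.push(keys.slice(i, i + maxBatchSize));
  }
  return chunks;
}

console.log(chunkKeys([1, 2, 3, 4, 5], 2)); // [[1, 2], [3, 4], [5]]
```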
Commits
The new version differs by 29 commits.
`d472a69` 1.3.0
`f005123` Add Readme and tests about cache disabling (#72)
`fcd9dd7` Merge branch 'kouak-master'
`26d0b59` Merge branch 'master' of https://github.com/kouak/dataloader into kouak-master
`6521f54` Drop explicit support for non-maintained node versions
`5e81cdc` Break out examples to a separate directory.
`a8af693` More minor edits to Readme
`ace12af` Add more explicit language about per-request caching to Readme.
`dab8a70` Improve readme about cached errors.
`67dbc7a` Improve documentation of the expectations of a Batch Function.
`eec8bdf` Use `export =` for compat with `module.exports =` (#51)
`8c7999a` Fix LoadMany empty array test (#57)
`5ff6f86` Add a RethinkDb example, avoiding pitfalls
`da2fb5f` unbreak travis
`f119438` add regenerator to unbreak older tests
There are 29 commits in total. See the full diff.
Not sure how things should work exactly?
There is a collection of [frequently asked questions](https://greenkeeper.io/faq.html) and of course you may always [ask my humans](https://github.com/greenkeeperio/greenkeeper/issues/new).

Your Greenkeeper Bot :palm_tree: