metricq / aiocouch

🛋 An asynchronous client library for CouchDB 2.x and 3.x
https://aiocouch.readthedocs.io/en/latest/
BSD 3-Clause "New" or "Revised" License

feat bulk operation: don't crash on pristine document #32

Closed H--o-l closed 3 years ago

H--o-l commented 3 years ago

The Database.update_docs(ids=[...]) operation only worked without crashing if every document was either modified or created. If only a subset of the documents in the bulk operation were actually modified, it crashed with:

File "/var/coatl/lib64/python3.8/site-packages/aiocouch/bulk.py", line 59, in __aexit__
    assert status["id"] == doc.id
AssertionError

I think this is worth improving: if your database contains the documents foo and baz and you want to update both, but foo already has the update applied, it's a nice shortcut to be able to update both without having to exclude foo from the bulk operation (see the unit test for an example).
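To illustrate the failure mode (not the actual patch): CouchDB's `_bulk_docs` endpoint only returns a status for documents that were actually sent, so pairing documents and statuses positionally breaks as soon as one document is pristine. A minimal sketch, matching by id instead, assuming the document names `foo` and `baz` from the description above:

```python
# Both documents are part of the bulk operation, but only "baz" was
# really modified and therefore sent to the server.
docs = ["foo", "baz"]
statuses = [{"id": "baz", "ok": True}]  # server reply: one status only

# Positional pairing would compare statuses[0]["id"] ("baz") against
# docs[0] ("foo") and trip the AssertionError shown above.

# Matching the returned statuses by document id tolerates pristine
# documents: they simply get no status.
status_by_id = {status["id"]: status for status in statuses}
results = {doc_id: status_by_id.get(doc_id) for doc_id in docs}

print(results["foo"])  # None: pristine, nothing was sent
print(results["baz"])  # {'id': 'baz', 'ok': True}
```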

codecov[bot] commented 3 years ago

Codecov Report

Merging #32 (240eb62) into master (510260c) will increase coverage by 0.00%. The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master      #32   +/-   ##
=======================================
  Coverage   95.93%   95.93%           
=======================================
  Files          11       11           
  Lines         762      763    +1     
=======================================
+ Hits          731      732    +1     
  Misses         31       31           
Impacted Files Coverage Δ
aiocouch/bulk.py 100.00% <100.00%> (ø)


H--o-l commented 3 years ago

Thanks @bmario for the quick review and merge! Would it be possible for you to publish a new release with this change in the not-too-distant future?

bmario commented 3 years ago

Done.

H--o-l commented 3 years ago

Thanks!