cloudant / python-cloudant

A Python library for Cloudant and CouchDB
Apache License 2.0

A few questions #483

Closed Mradr closed 4 years ago

Mradr commented 4 years ago

Hello, I have a few questions.

1) Is there a way to talk directly to the database instead of the local cache? Some of my "documents" can be quite large, in the upper 30-40 MB range with many entries. The complication is that I also wrap the data in a class to provide functionality around it. What would be the proper way to use the database without loading all the data into local memory, while still being able to access and work with the data in the database correctly?

2) I am hitting this error:

```
Traceback (most recent call last):
  File "F:\old_drive\Version-Beta-3\ticket_manager.py", line 288, in remove
    self.ticket_db[id].remove()
  File "F:\old_drive\Version-Beta-3\ticket_manager.py", line 427, in remove
    self.app.master_database["tickets"][self.id].delete()
  File "C:\python37\lib\site-packages\cloudant\document.py", line 316, in delete
    del_resp.raise_for_status()
  File "C:\python37\lib\site-packages\requests\models.py", line 939, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 409 Client Error: Conflict conflict Document update conflict. for url: *****
```

What is more than likely causing this error?

ricellis commented 4 years ago

> Is there a way to talk directly to the database instead of the local cache?

There is some discussion in https://github.com/cloudant/python-cloudant/issues/411 about this and some options for keeping the cache at a smaller size.

We do have a new Python SDK (in beta) that does not have a dict cache layer and I'd recommend using that if you want to avoid the local cache approach entirely.

> requests.exceptions.HTTPError: 409 Client Error: Conflict conflict Document update conflict. for url: *****
>
> What is more than likely causing this error?

The most likely cause of this error is that the revision in the cached object is not up to date with the server revision. You could `fetch()` the doc again before deleting. Alternatively, since you are deleting anyway and your docs sound large, you might just want to get the latest rev with a `HEAD` request and update the local doc before calling delete, something like this:

```python
# your document
doc = db['yourdoc']
# get the latest _rev from a HEAD request; the ETag response header
# carries the current revision, wrapped in double quotes
latest_rev = db.r_session.head(doc.document_url).headers['ETag'].strip('"')
# update the local doc with the new _rev
doc['_rev'] = latest_rev
doc.delete()
```
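As a side note on why the `strip('"')` is needed: CouchDB and Cloudant return the document revision in the `ETag` header wrapped in double quotes, per the HTTP spec for entity tags. A minimal offline sketch (the revision string below is made up for illustration, no server involved):

```python
# Hypothetical ETag header value as returned by a HEAD request to a
# document URL; the revision is wrapped in double quotes.
etag_header = '"3-825cb35de44c433bfb2df415563a19de"'

# Stripping the quotes yields the bare _rev value usable in the
# cached document before calling delete().
latest_rev = etag_header.strip('"')
print(latest_rev)  # 3-825cb35de44c433bfb2df415563a19de
```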