Closed: bradwbonn closed this issue 8 years ago
This probably has to do with the fact that we are missing the "encoder" piece here. This bug can probably be lumped under the same umbrella as #170.
fwiw, I can do a db.all_docs(include_docs=True, keys=['foo', 'bar', 'baz'])
and it works fine, so I am guessing it has something to do with your keys list? Not an excuse, more a statement that it "sort of" works, but clearly there is a bug that needs to be fleshed out and resolved. Like I said, I think it may have to do with the lack of encoding in the json.dumps() of the keys parameter.
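As an illustrative sketch of the theory above (this is not the library's actual internals), the keys list would be serialized with json.dumps() before being POSTed to the _all_docs endpoint. Without a custom encoder, json.dumps() raises TypeError on key types it cannot serialize, which could explain why some keys lists fail while plain strings work:

```python
import json

# Sketch only: how a keys parameter is likely serialized for the POST
# body. Plain string keys serialize cleanly; a key of an unsupported
# type would make json.dumps() raise TypeError without a custom encoder.
keys = ['foo', 'bar', 'baz']
body = json.dumps({'keys': keys})
```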
Sounds about right. Would it help if I shared my "keys" array? It's 2000 entries. (The recommended object size limit for bulk transactions against Cloudant.)
Yeah, that would help a lot actually; that way, whoever is tasked with resolving this will have the right list to go against.
Is file_doc_batch content being pulled from a file and into a dictionary?
file_doc_batch is a dict() built by iterating through a local filesystem and storing metadata about each file. The key is a custom _id field built from that metadata, and the value is the metadata itself.
The idea being that the app can check the contents of the local filesystem against the database directly by an ID on the primary index.
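A minimal sketch of how such a batch might be built (the field names, hash choice, and function name are assumptions, not the reporter's actual code):

```python
import hashlib
import os

# Hypothetical reconstruction: walk a local directory tree, derive a
# custom _id from file metadata, and map that _id to the metadata dict.
def build_file_doc_batch(root):
    batch = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            meta = {'path': path, 'size': stat.st_size, 'mtime': stat.st_mtime}
            # The custom _id here is just a hash of the path; the real
            # app derives it from the file's metadata.
            doc_id = hashlib.sha1(path.encode('utf-8')).hexdigest()
            batch[doc_id] = meta
    return batch
```

The resulting keys can then be checked against the database's primary index in one _all_docs request rather than one GET per file.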
This may be a true Heisenbug-type situation. I just used your full list of keys against a database that obviously does not contain any of those matching keys, and I got the expected set of 2000 of these: {u'error': u'not_found', u'key': u'27b818b97c59390724a7e4ab58124b18e3313f711444686540'},
I suppose this is a good and a bad thing. The good is that all_docs with keys does appear to work in some form. The bad is that it does not work for you @bradwbonn, and also that it looks like it might be difficult to replicate. In either case, I think the encoder still needs to be added, and hopefully that resolves the problem.
That might explain why at one point I could have sworn it was working, but today when I tried the "new" method, it gave me the error. Something weird this way comes... Would debug logging from the library tell us anything that might help?
The suggested addition of the encoder has been done as part of #170 / #185 in commit ec369d597c5152091546111f5090dd1e6f326a67.
The encoder doesn't appear to fix this problem; it looks like we're missing a Content-Type header when we POST the list of keys. Not sure why it works in some cases, e.g. https://github.com/cloudant/python-cloudant/issues/177#issuecomment-223103966.
The difference in behaviours appears to be down to the server-side handling of the POST content. It appears that CouchDB 1.6 running locally, for example, is happy to proceed without the Content-Type header, just treating the POSTed keys content as JSON. On the other hand, the load balancers in front of the Cloudant service will reject the request with a 415 if the Content-Type: application/json header is not present.
The fix is to make sure we add the Content-Type header to the request.
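As a sketch of that fix (the helper name and URL layout are illustrative, not python-cloudant's actual code), the request can be built so the header is always attached when POSTing a keys list:

```python
import json

# Illustrative sketch: always include Content-Type: application/json
# when POSTing a keys list to _all_docs, so the Cloudant load balancers
# accept the request instead of rejecting it with a 415.
def build_keys_request(db_url, keys):
    return {
        'url': db_url + '/_all_docs',
        'data': json.dumps({'keys': keys}),
        'headers': {'Content-Type': 'application/json'},
    }
```

The returned dict can be splatted into a session call, e.g. `requests.post(**build_keys_request(db_url, keys))`.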
I can POST a list of keys as JSON data to the _all_docs endpoint using the requests library, but on the same DB using the Cloudant Python library, the call fails with an unsupported media type error. Is it not specifying the Content-Type properly?
Below is the code. old_method() uses requests, new_method() uses python-cloudant.
old_method() returns JSON from the _all_docs endpoint with the expected 'rows' array for each key included.
new_method() errors with HTTP 415:
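For comparison, a hypothetical reconstruction of the working requests-based approach (the reporter's actual old_method() is not shown here; the function signature and URL layout are assumptions) looks like this. The key difference from the failing python-cloudant call at the time is the explicit Content-Type header:

```python
import json
import requests

# Sketch only: POST the keys list to _all_docs with an explicit
# Content-Type header, which the Cloudant load balancer requires.
def old_method(base_url, db_name, keys, auth):
    resp = requests.post(
        '%s/%s/_all_docs' % (base_url, db_name),
        data=json.dumps({'keys': keys}),
        headers={'Content-Type': 'application/json'},
        auth=auth,
    )
    resp.raise_for_status()
    return resp.json()
```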