barseghyanartur / graphene-elastic

Graphene Elasticsearch/OpenSearch (DSL) integration
https://pypi.org/project/graphene-elastic/

RELAY_CONNECTION_MAX_LIMIT not being respected #38

Closed. coreyculler closed this issue 4 years ago.

coreyculler commented 4 years ago

Hello, I am trying to raise the maximum limit for the first argument. I added RELAY_CONNECTION_MAX_LIMIT to the GRAPHENE setting in my Django settings. The override works for database-backed GraphQL queries, but if you try to request more than 100 items from an ElasticsearchObjectType, it errors out and reports that the requested number of items exceeds the limit.

I've been able to work around this by modifying the default value in the library's settings.py. It appears the issue may be related to the IMPORT_STRINGS variable in that file.
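
Roughly, this is what I have in my Django settings (the schema path and the limit value below are placeholders, not my exact values):

# Django settings.py -- the override that graphene-elastic does not pick up.
# Schema path and limit are placeholders.
GRAPHENE = {
    "SCHEMA": "myproject.schema.schema",   # placeholder schema path
    "RELAY_CONNECTION_MAX_LIMIT": 500,     # placeholder limit above the default of 100
}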

coreyculler commented 4 years ago

If you don't have the time to fix this, I can attempt a fix and submit a PR.

barseghyanartur commented 4 years ago

@coreyculler:

I would appreciate it if you could come up with something. I'm currently focused mainly on the roadmap and milestones.

barseghyanartur commented 4 years ago

@coreyculler:

FYI, I can't reproduce it with my current setup.

This is what I have in local_overrides.py:

import logging

DEFAULTS = {
    "SCHEMA": None,
    "SCHEMA_OUTPUT": "schema.json",
    "SCHEMA_INDENT": 2,
    # "MIDDLEWARE": (),
    # Set to True if the connection fields must have
    # either the first or last argument
    "RELAY_CONNECTION_ENFORCE_FIRST_OR_LAST": False,
    # Max items returned in ConnectionFields / FilterConnectionFields
    "RELAY_CONNECTION_MAX_LIMIT": 200,
    "LOGGING_LEVEL": logging.DEBUG,
}

The number of results I get from Elasticsearch is equal to 200 (just tested).
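
For reference, a check along these lines can be scripted roughly as follows. This is only a sketch: the schema import path, the allPostDocuments connection field, and the title field are assumptions; adjust them to your own schema.

# Sketch: verify the effective limit by asking for more than the old
# default of 100. Import path and field names are assumptions.
from myproject.schema import schema  # hypothetical schema module

result = schema.execute(
    """
    {
      allPostDocuments(first: 200) {
        edges { node { title } }
      }
    }
    """
)
# With RELAY_CONNECTION_MAX_LIMIT raised to 200, this should print up to 200.
print(len(result.data["allPostDocuments"]["edges"]))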

coreyculler commented 4 years ago

After investigating a little further, I realized I was mistaken in thinking that the library pulls its settings from Django. Once I created a GRAPHENE_ELASTIC environment variable containing JSON that overrides that setting, it started working. I'm closing the issue since it was an error on my part.
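
For anyone hitting the same thing, the override boils down to something like this (a sketch; the limit value is illustrative, and the variable can equally be set as a shell export instead of from Python):

# Sketch of the fix described above: graphene-elastic reads its settings
# from the GRAPHENE_ELASTIC environment variable (as JSON), not from
# Django's GRAPHENE dict. Set it before the library loads its settings,
# e.g. early in Django's settings.py or as a shell export.
import json
import os

os.environ["GRAPHENE_ELASTIC"] = json.dumps(
    {"RELAY_CONNECTION_MAX_LIMIT": 500}  # illustrative limit above the default
)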