xros / jsonpyes

A tool that imports raw JSON into Elasticsearch with a single command

How to pass timeout option #29

Open ranvirgorai opened 7 years ago

ranvirgorai commented 7 years ago
Traceback (most recent call last):
  File "/usr/local/bin/jsonpyes", line 588, in <module>
    run()
  File "/usr/local/bin/jsonpyes", line 446, in run
    body=json.loads(line)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/client/utils.py", line 73, in _wrapped
    return func(*args, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/client/__init__.py", line 298, in index
    _make_path(index, doc_type, id), params=params, body=body)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/transport.py", line 312, in perform_request
    status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)
  File "/usr/local/lib/python2.7/dist-packages/elasticsearch/connection/http_urllib3.py", line 123, in perform_request
    raise ConnectionError('N/A', str(e), e)
elasticsearch.exceptions.ConnectionError: ConnectionError(('Connection aborted.', error(104, 'Connection reset by peer'))) caused by: ProtocolError(('Connection aborted.', error(104, 'Connection reset by peer')))

Getting this error; the file is about 450 MB.

xros commented 6 years ago

That looks like an error from your ES server. Make sure the Elasticsearch timeout is set long enough. Indexing your JSON data probably took a long time, so the connection was reset before it finished. Try increasing the timeout on the ES connection so it isn't dropped mid-request. Is the piece (line) of your JSON data very large at that point?
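As a rough illustration of what a longer timeout means here: jsonpyes may not expose a timeout flag on the command line, but the underlying elasticsearch-py client accepts one. Below is a minimal sketch of setting the client timeout directly; the host URL, index name, and file name are placeholders, not values from this issue.

```python
# Minimal sketch: raise the client-side timeout in elasticsearch-py
# so large documents have more time to index before the connection drops.
# Host, index, and file names below are hypothetical examples.
import json

from elasticsearch import Elasticsearch

es = Elasticsearch(
    ["http://localhost:9200"],
    timeout=120,            # seconds to wait per request (default is 10)
    max_retries=3,          # retry a few times instead of failing immediately
    retry_on_timeout=True,
)

with open("data.json") as f:
    for line in f:
        # Index one JSON document per line, similar to what jsonpyes does internally.
        es.index(index="my_index", doc_type="doc", body=json.loads(line))
```

If the error persists even with a long timeout, the reset is more likely coming from the ES server side (e.g. request size or memory limits) than from the client.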