Beit-Hatfutsot / mojp-dbs-pipelines

Pipelines for syncing Jewish data sources to the DB of the Museum of the Jewish People
MIT License

research elasticsearch ConnectionTimeout error when docker is running for a long time #27

Open OriHoch opened 7 years ago

OriHoch commented 7 years ago

@Libisch commented on Sun Jul 02 2017

With all tests passing and the pipelines running, the process fails. See the log:

datapackage_pipelines_mojp.common.processors.sync: ERROR:sync:unexpected exception, resource_descirptor={'name': 'dbs_docs_sync_log', 'path': 'dbs_docs_sync_log.csv', 'schema': {'fields': [{'name': 'source', 'type': 'string'}, {'name': 'id', 'type': 'string'}, {'name': 'version', 'type': 'string', 'description': 'source dependant field, used by sync process to detect document updates'}, {'name': 'collection', 'type': 'string', 'description': 'standard collection identifier (e.g. places / familyNames etc..). must be related to one of the COLLECTION_? constants'}, {'name': 'sync_msg', 'type': 'string'}]}}, row={'collection': 'familyNames', 'content_html': {}, 'content_html_en': 'BIEGELEISEN

Surnames derive from one of many different origins. Sometimes there may be more than one explanation for the same name. This family name derives from an occupation, profession or trade (also connected with raw material, finished product or implements associated with that trade).

Biegeleisen is a form of Buegeleisen, the German for "flat iron/clothes iron", one of the tools used by tailors.

Distinguished bearers of the Jewish surname Biegeleisen include the 19th/20th century Galician-born Polish literary historian and folklorist Henryk Biegeleisen.', 'content_html_he': 'BIEGELEISEN

שמות משפחה נובעים מכמה מקורות שונים. לעיתים לאותו שם קיים יותר מהסבר אחד. שם משפחה זה נגזר מעיסוק, מקצוע או מסחר (יכול להיות קשור לחומרי הגלם, המוצר המוגמר או כלי העבודה המשתייכים לאותו עיסוק).

השם ביגלאייזן הוא גרסה של המילה הגרמנית בוגלאייזן שפירושה "מגהץ, מן הסוג שחייטים נהגו להשתמש בו.

אישים מוכרים בעלי שם המשפחה היהודי ביגלאייזן כוללים את הנריק ביגלאייזן, חוקר פולקלור והסטוריון של תולדות הספרות פולחני אשר נולד בגליציה בסוף המאה ה-19.', 'id': '140153', 'main_image_url': '', 'main_thumbnail_image_url': '', 'source': 'clearmash', 'source_doc': {'changeset': 4709094, 'collection': 'familyNames', 'document_id': 'd2d28936e9e74428917fddeb2afa84c8', 'item_id': 140153, 'item_url': 'http://bh.clearmash.com/skn/c6/dummy/e140153/dummy/', 'metadata': {'ActiveLock': None, 'CreatorUserId': 2, 'EntityTypeId': 1009, 'IsArchived': False, 'IsBookmarked': False, 'IsLiked': False, 'LikesCount': 0}, 'parsed_doc': {'EntityFirstPublishDate': ['Datetime', {'Id': 'EntityFirstPublishDate', 'Value': {'UtcTicks': 636297720504199936}}], 'EntityLastPublishDate': ['Datetime', {'Id': 'EntityLastPublishDate', 'Value': {'UtcTicks': 636297720504199936}}], 'EntityViewsCount': 0, '_c6_beit_hatfutsot_bh_base_template_bhp_unit': {'en': '91579'}, '_c6_beit_hatfutsot_bh_base_template_description': {'en': 'BIEGELEISEN

Surnames derive from one of many different origins. Sometimes there may be more than one explanation for the same name. This family name derives from an occupation, profession or trade (also connected with raw material, finished product or implements associated with that trade).

Biegeleisen is a form of Buegeleisen, the German for "flat iron/clothes iron", one of the tools used by tailors.

Distinguished bearers of the Jewish surname Biegeleisen include the 19th/20th century Galician-born Polish literary historian and folklorist Henryk Biegeleisen.', 'he': 'BIEGELEISEN

שמות משפחה נובעים מכמה מקורות שונים. לעיתים לאותו שם קיים יותר מהסבר אחד. שם משפחה זה נגזר מעיסוק, מקצוע או מסחר (יכול להיות קשור לחומרי הגלם, המוצר המוגמר או כלי העבודה המשתייכים לאותו עיסוק).

השם ביגלאייזן הוא גרסה של המילה הגרמנית בוגלאייזן שפירושה "מגהץ, מן הסוג שחייטים נהגו להשתמש בו.

אישים מוכרים בעלי שם המשפחה היהודי ביגלאייזן כוללים את הנריק ביגלאייזן, חוקר פולקלור והסטוריון של תולדות הספרות פולחני אשר נולד בגליציה בסוף המאה ה-19.'}, '_c6_beit_hatfutsot_bh_base_template_display_status': [{'en': 'Museum only', 'he': 'מוזיאון בלבד'}], '_c6_beit_hatfutsot_bh_base_template_editor_remarks': {'en': 'hasavot from Family Names'}, '_c6_beit_hatfutsot_bh_base_template_family_name': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_family_name', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_last_updated_date_bhp': ['Datetime', {'Id': '_c6_beit_hatfutsot_bh_base_template_last_updated_date_bhp', 'Value': {'UtcTicks': 634898676600000000}}], '_c6_beit_hatfutsot_bh_base_template_multimedia_movies': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_multimedia_movies', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_multimedia_music': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_multimedia_music', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_multimedia_photos': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_multimedia_photos', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_old_numbers_parent': ['ChildDocuments', {'ChildDocuments': [{'AllowedGroups': [], 'Value': {'Fields_Boolean': [], 'Fields_ChildDocuments': [], 'Fields_Datasource': [], 'Fields_Datetime': [], 'Fields_Files': [], 'Fields_Float32': [], 'Fields_Float64': [], 'Fields_FuzzyDate': [], 'Fields_FuzzyDateRange': [], 'Fields_Int32': [], 'Fields_Int64': [], 'Fields_LocalizedHtml': [], 'Fields_LocalizedText': [{'Id': '_c6_beit_hatfutsot_bh_base_template_file_child_old_num_fid', 'Value': [{'ISO6391': 'en', 'Value': 'FM001577.HTM'}]}], 'Fields_MediaGalleries': [], 'Fields_RelatedDocuments': [], 'Id': 'db730e0783e84ab58b91f5571e08ab7d', 'TemplateReference': {'ChangesetId': 3537894, 'TemplateId': '_c6_beit_hatfutsot_bh_base_template_old_number_child'}}}], 'Id': '_c6_beit_hatfutsot_bh_base_template_old_numbers_parent'}], '_c6_beit_hatfutsot_bh_base_template_related_exhibition': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_related_exhibition', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_related_musictext': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_related_musictext', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_related_personality': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_related_personality', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_related_place': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': 
'_c6_beit_hatfutsot_bh_base_template_related_place', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_related_recieve_unit': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_related_recieve_unit', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_rights': [{'en': 'Full', 'he': 'מלא'}], '_c6_beit_hatfutsot_bh_base_template_source': ['RelatedDocuments', {'FirstPageOfReletedDocumentsIds': [], 'FirstPageParams_ArchiveFilter': 2, 'FirstPageParams_ReverseOrder': False, 'Id': '_c6_beit_hatfutsot_bh_base_template_source', 'TotalItemsCount': 0}], '_c6_beit_hatfutsot_bh_base_template_ugc': False, '_c6_beit_hatfutsot_bh_base_template_working_status': [{'en': 'Completed', 'he': 'הסתיים'}], 'community_id': 6, 'entity_creation_date': ['Datetime', {'Id': 'entity_creation_date', 'Value': {'UtcTicks': 636297720504199936}}], 'entity_has_pending_changes': False, 'entity_id': 140153, 'entity_name': {'en': 'BIEGELEISEN', 'he': 'ביגלאייזן'}, 'entity_type_id': 1009, 'is_archived': False, 'is_deleted': False}, 'template_changeset_id': 3537894, 'template_id': '_c6_beit_hatfutsot_bh_family_name'}, 'title': {}, 'title_en': 'BIEGELEISEN', 'title_he': 'ביגלאייזן', 'version': '4709094-d2d28936e9e74428917fddeb2afa84c8'}
datapackage_pipelines_mojp.common.processors.sync: Traceback (most recent call last):
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 386, in _make_request
datapackage_pipelines_mojp.common.processors.sync:     six.raise_from(e, None)
datapackage_pipelines_mojp.common.processors.sync:   File "", line 2, in raise_from
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 382, in _make_request
datapackage_pipelines_mojp.common.processors.sync:     httplib_response = conn.getresponse()
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/http/client.py", line 1331, in getresponse
datapackage_pipelines_mojp.common.processors.sync:     response.begin()
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/http/client.py", line 297, in begin
datapackage_pipelines_mojp.common.processors.sync:     version, status, reason = self._read_status()
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/http/client.py", line 258, in _read_status
datapackage_pipelines_mojp.common.processors.sync:     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/socket.py", line 586, in readinto
datapackage_pipelines_mojp.common.processors.sync:     return self._sock.recv_into(b)
datapackage_pipelines_mojp.common.processors.sync: socket.timeout: timed out
datapackage_pipelines_mojp.common.processors.sync: During handling of the above exception, another exception occurred:
datapackage_pipelines_mojp.common.processors.sync: Traceback (most recent call last):
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/connection/http_urllib3.py", line 114, in perform_request
datapackage_pipelines_mojp.common.processors.sync:     response = self.pool.urlopen(method, url, body, retries=False, headers=self.headers, **kw)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 649, in urlopen
datapackage_pipelines_mojp.common.processors.sync:     _stacktrace=sys.exc_info()[2])
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/util/retry.py", line 333, in increment
datapackage_pipelines_mojp.common.processors.sync:     raise six.reraise(type(error), error, _stacktrace)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/packages/six.py", line 686, in reraise
datapackage_pipelines_mojp.common.processors.sync:     raise value
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 600, in urlopen
datapackage_pipelines_mojp.common.processors.sync:     chunked=chunked)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 388, in _make_request
datapackage_pipelines_mojp.common.processors.sync:     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 308, in _raise_timeout
datapackage_pipelines_mojp.common.processors.sync:     raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)
datapackage_pipelines_mojp.common.processors.sync: urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='192.168.170.195', port=9200): Read timed out. (read timeout=10)
datapackage_pipelines_mojp.common.processors.sync: During handling of the above exception, another exception occurred:
datapackage_pipelines_mojp.common.processors.sync: Traceback (most recent call last):
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/sync.py", line 135, in _filter_row
datapackage_pipelines_mojp.common.processors.sync:     return self._update_doc(new_doc, old_doc)
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/sync.py", line 70, in _update_doc
datapackage_pipelines_mojp.common.processors.sync:     body={"doc": new_doc})
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/client/utils.py", line 73, in _wrapped
datapackage_pipelines_mojp.common.processors.sync:     return func(*args, params=params, **kwargs)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/client/__init__.py", line 525, in update
datapackage_pipelines_mojp.common.processors.sync:     doc_type, id, '_update'), params=params, body=body)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/transport.py", line 312, in perform_request
datapackage_pipelines_mojp.common.processors.sync:     status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/connection/http_urllib3.py", line 122, in perform_request
datapackage_pipelines_mojp.common.processors.sync:     raise ConnectionTimeout('TIMEOUT', str(e), e)
datapackage_pipelines_mojp.common.processors.sync: elasticsearch.exceptions.ConnectionTimeout: ConnectionTimeout caused by - ReadTimeoutError(HTTPConnectionPool(host='192.168.170.195', port=9200): Read timed out. (read timeout=10))
datapackage_pipelines_mojp.common.processors.sync: Traceback (most recent call last):
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 386, in _make_request
datapackage_pipelines_mojp.common.processors.sync:     six.raise_from(e, None)
datapackage_pipelines_mojp.common.processors.sync:   File "", line 2, in raise_from
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 382, in _make_request
datapackage_pipelines_mojp.common.processors.sync:     httplib_response = conn.getresponse()
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/http/client.py", line 1331, in getresponse
datapackage_pipelines_mojp.common.processors.sync:     response.begin()
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/http/client.py", line 297, in begin
datapackage_pipelines_mojp.common.processors.sync:     version, status, reason = self._read_status()
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/http/client.py", line 258, in _read_status
datapackage_pipelines_mojp.common.processors.sync:     line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/socket.py", line 586, in readinto
datapackage_pipelines_mojp.common.processors.sync:     return self._sock.recv_into(b)
datapackage_pipelines_mojp.common.processors.sync: socket.timeout: timed out
datapackage_pipelines_mojp.common.processors.sync: During handling of the above exception, another exception occurred:
datapackage_pipelines_mojp.common.processors.sync: Traceback (most recent call last):
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/connection/http_urllib3.py", line 114, in perform_request
datapackage_pipelines_mojp.common.processors.sync:     response = self.pool.urlopen(method, url, body, retries=False, headers=self.headers, **kw)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 649, in urlopen
datapackage_pipelines_mojp.common.processors.sync:     _stacktrace=sys.exc_info()[2])
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/util/retry.py", line 333, in increment
datapackage_pipelines_mojp.common.processors.sync:     raise six.reraise(type(error), error, _stacktrace)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/packages/six.py", line 686, in reraise
datapackage_pipelines_mojp.common.processors.sync:     raise value
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 600, in urlopen
datapackage_pipelines_mojp.common.processors.sync:     chunked=chunked)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 388, in _make_request
datapackage_pipelines_mojp.common.processors.sync:     self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 308, in _raise_timeout
datapackage_pipelines_mojp.common.processors.sync:     raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)
datapackage_pipelines_mojp.common.processors.sync: urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='192.168.170.195', port=9200): Read timed out. (read timeout=10)
datapackage_pipelines_mojp.common.processors.sync: During handling of the above exception, another exception occurred:
datapackage_pipelines_mojp.common.processors.sync: Traceback (most recent call last):
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/sync.py", line 148, in 
datapackage_pipelines_mojp.common.processors.sync:     CommonSyncProcessor.main()
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/base_processors.py", line 23, in main
datapackage_pipelines_mojp.common.processors.sync:     spew(*cls(*ingest()).spew())
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/datapackage_pipelines/wrapper/wrapper.py", line 59, in spew
datapackage_pipelines_mojp.common.processors.sync:     for rec in res:
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/base_processors.py", line 76, in _filter_resource
datapackage_pipelines_mojp.common.processors.sync:     yield self._filter_row(row, descriptor)
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/sync.py", line 135, in _filter_row
datapackage_pipelines_mojp.common.processors.sync:     return self._update_doc(new_doc, old_doc)
datapackage_pipelines_mojp.common.processors.sync:   File "/mojp/datapackage_pipelines_mojp/common/processors/sync.py", line 70, in _update_doc
datapackage_pipelines_mojp.common.processors.sync:     body={"doc": new_doc})
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/client/utils.py", line 73, in _wrapped
datapackage_pipelines_mojp.common.processors.sync:     return func(*args, params=params, **kwargs)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/client/__init__.py", line 525, in update
datapackage_pipelines_mojp.common.processors.sync:     doc_type, id, '_update'), params=params, body=body)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/transport.py", line 312, in perform_request
datapackage_pipelines_mojp.common.processors.sync:     status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)
datapackage_pipelines_mojp.common.processors.sync:   File "/usr/local/lib/python3.6/site-packages/elasticsearch/connection/http_urllib3.py", line 122, in perform_request
datapackage_pipelines_mojp.common.processors.sync:     raise ConnectionTimeout('TIMEOUT', str(e), e)
datapackage_pipelines_mojp.common.processors.sync: elasticsearch.exceptions.ConnectionTimeout: ConnectionTimeout caused by - ReadTimeoutError(HTTPConnectionPool(host='192.168.170.195', port=9200): Read timed out. (read timeout=10))
datapackage_pipelines_mojp.clearmash.processors.convert: ERROR:wrapper:Output pipe disappeared!

@Libisch commented on Sun Jul 02 2017

@OriHoch


@OriHoch commented on Sun Jul 02 2017

Looks like it can't connect to Elasticsearch:

elasticsearch.exceptions.ConnectionTimeout: ConnectionTimeout caused by - ReadTimeoutError(HTTPConnectionPool(host='192.168.170.195', port=9200): Read timed out. (read timeout=10))

@Libisch commented on Sun Jul 02 2017

I figured that one out :) I had some connection issues that were fixed by modifying network.host in both docker-compose.override.yaml and elasticsearch.yaml, but I'm not sure how to handle the timeout issue. It happens after the app has been running for quite a while.
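One way to handle the timeout itself (a sketch only, assuming the sync processor builds its client with the elasticsearch Python package; the host and port come from the log above, and the index/doc_type/id values are placeholders) is to raise the client's read timeout and let it retry on timeouts:

```python
from elasticsearch import Elasticsearch

# Sketch only -- the sync processor may construct its client differently.
# Host/port are taken from the error log; index/doc_type/id are placeholders.
es = Elasticsearch(
    ["192.168.170.195:9200"],
    timeout=30,             # raise the 10s read timeout seen in the traceback
    max_retries=3,          # retry transient failures
    retry_on_timeout=True,  # also retry when a request times out
)

# A single slow request can also be given a longer timeout explicitly:
es.update(index="mojp", doc_type="common", id="some-doc-id",
          body={"doc": {"title_en": "BIEGELEISEN"}},
          request_timeout=60)
```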

OriHoch commented 7 years ago

Check if Elasticsearch is running and available on 192.168.170.195:9200, both from the host and from inside the Docker container.

You can run commands inside the Docker container with something like docker exec -it docker-instance-name /bin/sh (replace docker-instance-name with the actual container name).

Then, inside the container, run curl 192.168.170.195:9200 to check whether Elasticsearch is responding.
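For the same check from Python instead of curl (for example, from within the pipelines container), a minimal sketch using the elasticsearch client, assuming the host and port from the log:

```python
from elasticsearch import Elasticsearch

# Connectivity check equivalent to the curl command above.
# Host/port come from the error in the log.
es = Elasticsearch(["192.168.170.195:9200"])

if es.ping():          # HEAD / -- returns True when the node answers
    print(es.info())   # GET / -- cluster name, version, etc.
else:
    print("Elasticsearch is not reachable on 192.168.170.195:9200")
```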

OriHoch commented 7 years ago

Also, check the Elasticsearch logs for errors: sudo cat /var/log/elasticsearch/elasticsearch.log

OriHoch commented 7 years ago

@Libisch is this resolved? If not, please paste the logs.

Libisch commented 7 years ago

@OriHoch I ran the checks and everything looked OK. Anyway, the error hasn't recurred since. I'll update if it happens again.