Closed · opened by fruviad · closed 4 years ago
I'm starting with a fresh install.
I'm working with code downloaded from GitHub on April 2, 2020. I'm still very new to both Docker & Combine, so apologies in advance if this is a case of PEBKAC.
When I go to the "Job Details" pages (e.g. "http://192.168.1.2:8000/combine/organization/4/record_group/8/job/12/details") for jobs in this install, I see the following error:
An error occurred when running this Job: RequestError TransportError(400, 'illegal_argument_exception', 'Malformed [mappings] section for type [date_detection], should include an inner object describing the mapping')...
Thus far I've created a few harvest and analysis jobs, and the problem manifests with both types. The error seems to be non-critical, as the jobs appear to complete successfully (i.e. no "Error in processing" for analysis jobs, and records are successfully harvested in harvest jobs), but I'm still completely new to Combine, and massive chunks of functionality could be missing without my realizing it.
If I go to the "Spark Details" tab then I see the following error information:
```
livy_response {
  "code": "from jobs import MergeSpark\nMergeSpark(spark, job_id=\"13\").spark_function()",
  "id": 0,
  "output": {
    "ename": "RequestError",
    "evalue": "TransportError(400, 'illegal_argument_exception', 'Malformed [mappings] section for type [date_detection], should include an inner object describing the mapping')",
    "execution_count": 0,
    "status": "error",
    "traceback": [
      "Traceback (most recent call last):\n",
      "  File \"/tmp/spark-365eed22-30ae-40f4-b546-958b18eb0748/userFiles-a23e577d-04ec-4311-afd8-4477ba143a81/jobs.py\", line 1605, in spark_function\n    write_avro=write_avro\n",
      "  File \"/tmp/spark-365eed22-30ae-40f4-b546-958b18eb0748/userFiles-a23e577d-04ec-4311-afd8-4477ba143a81/jobs.py\", line 429, in save_records\n    field_mapper_config=self.job_details['field_mapper_config']\n",
      "  File \"/tmp/spark-365eed22-30ae-40f4-b546-958b18eb0748/userFiles-a23e577d-04ec-4311-afd8-4477ba143a81/es.py\", line 152, in index_job_to_es_spark\n    'combine_template', body=json.dumps(template_body))\n",
      "  File \"/opt/conda/envs/combine/lib/python3.5/site-packages/elasticsearch/client/utils.py\", line 73, in _wrapped\n    return func(*args, params=params, **kwargs)\n",
      "  File \"/opt/conda/envs/combine/lib/python3.5/site-packages/elasticsearch/client/indices.py\", line 458, in put_template\n    name), params=params, body=body)\n",
      "  File \"/opt/conda/envs/combine/lib/python3.5/site-packages/elasticsearch/transport.py\", line 312, in perform_request\n    status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)\n",
      "  File \"/opt/conda/envs/combine/lib/python3.5/site-packages/elasticsearch/connection/http_urllib3.py\", line 128, in perform_request\n    self._raise_error(response.status, raw_data)\n",
      "  File \"/opt/conda/envs/combine/lib/python3.5/site-packages/elasticsearch/connection/base.py\", line 125, in _raise_error\n    raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)\n",
      "elasticsearch.exceptions.RequestError: TransportError(400, 'illegal_argument_exception', 'Malformed [mappings] section for type [date_detection], should include an inner object describing the mapping')\n"
    ]
  },
  "progress": 1,
  "state": "available"
}
```
fruviad, how did you solve the problem? I am experiencing the same issue.
Sorry for the delay in responding! I didn't solve the problem. I ended up trying a reinstall with a later set of code and ran into a different set of issues.
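For anyone who lands here later: I haven't verified this against the Combine codebase beyond the traceback above, but if `es.py` is building a typeless template body for a 6.x cluster, wrapping the mappings under a type name before the `put_template` call should avoid the 400. A minimal sketch of that workaround (the host, index pattern, and `_doc` type name are all assumptions, not Combine's actual values):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # assumed host; adjust for your install

# Hypothetical typeless body, standing in for whatever es.py builds:
template_body = {
    "index_patterns": ["j*"],                  # assumed pattern
    "mappings": {"date_detection": False},
}

# Crude typeless check: top-level mapping options like date_detection are
# scalars, whereas 6.x-style bodies key everything by type-name dicts.
mappings = template_body["mappings"]
if not all(isinstance(v, dict) for v in mappings.values()):
    # Wrap the typeless mapping under a type name so a 6.x cluster accepts it:
    template_body["mappings"] = {"_doc": mappings}

es.indices.put_template(name="combine_template", body=template_body)
```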