HazyResearch / deepdive

DeepDive
deepdive.stanford.edu

Spouse example, can't run `do sentences` #605

Open faizann24 opened 7 years ago

faizann24 commented 7 years ago

Everything goes fine, but when I run `deepdive do sentences`, it gives me errors. It's very frustrating.

Here's the error. I am using the example from 0.8.x and DeepDive 0.8 as well.

```
loading dd_tmp_sentences: 0:01:48 0 B [ 0 B/s] ([ 0 B/s])
2016-11-21 15:38:07.394801 Loading parser from serialized file edu/stanford/nlp/models/srparser/englishSR.ser.gz ...
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000006e8480000, 165675008, 0) failed; error='Cannot allocate memory' (errno=12)
2016-11-21 15:38:17.019338 /home/faizan/Desktop/spouseexample/spouse_example-0.8-STABLE/udf/bazaar/parser/run.sh: line 5: 2899 Killed $(dirname $0)/target/start $@
2016-11-21 15:38:17.056907 /home/faizan/local/util/compute-driver/local/compute-execute: line 140: kill: (2822) - No such process
2016-11-21 15:38:17.057060 /home/faizan/local/util/compute-driver/local/compute-execute: line 140: kill: (2823) - No such process
2016-11-21 15:38:17.057097 /home/faizan/local/util/compute-driver/local/compute-execute: line 140: kill: (2833) - No such process
loading dd_tmp_sentences: 0:01:49 0 B [ 0 B/s] ([ 0 B/s])
2016-11-21 15:38:17.646916 COPY 0
2016-11-21 15:38:18.896096 [ERROR] command='"$DEEPDIVE_APP"/udf/nlp_markup.sh': PID 2822: finished with non-zero exit status (0)
2016-11-21 15:38:18.897083 /home/faizan/local/util/compute-driver/local/compute-execute: line 138: 2824 Terminated DEEPDIVE_CURRENT_PROCESS_INDEX=$i bash -c "$command" < process-$i.input > process-$i.output
2016-11-21 15:38:18.897151 /home/faizan/local/util/compute-driver/local/compute-execute: line 138: 2828 Terminated mkmimo process-*.output > output_computed-*
2016-11-21 15:38:18.897184 /home/faizan/local/util/compute-driver/local/compute-execute: line 138: 2829 Terminated deepdive-load "$output_relation_tmp" output_computed-*
2016-11-21 15:38:37.197784 done [79.4 sec].
2016-11-21 15:38:38.106718 Parsing document f642dff5-2cd3-46a0-b530-792529a8ebb2...
2016-11-21 15:38:41.839660 Parsing document d63353aa-58f1-413d-8f60-fac44c41d4b7...
2016-11-21 15:38:42.929657 Parsing document b4968e78-ec5a-466e-863f-fef18e8ae99d...
2016-11-21 15:38:45.197607 Parsing document 46c3af65-1da9-459b-a98d-6f0eae12577e...
2016-11-21 15:38:45.577450 Parsing document 43c02bac-d556-4851-9b04-7773700759b6...
2016-11-21 15:38:47.381711 Parsing document db60062c-cd21-40a5-ab3b-648d5a320bf2...
2016-11-21 15:38:47.909429 Parsing document 7d4928a5-34fd-4d1c-894f-8e165fcf61aa...
2016-11-21 15:38:48.824839 Parsing document bd104c74-afa6-4d79-bf24-3249c7643130...
2016-11-21 15:38:49.632103 Parsing document 18658e4a-a94e-478f-ab2e-2ee709bd47e5...
2016-11-21 15:38:50.025989 Parsing document 328b5f1c-2b52-4eac-916c-c7983d4882a4...
2016-11-21 15:38:53.365035 Parsing document 70ecb3ad-cbbe-4097-8608-e3373a34a728...
2016-11-21 15:38:54.357790 Parsing document 36349778-9942-475d-bdf2-23b7372911c1...
2016-11-21 15:38:56.820872 Parsing document 5e7a035e-3a52-4133-afb7-564423d6b1b0...
2016-11-21 15:38:58.411508 Parsing document e4304e4e-a09f-4053-acb4-f1042e5f132e...
2016-11-21 15:38:59.286233 Parsing document 5dd9bf47-c8a9-49e3-8d02-994f8eabb91a...
2016-11-21 15:38:59.557480 Parsing document c4a1f668-b653-45d9-a95a-bb692c3e81cb...
2016-11-21 15:39:00.500847 Parsing document 08679f4f-0b8b-452b-8cca-4cb538993417...
2016-11-21 15:39:00.920752 Parsing document d969e80b-6b4a-4cda-ba5c-18c46b0fbd39...
2016-11-21 15:39:01.974830 Parsing document 981d477d-469f-4b90-906d-214d63506beb...
2016-11-21 15:39:02.708998 Parsing document 68145e40-fdaf-4244-87b0-942740eba969...
2016-11-21 15:39:02.890193 Parsing document 82c0c1d4-45b1-4861-b880-c414de299948...
2016-11-21 15:39:03.594063 Parsing document e70f708c-eb4f-4a8f-97ec-8f50e865ad06...
2016-11-21 15:39:03.786544 Parsing document 7e5f4072-b69f-4819-8ed6-62bdd0100621...
2016-11-21 15:39:04.710042 Parsing document d6880afb-7fcb-4576-9d17-cedd343677f9...
2016-11-21 15:39:05.504884 Parsing document ec1133a0-b707-4eda-a819-de0bb47180fe...
2016-11-21 15:39:05.797613 Parsing document 3deea828-e3a1-4c4d-9a90-482cabc020d8...
2016-11-21 15:39:07.213326 Parsing document 507d552f-db6a-495e-b78a-95783cad9af1...
2016-11-21 15:39:07.779326 Parsing document 3011fed2-1784-47f8-834c-c1f1af79e476...
2016-11-21 15:39:08.390949 Parsing document 340bb625-bb7e-49af-aa8d-781e5762f7a3...
2016-11-21 15:39:09.980964 Parsing document c902ad3d-3798-4c00-b09c-c9079de7bf49...
2016-11-21 15:39:10.426243 Parsing document 79205745-b593-4b98-8a94-da6b8238fefc...
2016-11-21 15:39:11.509975 Parsing document 9662058b-fca5-4771-8058-c7fd7bd548a3...
‘run/ABORTED’ -> ‘20161121/153606.426706542’
```
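
The decisive line in this log is the JVM warning `os::commit_memory(...) failed; error='Cannot allocate memory' (errno=12)`: the Stanford shift-reduce parser model (`englishSR.ser.gz`) is large, the JVM could not get the memory it asked for, and the parser process was killed, which is what aborts `do sentences`. A minimal triage sketch follows; note that `target/start` honoring `JAVA_OPTS` and the local compute driver reading `DEEPDIVE_NUM_PROCESSES` are assumptions here, not documented DeepDive behavior.

```sh
# Rough triage sketch for the "Cannot allocate memory" failure above -- not an official fix.

# 1. See how much memory is actually free; the shift-reduce parser model needs
#    a few GB of JVM heap on top of whatever else is running.
free -m

# 2. Retry with a bounded heap and a single parser process.
#    Assumptions: target/start passes JAVA_OPTS through to the JVM, and the local
#    compute driver respects DEEPDIVE_NUM_PROCESSES; verify both in run.sh and
#    compute-execute before relying on them.
export JAVA_OPTS="-Xmx4g"
export DEEPDIVE_NUM_PROCESSES=1
deepdive do sentences
```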

netj commented 7 years ago

Sorry for the frustration. Bazaar/Parser, which DeepDive 0.8 relied on, had various build issues and bugs, so we're migrating to calling CoreNLP directly in the upcoming 0.9 release. If you need a fully parsed SignalMedia 1M corpus, please consider using the already-processed data linked from #566. If you're trying to parse your own text data, please check out the sandbox Docker image that holds the release candidate: https://github.com/HazyResearch/deepdive/blob/master/doc/installation.md#launch-without-installing
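
For concreteness, "calling CoreNLP directly" means running Stanford CoreNLP itself over the raw text instead of going through Bazaar/Parser. A rough command-line sketch is below; it assumes a CoreNLP 3.x distribution (jars plus models) unpacked in the current directory, and `input.txt` is a hypothetical plain-text file standing in for your own document text.

```sh
# Sketch only: annotate a plain-text file with Stanford CoreNLP from the command line.
# Assumes the CoreNLP 3.x jars and models sit in the current directory; input.txt is
# a hypothetical stand-in for one of your documents.
java -Xmx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLP \
  -annotators tokenize,ssplit,pos,lemma,ner,depparse \
  -file input.txt \
  -outputFormat json
# CoreNLP writes the annotations next to the input (input.txt.json); from there they
# can be loaded into the sentences relation in whatever shape the app expects.
```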