I downloaded the repo and then tried to run it in a Jupyter notebook, but got the following error:
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-3-18f31a175c27> in <module>()
----> 1 nlp_babi
NameError: name 'nlp_babi' is not defined
I don't know what is going on here.
I found a similar issue on GitHub that suggested re-installing CoreNLP. I tried this and then got this error:
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-3-98edacc8b218> in <module>()
2
3 corpus_parser = CorpusParser()
----> 4 get_ipython().magic(u'time corpus_parser.apply(doc_preprocessor)')
/usr/local/lib/python2.7/dist-packages/IPython/core/interactiveshell.pyc in magic(self, arg_s)
2156 magic_name, _, magic_arg_s = arg_s.partition(' ')
2157 magic_name = magic_name.lstrip(prefilter.ESC_MAGIC)
-> 2158 return self.run_line_magic(magic_name, magic_arg_s)
2159
2160 #-------------------------------------------------------------------------
/usr/local/lib/python2.7/dist-packages/IPython/core/interactiveshell.pyc in run_line_magic(self, magic_name, line)
2077 kwargs['local_ns'] = sys._getframe(stack_depth).f_locals
2078 with self.builtin_trap:
-> 2079 result = fn(*args,**kwargs)
2080 return result
2081
<decorator-gen-59> in time(self, line, cell, local_ns)
/usr/local/lib/python2.7/dist-packages/IPython/core/magic.pyc in <lambda>(f, *a, **k)
186 # but it's overkill for just that one bit of state.
187 def magic_deco(arg):
--> 188 call = lambda f, *a, **k: f(*a, **k)
189
190 if callable(arg):
/usr/local/lib/python2.7/dist-packages/IPython/core/magics/execution.pyc in time(self, line, cell, local_ns)
1179 if mode=='eval':
1180 st = clock2()
-> 1181 out = eval(code, glob, local_ns)
1182 end = clock2()
1183 else:
<timed eval> in <module>()
/data/snorkel/snorkel/udf.pyc in apply(self, xs, clear, parallelism, progress_bar, count, **kwargs)
38 print "Running UDF..."
39 if parallelism is None or parallelism < 2:
---> 40 self.apply_st(xs, progress_bar, clear=clear, count=count, **kwargs)
41 else:
42 self.apply_mt(xs, parallelism, clear=clear, **kwargs)
/data/snorkel/snorkel/udf.pyc in apply_st(self, xs, progress_bar, count, **kwargs)
61
62 # Apply UDF and add results to the session
---> 63 for y in udf.apply(x, **kwargs):
64
65 # Uf UDF has a reduce step, this will take care of the insert; else add to session
/data/snorkel/snorkel/parser.pyc in apply(self, x, **kwargs)
46 """Given a Document object and its raw text, parse into processed Sentences"""
47 doc, text = x
---> 48 for parts in self.corenlp_handler.parse(doc, text):
49 parts = self.fn(parts) if self.fn is not None else parts
50 yield Sentence(**parts)
/data/snorkel/snorkel/parser.pyc in parse(self, document, text)
323 parts = defaultdict(list)
324 dep_order, dep_par, dep_lab = [], [], []
--> 325 for tok, deps in zip(block['tokens'], block['basic-dependencies']):
326 # Convert PennTreeBank symbols back into characters for words/lemmas
327 parts['words'].append(PTB.get(tok['word'], tok['word']))
KeyError: 'basic-dependencies'
I don't know what's going on here either. Also, the Read the Docs documentation for this repo seems to be broken, and I'm a little uneasy about the level of PEP-8 compliance as well...
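For what it's worth, my guess is a version mismatch: I believe newer CoreNLP releases renamed the dependency key in their JSON output from 'basic-dependencies' to 'basicDependencies', which would explain the KeyError on line 325 of parser.py. A defensive lookup like this (just a sketch; `block` stands in for one parsed-sentence dict from the CoreNLP JSON, and the helper name is mine) might get past the error:

```python
def get_basic_dependencies(block):
    """Return the basic-dependencies list from a CoreNLP sentence dict,
    trying the older hyphenated key first, then the newer camelCase one."""
    for key in ('basic-dependencies', 'basicDependencies'):
        if key in block:
            return block[key]
    raise KeyError("no basic dependencies found in CoreNLP output")

# Minimal fake sentence block using the newer key name:
block = {'tokens': [], 'basicDependencies': [{'dep': 'ROOT'}]}
deps = get_basic_dependencies(block)
```

If that is the cause, pinning CoreNLP to the version the repo was developed against would probably be the cleaner fix.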
Thanks! Alex