Closed elmeyer closed 4 years ago
Hello,
Macaw runs multiple actions at the same time, and there is a timeout for every action. You can increase that timeout value in the param dict in `live_main.py`. The first error you get occurs because DrQA could not generate its response within the specified time interval, so its process was killed. The user still sees the response, which is the result list of three documents shown after "THE RESPONSE STARTS". All in all, this is not an error; it is just an exception raised by the DrQA model because its process was killed for the timeout reason.
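The behaviour described above (the action's process is killed at the timeout, yet faster actions still return their results) can be sketched roughly as follows. All names here (`run_with_timeout`, `slow_action`, `fast_action`) are hypothetical illustrations, not Macaw's actual code:

```python
import multiprocessing
import time

def slow_action(queue):
    # Stands in for a slow model (e.g. an MRC reader) that exceeds the timeout.
    time.sleep(5)
    queue.put('slow result')

def fast_action(queue):
    # Stands in for an action that finishes in time (e.g. a retrieval result list).
    queue.put('fast result')

def run_with_timeout(target, timeout):
    """Run one action in its own process; kill it if it exceeds the timeout."""
    queue = multiprocessing.Queue()
    proc = multiprocessing.Process(target=target, args=(queue,))
    proc.start()
    proc.join(timeout)
    if proc.is_alive():
        proc.terminate()   # analogous to the killed DrQA process
        proc.join()
        return None        # no result from this action, but others still respond
    return queue.get()

if __name__ == '__main__':
    print(run_with_timeout(fast_action, timeout=2))  # 'fast result'
    print(run_with_timeout(slow_action, timeout=1))  # None (timed out)
```

So the exception you see is just the timed-out process being torn down; the surviving actions' results are what gets shown to the user.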
About the second error, I do not recommend switching the tokenizer to CoreNLP. It has some problems with Python multiprocessing, as you can see in your error. The simple tokenizer should be sufficient in most cases.
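The reason the simple tokenizer sidesteps this is that it is pure Python with no external Java process, so it is safe to call from worker processes. A minimal sketch in the same spirit (an illustration only, not DrQA's actual `SimpleTokenizer` implementation):

```python
import re

def simple_tokenize(text):
    # Alphanumeric runs become tokens; everything else is a separator.
    # No subprocess or network connection is involved, unlike the CoreNLP
    # tokenizer, which talks to a Java process and can break under
    # Python multiprocessing.
    return re.findall(r"[A-Za-z0-9]+", text)

print(simple_tokenize("Who wrote Hamlet?"))  # ['Who', 'wrote', 'Hamlet']
```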
Finally, Figure 1 was produced by Macaw, but some of the models used to produce those responses are not part of this open-source project. Therefore, Figure 1 can potentially be replicated with Macaw, but the currently released models may not reproduce exactly the same responses.
I hope this helps. Please let me know if you have more questions.
Best, Hamed Zamani
I am trying to get Macaw to work, expecting results similar to Figure 1 (b) in the paper. Currently, I am working from a clean `ubuntu:bionic` Docker image, because it provides Python 3.6 by default and lets me install Java 8 (for Stanford CoreNLP). Long story short, I am able to run `python3 live_main.py` and arrive at the `ENTER COMMAND:` prompt with `stdio` as the interface.

Firstly, the default `simple` tokenizer as set in `drqa_mrc.py` causes the following error (but of course does not affect retrieval of a list of URLs from Bing): ... which makes sense, given the earlier warning from DrQA stating:

Switching to the `corenlp` tokenizer and re-running `python3 setup.py install` for the change to take effect results in the following output with the same query: If it's any help, I am able to use the DrQA interactive demo with the Stanford CoreNLP tokenizer without errors.

Aside from these errors, maybe I have misunderstood the capabilities or scope of the `live_main.py` demo? Would it be capable of an interaction similar to the one shown in Figure 1 (b) in the paper?

Thanks in advance for all assistance!