Closed: lemmikens closed this issue 6 years ago
I am not sure if that is your main issue, but your logs show nothing is being sent because every file is being rejected for not having been modified since the previous run. The Crawler does "incremental" indexing: on subsequent runs, only documents that are new, modified, or deleted are sent to your Committer.
To start fresh and forget about previous runs, delete your "workdir" and start again (or, at a minimum, delete the "crawlstore").
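Concretely, the reset described above amounts to something like the following (the paths here are examples; use whatever your configuration's work directory actually points to, or the Collector's default location if you did not set one):

```shell
# Stop the collector first so the crawl store is not in use,
# then remove the whole working directory to force a full recrawl:
rm -rf ./workdir

# Alternatively, keep logs/progress and only forget previous crawl
# state by deleting just the crawl store inside the workdir:
# rm -rf ./workdir/crawlstore
```

On the next run the Collector sees an empty work directory, so every document is treated as new and sent to the Committer again.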
Please give it a try and confirm.
Deleting the work directory portion of the XML did it! Thank you! It must have been trying to commit to that instead of Elasticsearch.
I was actually suggesting to delete the "workdir" directory, not the config entry. :-) Deleting it reverted to using the default location, which appears as a clean workdir to the Collector. That's fine for now, but if you need to do this again, you will have to delete the directory next time.
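For reference, the config entry in question is the work-directory element of the collector XML. A rough sketch of what was removed (the element name, collector id, and path below are illustrative only; check the configuration reference for your collector version):

```xml
<!-- Illustrative collector config fragment; id and path are made up. -->
<httpcollector id="my-collector">
  <!-- Removing this entry makes the Collector fall back to its default
       work directory, which happened to be empty here, hence the
       apparent fix. Deleting the directory on disk is what actually
       forces a full recrawl. -->
  <workDir>./my-workdir</workDir>
  <!-- ... crawlers, committer, etc. ... -->
</httpcollector>
```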
Hi, I've been messing around with the Elasticsearch Committer for a week or so, and for the life of me I can't get the Collector to commit to Elasticsearch. There is no error when I run the Collector, which makes it very difficult to troubleshoot... I have a feeling I'm just missing something small, but it could possibly be a bug.
Below is the XML config:
and here are the logs when I run it: