Closed: OkkeKlein closed this issue 9 years ago
Have you tried configuring your Solr Committer for retry?
<committer class="com.norconex.committer.solr.SolrCommitter">
...
<maxRetries>10</maxRetries>
<maxRetryWait>60000</maxRetryWait>
</committer>
The above will retry up to 10 times upon failure, waiting 1 minute between each attempt. This should give Solr enough time to warm up.
You might want to check the commitDisabled parameter in the Solr Committer config, which disables the sending of commit commands to the Solr server.
It might solve the problem, since the Solr server will then be in charge of the commits rather than the SolrJ client.
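For reference, disabling client-side commits might look like this: commitDisabled on the collector side, paired with Solr-side autoCommit in solrconfig.xml. The autoCommit/autoSoftCommit values below are illustrative examples, not taken from this thread:

```xml
<!-- Collector side: stop SolrJ from issuing explicit commits -->
<committer class="com.norconex.committer.solr.SolrCommitter">
  ...
  <commitDisabled>true</commitDisabled>
</committer>

<!-- Solr side (solrconfig.xml): let Solr commit on its own schedule -->
<autoCommit>
  <maxTime>60000</maxTime>            <!-- hard commit at most once a minute -->
  <openSearcher>false</openSearcher>  <!-- do not open a searcher on hard commits -->
</autoCommit>
<autoSoftCommit>
  <maxTime>30000</maxTime>            <!-- new searcher at most every 30 seconds -->
</autoSoftCommit>
```

With openSearcher=false on hard commits, only the soft commits open searchers, which keeps searcher warming down to a predictable rate.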
Here is a link to the feature request: https://github.com/Norconex/committer-solr/issues/4
Here is a link to the documentation of the commitDisabled tag: http://www.norconex.com/collectors/committer-solr/configuration
Will maxRetryWait work if I have 100+ batches that need deleting? Could be an option, but might also get messy.
Using Solr to handle the commits is an option, but there are scenarios where you want to control the opening of a new searcher.
The easiest way to prevent this is to add a delay before committing the next batch, in my opinion.
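A minimal sketch of such a client-side delay, assuming a hypothetical CommitThrottle helper (this is not part of the Committer API; it just illustrates a "minimum delay between commits" as suggested above):

```java
// Hypothetical helper: enforces a minimum delay between successive commits.
// Not part of com.norconex.committer.*; illustration only.
public class CommitThrottle {
    private final long minDelayMs;
    private long lastCommitMs = 0;

    public CommitThrottle(long minDelayMs) {
        this.minDelayMs = minDelayMs;
    }

    /** Blocks until at least minDelayMs has passed since the last commit. */
    public synchronized void awaitTurn() throws InterruptedException {
        long waitMs = lastCommitMs + minDelayMs - System.currentTimeMillis();
        if (waitMs > 0) {
            Thread.sleep(waitMs);
        }
        lastCommitMs = System.currentTimeMillis();
    }
}
```

Calling awaitTurn() before each batch commit would pace the searcher-opening commits, regardless of how many deletion batches pile up.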
Good idea. I am marking this as a feature request to add a flag for a "minimum delay between commits".
Created a new feature request here: https://github.com/Norconex/committer-solr/issues/6. Closing this one.
While testing, the collector issued a lot of deletion commands to Solr in a short period of time, so Solr didn't have enough time to warm a new searcher.
website: 2015-07-09 15:37:17 ERROR - website: Could not process document: http://www.XXX (Cannot index document batch to Solr.) com.norconex.committer.core.CommitterException: Cannot index document batch to Solr.
Caused by: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://XXX: Error opening new searcher. exceeded limit of maxWarmingSearchers=2, try again later.
Under normal circumstances this would not happen, so this issue is just to inform.
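For completeness, the limit named in the error comes from solrconfig.xml. Raising it is possible, though it usually just masks overly frequent commits rather than fixing them; the value below is illustrative:

```xml
<!-- solrconfig.xml: maximum number of searchers that may warm concurrently -->
<maxWarmingSearchers>4</maxWarmingSearchers>
```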