enonic / xp

Enonic XP
https://enonic.com
GNU General Public License v3.0

Dump error when repo has more than 10000 nodes #7124

Closed: ComLock closed this issue 5 years ago

ComLock commented 5 years ago

xpVersion = 7.0.0

Using the Data Toolbox app. The error occurred right after finishing:

enonic dump upgrade
enonic dump load

Log

13:46:16.330 ERROR s.r.enonic.datatoolbox.RcdScriptBean - Error while creating dump
com.enonic.xp.repo.impl.dump.RepoDumpException: Error occurred when dumping repository com.enonic.yase
    at com.enonic.xp.repo.impl.dump.RepoDumper.doExecute(RepoDumper.java:122)
    at com.enonic.xp.context.ContextImpl.callWith(ContextImpl.java:101)
    at com.enonic.xp.repo.impl.dump.RepoDumper.lambda$execute$0(RepoDumper.java:94)
    at java.base/java.lang.Iterable.forEach(Iterable.java:75)
    at com.enonic.xp.repo.impl.dump.RepoDumper.execute(RepoDumper.java:94)
    at com.enonic.xp.repo.impl.dump.DumpServiceImpl.dump(DumpServiceImpl.java:229)
    at systems.rcd.enonic.datatoolbox.RcdDumpScriptBean.lambda$create$2(RcdDumpScriptBean.java:186)
    at systems.rcd.enonic.datatoolbox.RcdScriptBean.runSafely(RcdScriptBean.java:46)
    at systems.rcd.enonic.datatoolbox.RcdScriptBean.runSafely(RcdScriptBean.java:39)
    at systems.rcd.enonic.datatoolbox.RcdDumpScriptBean.create(RcdDumpScriptBean.java:176)
    at jdk.scripting.nashorn.scripts/jdk.nashorn.internal.scripts.Script$Recompilation$956$499A$dump_create.L:1#post#task(systems.rcd.enonic.datatoolbox:/services/dump-create/dump-create.js:15)
    at jdk.nashorn.javaadapters.java_util_function_Function.apply(Unknown Source)
    at com.enonic.xp.lib.task.TaskWrapper.runTask(TaskWrapper.java:38)
    at com.enonic.xp.task.TaskProgressReporterContext.lambda$withContext$0(TaskProgressReporterContext.java:36)
    at com.enonic.xp.lib.task.TaskWrapper.run(TaskWrapper.java:31)
    at com.enonic.xp.impl.task.TaskWrapper.lambda$callTaskWithContext$0(TaskWrapper.java:90)
    at com.enonic.xp.context.ContextImpl.callWith(ContextImpl.java:101)
    at com.enonic.xp.impl.task.TaskWrapper.callTaskWithContext(TaskWrapper.java:89)
    at com.enonic.xp.impl.task.TaskWrapper.doRun(TaskWrapper.java:75)
    at com.enonic.xp.impl.task.TaskWrapper.run(TaskWrapper.java:59)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: com.enonic.xp.repository.IndexException: Search request failed after [30s], query: [{
  "from" : 10000,
  "size" : 5000,
  "query" : {
    "wildcard" : {
      "_parentpath" : "/*"
    }
  },
  "explain" : false,
  "sort" : [ {
    "_path._orderby" : {
      "order" : "asc",
      "ignore_unmapped" : true
    }
  } ]
}]
    at com.enonic.xp.repo.impl.elasticsearch.executor.AbstractExecutor.doSearchRequest(AbstractExecutor.java:66)
    at com.enonic.xp.repo.impl.elasticsearch.executor.SearchExecutor.doSearch(SearchExecutor.java:77)
    at com.enonic.xp.repo.impl.elasticsearch.executor.SearchExecutor.execute(SearchExecutor.java:63)
    at com.enonic.xp.repo.impl.elasticsearch.search.SearchDaoImpl.search(SearchDaoImpl.java:23)
    at com.enonic.xp.repo.impl.search.NodeSearchServiceImpl.doQuery(NodeSearchServiceImpl.java:57)
    at com.enonic.xp.repo.impl.search.NodeSearchServiceImpl.query(NodeSearchServiceImpl.java:40)
    at com.enonic.xp.repo.impl.node.FindNodeIdsByParentCommand.execute(FindNodeIdsByParentCommand.java:79)
    at com.enonic.xp.repo.impl.node.NodeServiceImpl.executeFindByParent(NodeServiceImpl.java:285)
    at com.enonic.xp.repo.impl.node.NodeServiceImpl.findByParent(NodeServiceImpl.java:254)
    at com.enonic.xp.repo.impl.node.executor.BatchedGetChildrenExecutor.execute(BatchedGetChildrenExecutor.java:55)
    at com.enonic.xp.repo.impl.dump.RepoDumper.dumpBranch(RepoDumper.java:154)
    at com.enonic.xp.repo.impl.dump.RepoDumper.doExecute(RepoDumper.java:118)
    ... 24 common frames omitted
Caused by: org.elasticsearch.action.search.SearchPhaseExecutionException: all shards failed
    at org.elasticsearch.action.search.AbstractSearchAsyncAction.onFirstPhaseResult(AbstractSearchAsyncAction.java:206)
    at org.elasticsearch.action.search.AbstractSearchAsyncAction$1.onFailure(AbstractSearchAsyncAction.java:152)
    at org.elasticsearch.action.ActionListenerResponseHandler.handleException(ActionListenerResponseHandler.java:46)
    at org.elasticsearch.transport.TransportService$DirectResponseChannel.processException(TransportService.java:874)
    at org.elasticsearch.transport.TransportService$DirectResponseChannel.sendResponse(TransportService.java:852)
    at org.elasticsearch.transport.TransportService$4.onFailure(TransportService.java:389)
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:39)
    ... 3 common frames omitted
Caused by: org.elasticsearch.search.query.QueryPhaseExecutionException: Result window is too large, from + size must be less than or equal to: [10000] but was [15000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level parameter.
    at org.elasticsearch.search.internal.DefaultSearchContext.preProcess(DefaultSearchContext.java:212)
    at org.elasticsearch.search.query.QueryPhase.preProcess(QueryPhase.java:103)
    at org.elasticsearch.search.SearchService.createContext(SearchService.java:689)
    at org.elasticsearch.search.SearchService.createAndPutContext(SearchService.java:633)
    at org.elasticsearch.search.SearchService.executeFetchPhase(SearchService.java:472)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryFetchTransportHandler.messageReceived(SearchServiceTransportAction.java:392)
    at org.elasticsearch.search.action.SearchServiceTransportAction$SearchQueryFetchTransportHandler.messageReceived(SearchServiceTransportAction.java:389)
    at org.elasticsearch.transport.TransportRequestHandler.messageReceived(TransportRequestHandler.java:33)
    at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:77)
    at org.elasticsearch.transport.TransportService$4.doRun(TransportService.java:378)
    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
    ... 3 common frames omitted
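The root cause is visible in the last exception: Elasticsearch rejects any from/size page where `from + size` exceeds `index.max_result_window` (10,000 by default since ES 2.1). The dumper pages through children in batches of 5,000, so the third batch asks for `from=10000, size=5000`, a window of 15,000. A minimal sketch of that arithmetic (class and method names are illustrative, not from the XP codebase):

```java
// Sketch of why batched from/size paging trips the ES result-window limit.
public class ResultWindowDemo {
    static final int MAX_RESULT_WINDOW = 10_000; // ES default since 2.1
    static final int BATCH_SIZE = 5_000;         // batch size seen in the failing query

    // Returns the first 'from' offset at which a from/size page is rejected,
    // or -1 if every page fits inside the result window.
    static int firstRejectedOffset(int totalNodes) {
        for (int from = 0; from < totalNodes; from += BATCH_SIZE) {
            // ES requires from + size <= max_result_window; beyond that it
            // throws QueryPhaseExecutionException, as in the log above.
            if (from + BATCH_SIZE > MAX_RESULT_WINDOW) {
                return from;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // 12000 nodes: pages at from=0 and from=5000 succeed; the page at
        // from=10000 asks for window 15000 > 10000 and fails.
        System.out.println(firstRejectedOffset(12_000)); // prints 10000
        // 8000 nodes: the deepest page ends exactly at 10000, which is allowed.
        System.out.println(firstRejectedOffset(8_000));  // prints -1
    }
}
```

This is why the failure only appears once a repo grows past 10,000 nodes: shallower repos never page past the window.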
GlennRicaud commented 5 years ago

Yet another breaking change not documented as such by the Elasticsearch team: https://www.elastic.co/guide/en/elasticsearch/reference/2.4/index-modules.html

GlennRicaud commented 5 years ago

We should increase this new limit for now and release a new version 7.0.1 (same behaviour as Enonic XP 6, in theory).
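`index.max_result_window` is a dynamic per-index setting, so raising it takes a settings update along these lines (a sketch only: the index name and the value 50000 are illustrative, and the actual index name XP uses for this repository may differ):

```
PUT /com.enonic.yase/_settings
{
  "index.max_result_window": 50000
}
```

Note this only moves the ceiling; memory cost of deep from/size paging still grows with the offset, which is why scroll is the proper fix.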

In parallel, we should start using scroll search requests in the places where we page through a large set of data (dumps, for example; exports too, when a parent has many children).
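For reference, the failing query from the log could be reshaped as a scroll request roughly as follows (a sketch assuming the ES 2.x API bundled with XP 7.0; the index name is illustrative, and `1m` is the keep-alive for the scroll context between batches):

```
POST /com.enonic.yase/_search?scroll=1m
{
  "size": 5000,
  "query": { "wildcard": { "_parentpath": "/*" } },
  "sort": [ { "_path._orderby": { "order": "asc", "ignore_unmapped": true } } ]
}

POST /_search/scroll
{
  "scroll": "1m",
  "scroll_id": "<_scroll_id from the previous response>"
}
```

Each `_search/scroll` call returns the next batch of 5,000 hits without ever specifying a `from` offset, so no request can exceed `max_result_window` regardless of repo size.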