Open ShinjiLE opened 12 months ago
Good to know somebody ~~solved it~~ worked around the problem. I went the other way round and increased the limit in the index configuration:
```shell
curl -XPUT "http://127.0.0.1:9200/${INDEX}/_settings" -H 'Content-Type: application/json' -d '{"index":{"highlight.max_analyzed_offset":1000000000}}'
```
This has the benefit of requiring no source modification: Nextcloud's self-diagnostic does not complain about an integrity check failure, and the next update won't overwrite the change.
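To confirm that the new limit actually took effect (for example after the index has been recreated), you can read the live settings back. This sketch assumes the same host, port, and `$INDEX` variable as the PUT above:

```shell
# Read back only the highlight-related index settings (filter_path trims the response)
curl -s "http://127.0.0.1:9200/${INDEX}/_settings?filter_path=*.settings.index.highlight*"
```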
Still, in the long run, neither solution is a real fix. I remember a time (maybe before Elasticsearch, I don't recall exactly) when the Fulltextsearch app had a setting to skip indexing documents above a given size limit. Maybe that setting is considered outdated, given the sheer size of today's documents, but the proper solution would be for the size of documents being indexed and the size limits of documents being searched to correspond somehow.
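The missing "skip documents above a given limit" logic could be sketched roughly like this. `MAX_BYTES` is an assumed threshold chosen to mirror `highlight.max_analyzed_offset`; it is not an actual Nextcloud or Fulltextsearch setting name:

```shell
# Assumed threshold mirroring highlight.max_analyzed_offset (not a real setting name)
MAX_BYTES=1000000

should_index() {
    # $1: document size in bytes; succeeds when the file is small enough
    # to stay under the highlighter's analyzed-offset limit
    [ "$1" -le "$MAX_BYTES" ]
}

should_index 4096 && echo "index it"
should_index 50000000 || echo "skip it"
```

If indexing and highlighting limits were tied together this way, an oversized document would simply never reach the highlighter.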
I'm getting 0 search results in version 27.0.4, but no errors whatsoever in the browser console or the Nextcloud log. I've tried tweaking both settings, by editing SearchMappingService.php as well as `highlight.max_analyzed_offset` in the Elasticsearch configuration, but to no avail. Reverting the plugin to version 27.0.2 brings the search results back. I wish I could be of more help, but there seems to be something more going on with this regression.
Update: in addition to increasing `max_analyzed_offset`, resetting and then reindexing ultimately fixed it for me. Perhaps something in the index was corrupted in a way that didn't cause an issue in 27.0.2 but did in 27.0.4.
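For anyone wanting to reproduce the reset-and-reindex step, the Fulltextsearch app ships `occ` commands for both. This assumes a typical install at `/var/www/nextcloud` running as `www-data`; adjust the path and user for your setup:

```shell
# Wipe the existing full-text index, then rebuild it from scratch
sudo -u www-data php /var/www/nextcloud/occ fulltextsearch:reset
sudo -u www-data php /var/www/nextcloud/occ fulltextsearch:index
```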
I have experienced this problem with big PDFs. If the search finds some of these documents, the results page shows no results at the first occurrence of such an item. The log shows:
I worked around this by adding

```php
'max_analyzed_offset' => '999999',
```

in `SearchMappingService.php`, inside `private function generateSearchHighlighting(ISearchRequest $request): array`, above the `'fields' => $fields,` line.