Closed markharwood closed 9 years ago
We currently handle lists of terms as an automaton just like we handle regular expressions for simplicity. I guess we could specialize the terms list case if we want large lists to work.
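Since exclude lists are compiled into an automaton the same way regular expressions are, and Lucene caps how much work determinization may do (raising `TooComplexToDeterminizeException` when the cap is hit), a toy Python sketch of that mechanism may help illustrate why a larger terms list trips the cap. This is not the actual Lucene/Elasticsearch code; the names and the cap value are illustrative:

```python
# Toy sketch (not the real Lucene code) of determinizing a union-of-terms
# NFA with a state cap, loosely mirroring TooComplexToDeterminizeException.
from collections import defaultdict


class TooComplexToDeterminizeError(Exception):
    """Raised when subset construction exceeds the state cap."""


def build_union_nfa(terms):
    """Union of one linear state chain per term, all starting at state 0."""
    delta = defaultdict(set)  # (state, char) -> set of next states
    next_state = 1
    for term in terms:
        prev = 0
        for ch in term:
            delta[(prev, ch)].add(next_state)
            prev = next_state
            next_state += 1
    return delta


def determinize(delta, max_states):
    """Subset construction; raises once the DFA would exceed max_states."""
    start = frozenset({0})
    seen = {start}
    stack = [start]
    while stack:
        subset = stack.pop()
        chars = {ch for (s, ch) in delta if s in subset}
        for ch in chars:
            target = frozenset(t for s in subset for t in delta.get((s, ch), ()))
            if target and target not in seen:
                if len(seen) + 1 > max_states:
                    raise TooComplexToDeterminizeError(
                        f"more than {max_states} DFA states")
                seen.add(target)
                stack.append(target)
    return len(seen)
```

A two-term list determinizes into a handful of states, while 86 medium-length terms produce far more distinct DFA states than a small cap allows, so the call raises. The real fix discussed here is to specialize the plain terms-list case rather than routing it through the regex machinery.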
Per @jpountz, this error only affects the master branch. Just want to confirm that it will not make it into 1.6 when that gets released, since we already have users using exact-value excludes with a large number of values :)
@markharwood cool thx!
A `terms` agg with an `exclude` array of medium size (in my case, 86 strings) was sufficient to cause this error. Example query:
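The original query body was not preserved here; the sketch below shows the general shape being described (a sampler-wrapped `terms` agg with a medium-sized `exclude` array), with the field name and exclude values made up for illustration:

```json
{
  "aggs": {
    "sample": {
      "sampler": { "shard_size": 100 },
      "aggs": {
        "new_links": {
          "terms": {
            "field": "links",
            "exclude": ["already_seen_1", "already_seen_2", "... roughly 86 values in total ..."]
          }
        }
      }
    }
  }
}
```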
I can see how consulting large sets of terms can be expensive and we might want to cap it, but in the above case this risk is mitigated through the use of the `sampler` agg to consider a maximum of 100 top-matching docs.
The use case for this type of query is looking for new terms outside of a set that has already been gathered by the client, e.g. aiding graph exploration by looking for connections beyond what you have already collected.
These sorts of exclude lists can grow large, so a cap of around 85 terms seems low for this use case.