Open joepio opened 1 year ago
For reference, there's a discord chat about this issue.
If you escape the special characters in the key, and use double quotes for the value, you can get it working:
```rust
let q_escaped_colons = r#"https\://example.com/test/bla:"https\://examplevalue.com/test""#;
let res = make_query_parser().parse_query(q_escaped_colons).unwrap();
```
It leads to a very ugly query, but it works!
Update 2: I also needed to escape the dots, as these are interpreted as paths.
So:
```
https\://example\.com:query
```
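These per-character escapes are easy to get wrong by hand. As a minimal sketch (a hypothetical helper, not part of tantivy's API), backslash-escaping the `:` and `.` characters discussed above could look like this:

```rust
/// Hypothetical helper (not provided by tantivy): backslash-escape the
/// characters the query parser treats specially in a JSON field path,
/// namely the `:` key/value separator and the `.` path separator.
fn escape_for_query(raw: &str) -> String {
    let mut out = String::with_capacity(raw.len());
    for c in raw.chars() {
        if c == ':' || c == '.' {
            out.push('\\');
        }
        out.push(c);
    }
    out
}

fn main() {
    // Produces the escaped form shown above.
    println!("{}", escape_for_query("https://example.com"));
}
```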
Have you tried just quoting?
`url:"http://www..."`
Quotes are usually used for phrase queries, but they can actually be used here. The parser will take your URL as a whole and send it to the tokenizer.
If you get an error regarding the lack of positions let me know.
@fulmicoton quotes work for the value, but not for the key. Keys require escaping with `\`, at least for the `:` and `.` characters.
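Putting the two observations together (escape the key, quote the value), a query-building helper could look like the following sketch. The function name and the exact set of escaped characters are my assumptions based on this thread, not tantivy's API:

```rust
/// Hypothetical sketch of the working pattern reported above:
/// backslash-escape `:` and `.` in the key, and double-quote the value
/// (the value reportedly needs no backslash escaping once quoted).
fn build_filter(key: &str, value: &str) -> String {
    let escaped_key = key.replace(':', "\\:").replace('.', "\\.");
    format!("{escaped_key}:\"{value}\"")
}

fn main() {
    println!(
        "{}",
        build_filter("https://example.com", "https://examplevalue.com/test")
    );
}
```

The resulting string would then be handed to `QueryParser::parse_query` as in the snippet earlier in this thread.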
Describe the bug

I have a `json` field and want to filter through these objects. However, my JSON has HTTP URLs in its keys and values. Say we want to do a query filter that does the equivalent of `some_key:some_value`:

```
https://example.com/some_key:https://example.com/some_value
```

Parsing these URLs using `QueryParser::parse_query` creates issues, because the parser treats the `:` in the URL as a key-value separator. Apparently you can escape special characters using `\`, but that didn't seem to work. So I think this feature is bugged, or perhaps I misunderstood it. Either way, it's not documented as far as I can see.
UPDATE: It does work if you escape the key and use double quotes `""` around the value.

Preferred solution
I suppose using quotes inside these filters (for keys and values) would be nice. It would be simpler for clients to implement than escaping a list of special characters.
e.g.:
```
"https://example.com/some?complex&url":"http://example.com/1255"
```
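For illustration, here is a sketch of the proposed behavior: split a filter on the first `:` that sits outside double quotes. This is not tantivy's actual parser, just a toy version of the rule being requested:

```rust
/// Toy illustration of the proposed quoting rule (not tantivy's parser):
/// split a `key:value` filter on the first `:` outside double quotes.
fn split_filter(input: &str) -> Option<(&str, &str)> {
    let mut in_quotes = false;
    for (i, c) in input.char_indices() {
        match c {
            '"' => in_quotes = !in_quotes,
            // A colon inside quotes is part of the key or value, not a separator.
            ':' if !in_quotes => return Some((&input[..i], &input[i + 1..])),
            _ => {}
        }
    }
    None
}

fn main() {
    let (key, value) =
        split_filter(r#""https://example.com/some?complex&url":"http://example.com/1255""#)
            .unwrap();
    println!("key = {key}\nvalue = {value}");
}
```

With this rule, clients would only need to wrap keys and values in quotes instead of escaping each special character.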
If allowing double-quote escapes is implemented, this should not panic with a `SyntaxError`.

Which version of tantivy are you using? 0.19.1
And just because it's not said enough: thank you so much for maintaining this awesome library!