Closed: wwang500 closed this issue 6 months ago
While I haven't got to the root cause of this issue, I have obtained what may be something of a clue.

Additional debug messages were printed from the `CJsonOutputStreamWrapper` class by adding an extra member

```cpp
std::ostringstream m_DebugStream;
```

writing to it in addition to the `ostream` in `m_ConcurrentOutputStream`, and then logging the contents of `m_DebugStream` in `CJsonOutputStreamWrapper::flush`.
This does produce a lot of noise, but one thing of particular interest is the `compressed_inference_model` string. Copying the contents of that string to a file, say `cim.txt`, and then decoding it with

```shell
base64 -d -i cim.txt -o - | gunzip -c
```

gives the raw JSON document. Attempting to validate this document with, say, `jq` gives an error:

```
parse error: Invalid escape at line 1, column 359
```
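For anyone wanting to reproduce the decode step, the pipeline can be sanity-checked with a round trip. Note that `-i`/`-o` are the macOS `base64` flags; GNU coreutils `base64` reads the file directly and writes to stdout, which is the variant sketched here:

```shell
# Round-trip check: gzip + base64-encode a small JSON document, then decode
# it the same way the compressed_inference_model string is decoded.
printf '{"ok": true}' | gzip -c | base64 > cim.txt
base64 -d cim.txt | gunzip -c
```

Piping the decoded output into `jq .` (as above) will then surface any invalid escape sequences.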
Using an IDE such as CLion to format the JSON shows that there are unescaped special characters in the keys, e.g.

```
"contains\ and a /": 0.050394797956339991,
```

Here, the backslash needs escaping.
I think a solution may be to use `boost::json::serialize` on the keys, in a similar way to how it is done for string values.
> I think a solution may be to use `boost::json::serialize` on the keys, in a similar way to how it is done for string values.
Yes, that sounds highly likely, especially as the failing test is called "special_characters_2019".
It looks like both these places need changing:
and:
They both need to be more like:
Version
Latest 8.13.0-SNAPSHOT
Steps to reproduce

1. Restore the `special_characters_2019` dataset from our qa snapshot bucket
2. Create a DFA job with the config:
3. Start job
Observed:

The job will fail with the message:

```
Updated analytics task state to [failed] with reason [[dfa_classification_special_char_2019_1707902226_000_0] failed running inference on model [dfa_classification_special_char_2019_1707902226_000_0-1707920342557]; cause was [failed to load model [dfa_classification_special_char_2019_1707902226_000_0-1707920342557] definition]]
```

In the ES log: