I submitted the StormEvents.csv file (available as a sample in the Data Explorer docs) to test ingestion, and the ingestion failed.
Steps:
1. Install the connector and start the Kafka Connect worker.
2. Start the connector job (a sample configuration sketch follows these steps).
3. Download the StormEvents.csv file locally.
4. Pipe its contents to the Kafka topic using the console producer, e.g.:
   kafka-console-producer.sh --broker-list localhost:9092 --topic my_topic --new-producer < StormEvents.csv

This should trigger the ingestion process.
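For reference, a minimal sink configuration for this setup would look roughly like the sketch below. This is illustrative only: the property names follow the kafka-sink-azure-kusto connector's README for the 1.x-era releases and may differ in other versions, and the cluster and credential values are placeholders (the database and table names are taken from the blob name in the error further down).

    # Illustrative sink config for the kafka-sink-azure-kusto connector.
    # Property names follow the connector's README circa v1.x and may
    # differ in other versions; credential/cluster values are placeholders.
    name=storm-events-sink
    connector.class=com.microsoft.azure.kusto.kafka.connect.sink.KustoSinkConnector
    tasks.max=1
    topics=my_topic
    kusto.url=https://ingest-<cluster>.kusto.windows.net
    kusto.tables.topics.mapping=[{'topic': 'my_topic', 'db': 'testkustodb', 'table': 'StormEvents', 'format': 'csv'}]
    aad.auth.appid=<application-id>
    aad.auth.appkey=<application-key>
    aad.auth.authority=<tenant-id>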
In Data Explorer, .show ingestion failures returned the following error:
Stream_WrongNumberOfFields: Stream with id '19409a9d7dcb455e8692bad50d4d0b5e-httpsrldabhishgudatax00.blob.core.windows.netch7-20200817-temp-e5c334ee145d4b43a3a2d3a96fbac1dftestkustodb__StormEvents__7fc3b904-0702-4e86-a578-3516e06d34fb__kafka_storm-events_0_9184.csv.gz.csv.gz' has a malformed Csv format, failing per ValidationOptions policy with errorCode='0x80DA0008 E_WRONG_NUMBER_OF_FIELDS'
Additional information: HRESULT=0x80da0008
Record=1
(E_WRONG_NUMBER_OF_FIELDS)
Validator=struct Kusto::Csv::CsvFormatValidatingParserTraits
Fragment=
Kusto::Csv::Parser<>.PrepareFields: CSV has an inconsistent number of fields per line: -- Offending record: 2 (start position in stream: 17), fieldsCount: 2, currentRecordFieldCount: 2, record: ""Description"": """",
[end record]
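E_WRONG_NUMBER_OF_FIELDS means a record reached the service with a field count that differs from the rest of the stream. One plausible cause (my assumption, not verified): kafka-console-producer sends one message per input line, so any quoted CSV field containing an embedded newline is split across messages and arrives as a malformed fragment, consistent with the '""Description"": ""' fragment above. A minimal local check in Python, assuming StormEvents.csv is in the current directory:

    import csv

    # Use the first record as the baseline field count, then flag any
    # record with a different count, and any record whose quoted fields
    # contain embedded newlines (such records span multiple physical
    # lines and would be split by a line-oriented console producer).
    with open("StormEvents.csv", newline="") as f:
        reader = csv.reader(f)
        first = next(reader)
        expected = len(first)
        for recno, row in enumerate(reader, start=2):
            if len(row) != expected:
                print(f"record {recno}: {len(row)} fields, expected {expected}")
            if any("\n" in field for field in row):
                print(f"record {recno}: quoted field with embedded newline")

If this reports records with embedded newlines, the file cannot be safely produced line by line with the console producer.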