[2024/01/19 13:57:13.866 +00:00] [ERROR] [snapshot.go:125] ["Failed to load snapshot data into data warehouse"] [table=database.table] [error="Bigquery load snapshot job completed with error: {Location: \"\"; Message: \"Error while reading data, error message: CSV processing encountered too many errors, giving up.
Error in bigquery logs:
Error while reading data, error message: Bad character (ASCII 0) encountered.; line_number: 59829 byte_offset_to_start_of_line: 8544220 column_index: 1 column_name: "text" column_type: STRING value: "Parser exit code ..."
Fixed by removing the bad characters (NUL bytes, ASCII 0) from the CSV before loading.
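A minimal sketch of such a cleanup step, stripping the NUL bytes that BigQuery's CSV parser rejects. The file names and the helper name `clean_csv` are placeholders, not part of the tool:

```python
def clean_csv(src: str, dst: str) -> None:
    """Copy a CSV file, dropping NUL bytes (ASCII 0) that break BigQuery loads."""
    # Work on raw bytes so no decoding error is triggered by the bad characters.
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        fout.write(fin.read().replace(b"\x00", b""))
```

For large snapshot files the same replacement could be done in chunks instead of reading the whole file into memory.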
Perhaps it would be beneficial to add a feature to the tool that validates the exported CSV and strips invalid characters before loading?