Description of changes:
The CustomSQL Analyzer test was failing with Spark 3.3.0 with the error:
org.scalatest.exceptions.TestFailedException: "Column 'foo' does not exist. Did you mean one of the following? [primary.Country, primary.Address Line 1, primary.Address Line 2, primary.Address Line 3]; line 1 pos 7;
while other versions (3.1.0, 3.4.0) fail with:
"cannot resolve 'foo' given input columns: [primary.Address Line 1, primary.Address Line 2, primary.Address Line 3, primary.Country]; line 1 pos 7;
The test was checking that `foo` (with backticks) appeared in the exception message, but in Spark 3.3.0 the message quotes the column as 'foo' (single quotes) instead.
This change modifies the test so that it passes on all versions.
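A minimal sketch of the idea behind the fix (hypothetical names, not the actual deequ test code): instead of asserting on a quoted form of the column name, assert only on the bare column name, which is present in the message regardless of whether Spark renders it as `foo` or 'foo'.

```scala
// Sketch: a version-tolerant check on a Spark analysis error message.
// `mentionsColumn` is a hypothetical helper, not part of deequ or Spark.
object ExceptionMessageCheck {

  // Match the bare column name so quoting style does not matter.
  def mentionsColumn(message: String, column: String): Boolean =
    message.contains(column)

  def main(args: Array[String]): Unit = {
    // Example messages modeled on the two Spark variants quoted above.
    val spark330Message =
      "Column 'foo' does not exist. Did you mean one of the following?"
    val otherVersionsMessage =
      "cannot resolve 'foo' given input columns: [primary.Country]"

    // Both variants pass the same assertion.
    assert(mentionsColumn(spark330Message, "foo"))
    assert(mentionsColumn(otherVersionsMessage, "foo"))
    println("ok")
  }
}
```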
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.