vertica / spark-connector

This component acts as a bridge between Spark and Vertica, allowing the user to either retrieve data from Vertica for processing in Spark, or store processed data from Spark into Vertica.
Apache License 2.0

Adding a dependency exclusion on Hadoop's jersey-server to fix a Spark UI issue #524

Closed ai-bq closed 1 year ago

ai-bq commented 1 year ago

Summary

An issue was raised where the Spark Worker UI renders a blank page and returns a stack trace when opening the Executors tab while a job is running.

Description

It appears that the connector's two dependencies, Spark and Hadoop, pull in different packages of jersey-server. Excluding Hadoop's copy fixes the issue without side effects; a sketch of what such an exclusion looks like follows below.
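For context, a minimal sketch of this kind of exclusion in build.sbt is shown here. The specific Hadoop artifact, the versions, and the com.sun.jersey organization for Hadoop's Jersey copy are illustrative assumptions, not taken from the PR diff:

```scala
// build.sbt (sketch) -- keep only Spark's jersey-server on the classpath
// by excluding the copy pulled in transitively through Hadoop.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.3.0" % "provided",
  // Assumed Hadoop artifact and version; the exclusion drops its conflicting jersey-server
  ("org.apache.hadoop" % "hadoop-client" % "3.3.2")
    .exclude("com.sun.jersey", "jersey-server")
)
```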

I've also added the ability to run sbt dependencyTree so we can inspect dependencies more closely in the future.
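As a general note (not necessarily the exact change in this PR), on sbt 1.4+ the full dependencyTree task is typically enabled by the bundled dependency-tree plugin in project/plugins.sbt:

```scala
// project/plugins.sbt (sketch) -- enables the full `dependencyTree` task on sbt 1.4+
addDependencyTreePlugin
```

With that in place, running `sbt dependencyTree` prints the resolved dependency graph, which makes duplicate jersey-server entries easy to spot.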

Related Issue

Closes #522.

Additional Reviewers

@alexey-temnikov @alexr-bq @jonathanl-bq @jeremyp-bq

codecov[bot] commented 1 year ago

Codecov Report

Merging #524 (9aedb20) into main (7825401) will not change coverage. The diff coverage is n/a.

@@           Coverage Diff           @@
##             main     #524   +/-   ##
=======================================
  Coverage   87.58%   87.58%           
=======================================
  Files          44       44           
  Lines        2005     2005           
  Branches      122      122           
=======================================
  Hits         1756     1756           
  Misses        249      249           
| Flag | Coverage Δ |
| --- | --- |
| unittests | 87.58% <ø> (ø) |

Flags with carried forward coverage won't be shown.


ai-bq commented 1 year ago

Along with the standard GitHub checks, I ran the weekly and nightly tests on this branch. I also ran the functional tests locally across several Spark versions to confirm the Spark Worker UI works for each of them.