
Commit a8d128c

pan3793 authored and LuciferYang committed
[SPARK-54190][BUILD] Guava dependency governance
### What changes were proposed in this pull request?

Remove `connect.guava.version` and use the unified `guava.version`.

Strip the unused transitive dependencies of Guava. As mentioned in https://github.com/google/guava/wiki/UseGuavaInYourBuild:

> Guava has one dependency that is needed for linkage at runtime:
> com.google.guava:failureaccess:<version>

Remove the shaded Guava classes from the `spark-connect` jar (reuse the shaded Guava included in `spark-network-common`).

Fix the shading leaks of the `spark-connect-jvm-client` jar.

### Why are the changes needed?

1. Simplify Guava dependency management - Spark now uses a unified Guava version everywhere.

2. Reduce package size - the `spark-connect` jar becomes smaller.

before (master branch)
```
$ ll jars/spark-connect_2.13-4.2.0-SNAPSHOT.jar
-rw-r--r--  1 chengpan  staff    17M Nov  5 11:23 jars/spark-connect_2.13-4.2.0-SNAPSHOT.jar
```

after (this PR)
```
$ ll jars/spark-connect_2.13-4.2.0-SNAPSHOT.jar
-rw-r--r--  1 chengpan  staff    13M Nov  5 12:01 jars/spark-connect_2.13-4.2.0-SNAPSHOT.jar
```

3. Fix the shading leaks of the `spark-connect-jvm-client` jar.

before (master branch)
```
$ jar tf jars/connect-repl/spark-connect-client-jvm_2.13-4.2.0-SNAPSHOT.jar | grep '.class$' | grep -v 'org/apache/spark' | grep -v 'org/sparkproject' | grep -v 'META-INF'
javax/annotation/CheckForNull.class
javax/annotation/CheckForSigned.class
...
```

after (this PR)
```
$ jar tf jars/connect-repl/spark-connect-client-jvm_2.13-4.2.0-SNAPSHOT.jar | grep '.class$' | grep -v 'org/apache/spark' | grep -v 'org/sparkproject' | grep -v 'META-INF'
<no-output>
```

### Does this PR introduce _any_ user-facing change?

Yes, it reduces potential class conflict issues for users who use `spark-connect-jvm-client`.

### How was this patch tested?

Manually checked; see the sections above. Also manually tested the Connect Server and the Connect JVM client via BeeLine.

```
$ dev/make-distribution.sh --tgz --name guava -Pyarn -Pkubernetes -Phadoop-3 -Phive -Phive-thriftserver
$ cd dist
$ SPARK_NO_DAEMONIZE=1 sbin/start-connect-server.sh
```

```
$ SPARK_CONNECT_BEELINE=1 bin/beeline -u jdbc:sc://localhost:15002 -e "select 'Hello, Spark Connect!', version() as server_version;"
WARNING: Using incubator modules: jdk.incubator.vector
Connecting to jdbc:sc://localhost:15002
Connected to: Apache Spark Connect Server (version 4.2.0-SNAPSHOT)
Driver: Apache Spark Connect JDBC Driver (version 4.2.0-SNAPSHOT)
Error: Requested transaction isolation level REPEATABLE_READ is not supported (state=,code=0)
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
25/11/05 13:30:03 WARN Utils: Your hostname, H27212-MAC-01.local, resolves to a loopback address: 127.0.0.1; using 10.242.159.140 instead (on interface en0)
25/11/05 13:30:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
+------------------------+-------------------------------------------------+
| Hello, Spark Connect!  |                 server_version                  |
+------------------------+-------------------------------------------------+
| Hello, Spark Connect!  | 4.2.0 0ea7f55                                   |
+------------------------+-------------------------------------------------+
1 row selected (0.09 seconds)
Beeline version 2.3.10 by Apache Hive
Closing: 0: jdbc:sc://localhost:15002
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #52873 from pan3793/guava-govern.

Authored-by: Cheng Pan <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
1 parent c69e999 · commit a8d128c
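For readers who want to confirm the "unified Guava version" claim locally, a minimal check, assuming a checkout of this commit and a working Maven setup (`build/mvn` ships with Spark; the value shown is the property set in `pom.xml`):

```
# Evaluate the now-unified Guava version property from the root pom
$ build/mvn help:evaluate -Dexpression=guava.version -q -DforceStdout
33.4.0-jre
```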

File tree

10 files changed: +59 −494 lines changed


LICENSE-binary

Lines changed: 0 additions & 3 deletions
```diff
@@ -215,10 +215,8 @@ com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter
 com.google.code.findbugs:jsr305
 com.google.code.gson:gson
 com.google.crypto.tink:tink
-com.google.errorprone:error_prone_annotations
 com.google.flatbuffers:flatbuffers-java
 com.google.guava:guava
-com.google.j2objc:j2objc-annotations
 com.jamesmurty.utils:java-xmlbuilder
 com.ning:compress-lzf
 com.squareup.okhttp3:logging-interceptor
@@ -478,7 +476,6 @@ dev.ludovic.netlib:blas
 dev.ludovic.netlib:arpack
 dev.ludovic.netlib:lapack
 net.razorvine:pickle
-org.checkerframework:checker-qual
 org.typelevel:algebra_2.13:jar
 org.typelevel:cats-kernel_2.13
 org.typelevel:spire_2.13
```

assembly/pom.xml

Lines changed: 14 additions & 0 deletions
```diff
@@ -142,11 +142,25 @@
       Because we don't shade dependencies anymore, we need to restore Guava to compile scope so
       that the libraries Spark depend on have it available. We'll package the version that Spark
       uses which is not the same as Hadoop dependencies, but works.
+      As mentioned in https://github.com/google/guava/wiki/UseGuavaInYourBuild
+        Guava has one dependency that is needed for linkage at runtime:
+          com.google.guava:failureaccess:<version>
     -->
     <dependency>
       <groupId>com.google.guava</groupId>
       <artifactId>guava</artifactId>
       <scope>${hadoop.deps.scope}</scope>
+      <exclusions>
+        <exclusion>
+          <groupId>*</groupId>
+          <artifactId>*</artifactId>
+        </exclusion>
+      </exclusions>
+    </dependency>
+    <dependency>
+      <groupId>com.google.guava</groupId>
+      <artifactId>failureaccess</artifactId>
+      <scope>${hadoop.deps.scope}</scope>
     </dependency>
   </dependencies>
```
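The wildcard `<exclusions>` block strips every transitive dependency of Guava, and `failureaccess` is then re-added explicitly as the one artifact Guava needs at runtime. A quick way to see what remains, a sketch assuming a local build (the `assembly` module path is taken from this diff; expect only `guava` and `failureaccess` in the output):

```
# List Guava-group artifacts that the assembly module still pulls in
$ build/mvn -pl assembly dependency:tree -Dincludes=com.google.guava
```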

common/network-common/pom.xml

Lines changed: 5 additions & 0 deletions
```diff
@@ -123,6 +123,11 @@
       <artifactId>guava</artifactId>
       <scope>compile</scope>
     </dependency>
+    <dependency>
+      <groupId>com.google.guava</groupId>
+      <artifactId>failureaccess</artifactId>
+      <scope>compile</scope>
+    </dependency>
     <dependency>
       <groupId>org.apache.commons</groupId>
       <artifactId>commons-crypto</artifactId>
```
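Why `failureaccess` must stay on the compile classpath: Guava's `AbstractFuture` extends a class that ships in the separate `failureaccess` artifact, so dropping it causes `NoClassDefFoundError` at link time. An illustration with stock JDK tools, assuming locally cached jars (the repository paths here are illustrative):

```
# AbstractFuture's superclass is not in the guava jar itself;
# javap prints the declaration, which names the external supertype
$ javap -cp ~/.m2/repository/com/google/guava/guava/33.4.0-jre/guava-33.4.0-jre.jar \
    com.google.common.util.concurrent.AbstractFuture | head -n 2

# The supertype lives in failureaccess
$ unzip -l ~/.m2/repository/com/google/guava/failureaccess/1.0.2/failureaccess-1.0.2.jar \
    | grep InternalFutureFailureAccess
```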

dev/deps/spark-deps-hadoop-3-hive-2.3

Lines changed: 0 additions & 4 deletions
```diff
@@ -32,7 +32,6 @@ breeze-macros_2.13/2.1.0//breeze-macros_2.13-2.1.0.jar
 breeze_2.13/2.1.0//breeze_2.13-2.1.0.jar
 bundle/2.29.52//bundle-2.29.52.jar
 cats-kernel_2.13/2.8.0//cats-kernel_2.13-2.8.0.jar
-checker-qual/3.43.0//checker-qual-3.43.0.jar
 chill-java/0.10.0//chill-java-0.10.0.jar
 chill_2.13/0.10.0//chill_2.13-0.10.0.jar
 commons-cli/1.10.0//commons-cli-1.10.0.jar
@@ -61,7 +60,6 @@ derby/10.16.1.1//derby-10.16.1.1.jar
 derbyshared/10.16.1.1//derbyshared-10.16.1.1.jar
 derbytools/10.16.1.1//derbytools-10.16.1.1.jar
 dropwizard-metrics-hadoop-metrics2-reporter/0.1.2//dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar
-error_prone_annotations/2.36.0//error_prone_annotations-2.36.0.jar
 esdk-obs-java/3.20.4.2//esdk-obs-java-3.20.4.2.jar
 failureaccess/1.0.2//failureaccess-1.0.2.jar
 flatbuffers-java/25.2.10//flatbuffers-java-25.2.10.jar
@@ -101,7 +99,6 @@ icu4j/77.1//icu4j-77.1.jar
 ini4j/0.5.4//ini4j-0.5.4.jar
 istack-commons-runtime/4.1.2//istack-commons-runtime-4.1.2.jar
 ivy/2.5.3//ivy-2.5.3.jar
-j2objc-annotations/3.0.0//j2objc-annotations-3.0.0.jar
 jackson-annotations/2.20//jackson-annotations-2.20.jar
 jackson-core/2.20.0//jackson-core-2.20.0.jar
 jackson-databind/2.20.0//jackson-databind-2.20.0.jar
@@ -184,7 +181,6 @@ lapack/3.0.4//lapack-3.0.4.jar
 leveldbjni-all/1.8//leveldbjni-all-1.8.jar
 libfb303/0.9.3//libfb303-0.9.3.jar
 libthrift/0.16.0//libthrift-0.16.0.jar
-listenablefuture/9999.0-empty-to-avoid-conflict-with-guava//listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar
 log4j-1.2-api/2.24.3//log4j-1.2-api-2.24.3.jar
 log4j-api/2.24.3//log4j-api-2.24.3.jar
 log4j-core/2.24.3//log4j-core-2.24.3.jar
```
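These manifests are pinned and checked in CI rather than hand-edited; the usual workflow after a dependency change is to regenerate them. A sketch assuming Spark's standard dev tooling:

```
# Regenerate the pinned dependency manifests after changing the poms
$ dev/test-dependencies.sh --replace-manifest

# Spot-check that the dropped annotation-only artifacts are really gone
$ grep -E 'checker-qual|error_prone_annotations|j2objc-annotations|listenablefuture' \
    dev/deps/spark-deps-hadoop-3-hive-2.3
$ # (no output expected)
```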

licenses-binary/LICENSE-check-qual.txt

Lines changed: 0 additions & 413 deletions
This file was deleted.

pom.xml

Lines changed: 7 additions & 2 deletions
```diff
@@ -199,6 +199,7 @@
     <commons-pool2.version>2.12.1</commons-pool2.version>
     <datanucleus-core.version>4.1.17</datanucleus-core.version>
     <guava.version>33.4.0-jre</guava.version>
+    <guava.failureaccess.version>1.0.2</guava.failureaccess.version>
     <gson.version>2.11.0</gson.version>
     <janino.version>3.1.9</janino.version>
     <jersey.version>3.0.18</jersey.version>
@@ -303,8 +304,6 @@
     <spark.test.docker.removePulledImage>true</spark.test.docker.removePulledImage>

     <!-- Version used in Connect -->
-    <connect.guava.version>33.4.0-jre</connect.guava.version>
-    <guava.failureaccess.version>1.0.2</guava.failureaccess.version>
     <io.grpc.version>1.76.0</io.grpc.version>
     <mima.version>1.1.4</mima.version>

@@ -621,6 +620,12 @@
       <version>${guava.version}</version>
       <scope>provided</scope>
     </dependency>
+    <dependency>
+      <groupId>com.google.guava</groupId>
+      <artifactId>failureaccess</artifactId>
+      <version>${guava.failureaccess.version}</version>
+      <scope>provided</scope>
+    </dependency>
     <dependency>
       <groupId>org.jpmml</groupId>
       <artifactId>pmml-model</artifactId>
```
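With `connect.guava.version` gone from the root pom, nothing in the build should still reference it; a simple check, assuming a git checkout of this commit:

```
# Any leftover reference to the removed property would be a bug
$ git grep -n 'connect.guava.version'
$ # (no output expected)
```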

project/SparkBuild.scala

Lines changed: 26 additions & 30 deletions
```diff
@@ -673,7 +673,7 @@ object SparkConnectCommon {
     libraryDependencies ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       val guavaFailureaccessVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
           "guava.failureaccess.version").asInstanceOf[String]
@@ -691,7 +691,7 @@
     dependencyOverrides ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       val guavaFailureaccessVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
           "guava.failureaccess.version").asInstanceOf[String]
@@ -759,7 +759,7 @@ object SparkConnect {
     libraryDependencies ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       val guavaFailureaccessVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
           "guava.failureaccess.version").asInstanceOf[String]
@@ -773,7 +773,7 @@
     dependencyOverrides ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       val guavaFailureaccessVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
           "guava.failureaccess.version").asInstanceOf[String]
@@ -791,34 +791,26 @@
     // Exclude `scala-library` from assembly.
     (assembly / assemblyPackageScala / assembleArtifact) := false,

-    // SPARK-46733: Include `spark-connect-*.jar`, `unused-*.jar`,`guava-*.jar`,
-    // `failureaccess-*.jar`, `annotations-*.jar`, `grpc-*.jar`, `protobuf-*.jar`,
-    // `gson-*.jar`, `error_prone_annotations-*.jar`, `j2objc-annotations-*.jar`,
-    // `animal-sniffer-annotations-*.jar`, `perfmark-api-*.jar`,
-    // `proto-google-common-protos-*.jar` in assembly.
+    // SPARK-46733: Include `spark-connect-*.jar`, `unused-*.jar`, `annotations-*.jar`,
+    // `grpc-*.jar`, `protobuf-*.jar`, `gson-*.jar`, `animal-sniffer-annotations-*.jar`,
+    // `perfmark-api-*.jar`, `proto-google-common-protos-*.jar` in assembly.
     // This needs to be consistent with the content of `maven-shade-plugin`.
     (assembly / assemblyExcludedJars) := {
       val cp = (assembly / fullClasspath).value
-      val validPrefixes = Set("spark-connect", "unused-", "guava-", "failureaccess-",
-        "annotations-", "grpc-", "protobuf-", "gson", "error_prone_annotations",
-        "j2objc-annotations", "animal-sniffer-annotations", "perfmark-api",
-        "proto-google-common-protos")
+      val validPrefixes = Set("spark-connect", "unused-", "annotations-",
+        "grpc-", "protobuf-", "gson", "animal-sniffer-annotations",
+        "perfmark-api", "proto-google-common-protos")
       cp filterNot { v =>
         validPrefixes.exists(v.data.getName.startsWith)
       }
     },

     (assembly / assemblyShadeRules) := Seq(
       ShadeRule.rename("io.grpc.**" -> "org.sparkproject.connect.grpc.@1").inAll,
-      ShadeRule.rename("com.google.common.**" -> "org.sparkproject.connect.guava.@1").inAll,
-      ShadeRule.rename("com.google.thirdparty.**" -> "org.sparkproject.connect.guava.@1").inAll,
       ShadeRule.rename("com.google.protobuf.**" -> "org.sparkproject.connect.protobuf.@1").inAll,
       ShadeRule.rename("android.annotation.**" -> "org.sparkproject.connect.android_annotation.@1").inAll,
       ShadeRule.rename("io.perfmark.**" -> "org.sparkproject.connect.io_perfmark.@1").inAll,
       ShadeRule.rename("org.codehaus.mojo.animal_sniffer.**" -> "org.sparkproject.connect.animal_sniffer.@1").inAll,
-      ShadeRule.rename("com.google.j2objc.annotations.**" -> "org.sparkproject.connect.j2objc_annotations.@1").inAll,
-      ShadeRule.rename("com.google.errorprone.annotations.**" -> "org.sparkproject.connect.errorprone_annotations.@1").inAll,
-      ShadeRule.rename("org.checkerframework.**" -> "org.sparkproject.connect.checkerframework.@1").inAll,
       ShadeRule.rename("com.google.gson.**" -> "org.sparkproject.connect.gson.@1").inAll,
       ShadeRule.rename("com.google.api.**" -> "org.sparkproject.connect.google_protos.api.@1").inAll,
       ShadeRule.rename("com.google.apps.**" -> "org.sparkproject.connect.google_protos.apps.@1").inAll,
@@ -852,7 +844,7 @@ object SparkConnectJdbc {
     libraryDependencies ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       Seq(
         "com.google.guava" % "guava" % guavaVersion,
         "com.google.protobuf" % "protobuf-java" % protoVersion % "protobuf"
@@ -861,7 +853,7 @@
     dependencyOverrides ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       Seq(
         "com.google.guava" % "guava" % guavaVersion,
         "com.google.protobuf" % "protobuf-java" % protoVersion
@@ -889,14 +881,17 @@
     // Exclude `scala-library` from assembly.
     (assembly / assemblyPackageScala / assembleArtifact) := false,

-    // Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`,`jsr305-*.jar` and
-    // `netty-*.jar` and `unused-1.0.0.jar` from assembly.
+    // Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`, `jsr305-*.jar`,
+    // `error_prone_annotations-*.jar`, `listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar`,
+    // `j2objc-annotations-*.jar`, `checker-qual-*.jar` and `unused-1.0.0.jar` from assembly.
     (assembly / assemblyExcludedJars) := {
       val cp = (assembly / fullClasspath).value
       cp filter { v =>
         val name = v.data.getName
         name.startsWith("pmml-model-") || name.startsWith("scala-collection-compat_") ||
-          name.startsWith("jsr305-") || name == "unused-1.0.0.jar"
+          name.startsWith("jsr305-") || name.startsWith("error_prone_annotations") ||
+          name.startsWith("listenablefuture") || name.startsWith("j2objc-annotations") ||
+          name.startsWith("checker-qual") || name == "unused-1.0.0.jar"
       }
     },
     // Only include `spark-connect-client-jdbc-*.jar`
@@ -913,7 +908,6 @@
       ShadeRule.rename("io.grpc.**" -> "org.sparkproject.connect.client.io.grpc.@1").inAll,
       ShadeRule.rename("com.google.**" -> "org.sparkproject.connect.client.com.google.@1").inAll,
       ShadeRule.rename("io.netty.**" -> "org.sparkproject.connect.client.io.netty.@1").inAll,
-      ShadeRule.rename("org.checkerframework.**" -> "org.sparkproject.connect.client.org.checkerframework.@1").inAll,
       ShadeRule.rename("io.perfmark.**" -> "org.sparkproject.connect.client.io.perfmark.@1").inAll,
       ShadeRule.rename("org.codehaus.**" -> "org.sparkproject.connect.client.org.codehaus.@1").inAll,
       ShadeRule.rename("android.annotation.**" -> "org.sparkproject.connect.client.android.annotation.@1").inAll
@@ -940,7 +934,7 @@ object SparkConnectClient {
     libraryDependencies ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       Seq(
         "com.google.guava" % "guava" % guavaVersion,
         "com.google.protobuf" % "protobuf-java" % protoVersion % "protobuf"
@@ -949,7 +943,7 @@
     dependencyOverrides ++= {
       val guavaVersion =
         SbtPomKeys.effectivePom.value.getProperties.get(
-          "connect.guava.version").asInstanceOf[String]
+          "guava.version").asInstanceOf[String]
       Seq(
         "com.google.guava" % "guava" % guavaVersion,
         "com.google.protobuf" % "protobuf-java" % protoVersion
@@ -977,22 +971,24 @@
     // Exclude `scala-library` from assembly.
     (assembly / assemblyPackageScala / assembleArtifact) := false,

-    // Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`,`jsr305-*.jar` and
-    // `netty-*.jar` and `unused-1.0.0.jar` from assembly.
+    // Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`, `jsr305-*.jar`,
+    // `error_prone_annotations-*.jar`, `listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar`,
+    // `j2objc-annotations-*.jar`, `checker-qual-*.jar` and `unused-1.0.0.jar` from assembly.
     (assembly / assemblyExcludedJars) := {
       val cp = (assembly / fullClasspath).value
       cp filter { v =>
         val name = v.data.getName
         name.startsWith("pmml-model-") || name.startsWith("scala-collection-compat_") ||
-          name.startsWith("jsr305-") || name == "unused-1.0.0.jar"
+          name.startsWith("jsr305-") || name.startsWith("error_prone_annotations") ||
+          name.startsWith("listenablefuture") || name.startsWith("j2objc-annotations") ||
+          name.startsWith("checker-qual") || name == "unused-1.0.0.jar"
       }
     },

     (assembly / assemblyShadeRules) := Seq(
       ShadeRule.rename("io.grpc.**" -> "org.sparkproject.connect.client.io.grpc.@1").inAll,
       ShadeRule.rename("com.google.**" -> "org.sparkproject.connect.client.com.google.@1").inAll,
       ShadeRule.rename("io.netty.**" -> "org.sparkproject.connect.client.io.netty.@1").inAll,
-      ShadeRule.rename("org.checkerframework.**" -> "org.sparkproject.connect.client.org.checkerframework.@1").inAll,
       ShadeRule.rename("io.perfmark.**" -> "org.sparkproject.connect.client.io.perfmark.@1").inAll,
       ShadeRule.rename("org.codehaus.**" -> "org.sparkproject.connect.client.org.codehaus.@1").inAll,
       ShadeRule.rename("android.annotation.**" -> "org.sparkproject.connect.client.android.annotation.@1").inAll
```

sql/connect/client/jdbc/pom.xml

Lines changed: 0 additions & 6 deletions
```diff
@@ -77,13 +77,11 @@
     <dependency>
       <groupId>com.google.guava</groupId>
       <artifactId>guava</artifactId>
-      <version>${connect.guava.version}</version>
       <scope>compile</scope>
     </dependency>
     <dependency>
       <groupId>com.google.guava</groupId>
       <artifactId>failureaccess</artifactId>
-      <version>${guava.failureaccess.version}</version>
       <scope>compile</scope>
     </dependency>
     <dependency>
@@ -177,10 +175,6 @@
       <pattern>io.netty</pattern>
       <shadedPattern>${spark.shade.packageName}.io.netty</shadedPattern>
     </relocation>
-    <relocation>
-      <pattern>org.checkerframework</pattern>
-      <shadedPattern>${spark.shade.packageName}.org.checkerframework</shadedPattern>
-    </relocation>
     <relocation>
       <pattern>io.perfmark</pattern>
       <shadedPattern>${spark.shade.packageName}.io.perfmark</shadedPattern>
```

sql/connect/client/jvm/pom.xml

Lines changed: 0 additions & 10 deletions
```diff
@@ -77,13 +77,11 @@
     <dependency>
       <groupId>com.google.guava</groupId>
       <artifactId>guava</artifactId>
-      <version>${connect.guava.version}</version>
       <scope>compile</scope>
     </dependency>
     <dependency>
       <groupId>com.google.guava</groupId>
       <artifactId>failureaccess</artifactId>
-      <version>${guava.failureaccess.version}</version>
       <scope>compile</scope>
     </dependency>
     <!--
@@ -178,18 +176,14 @@
       <include>com.google.guava:*</include>
       <include>com.google.android:*</include>
       <include>com.google.api.grpc:*</include>
-      <include>com.google.code.findbugs:*</include>
       <include>com.google.code.gson:*</include>
-      <include>com.google.errorprone:*</include>
-      <include>com.google.j2objc:*</include>
       <include>com.google.protobuf:*</include>
       <include>com.google.flatbuffers:*</include>
       <include>io.grpc:*</include>
       <include>io.netty:*</include>
       <include>io.perfmark:*</include>
       <include>org.apache.arrow:*</include>
       <include>org.codehaus.mojo:*</include>
-      <include>org.checkerframework:*</include>
       <include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
       <include>org.apache.spark:spark-sql-api_${scala.binary.version}</include>
     </includes>
@@ -221,10 +215,6 @@
       <pattern>io.netty</pattern>
       <shadedPattern>${spark.shade.packageName}.io.netty</shadedPattern>
     </relocation>
-    <relocation>
-      <pattern>org.checkerframework</pattern>
-      <shadedPattern>${spark.shade.packageName}.org.checkerframework</shadedPattern>
-    </relocation>
     <relocation>
       <pattern>io.perfmark</pattern>
       <shadedPattern>${spark.shade.packageName}.io.perfmark</shadedPattern>
```
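The trimmed `<includes>` list is what fixes the leak: annotation-only artifacts such as `org.checkerframework` are no longer bundled at all, so they need no relocation rule either. To reproduce the check from the commit message on a Maven build, a sketch assuming default profiles (the jar path pattern is illustrative):

```
$ build/mvn -pl sql/connect/client/jvm -am package -DskipTests
$ jar tf sql/connect/client/jvm/target/spark-connect-client-jvm_2.13-*.jar \
    | grep '.class$' | grep -v 'org/apache/spark' | grep -v 'org/sparkproject' | grep -v 'META-INF'
$ # (no output expected)
```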
