6 changes: 6 additions & 0 deletions .github/workflows/build_and_test.yml
@@ -1376,3 +1376,9 @@ jobs:
cd ui-test
npm install --save-dev
node --experimental-vm-modules node_modules/.bin/jest

maven-test:
permissions:
packages: write
name: Run
uses: ./.github/workflows/maven_test.yml
Member Author:
will revert after passing maven tests

3 changes: 0 additions & 3 deletions LICENSE-binary
@@ -215,10 +215,8 @@ com.github.joshelser:dropwizard-metrics-hadoop-metrics2-reporter
com.google.code.findbugs:jsr305
com.google.code.gson:gson
com.google.crypto.tink:tink
com.google.errorprone:error_prone_annotations
com.google.flatbuffers:flatbuffers-java
com.google.guava:guava
com.google.j2objc:j2objc-annotations
com.jamesmurty.utils:java-xmlbuilder
com.ning:compress-lzf
com.squareup.okhttp3:logging-interceptor
@@ -478,7 +476,6 @@ dev.ludovic.netlib:blas
dev.ludovic.netlib:arpack
dev.ludovic.netlib:lapack
net.razorvine:pickle
org.checkerframework:checker-qual
org.typelevel:algebra_2.13:jar
org.typelevel:cats-kernel_2.13
org.typelevel:spire_2.13
14 changes: 14 additions & 0 deletions assembly/pom.xml
@@ -142,11 +142,25 @@
Because we don't shade dependencies anymore, we need to restore Guava to compile scope so
that the libraries Spark depends on have it available. We'll package the version that Spark
uses, which is not the same as the Hadoop dependencies' version, but works.
As mentioned in https://github.com/google/guava/wiki/UseGuavaInYourBuild
Guava has one dependency that is needed for linkage at runtime:
com.google.guava:failureaccess:<version>
-->
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<scope>${hadoop.deps.scope}</scope>
<exclusions>
<exclusion>
<groupId>*</groupId>
<artifactId>*</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>failureaccess</artifactId>
<scope>${hadoop.deps.scope}</scope>
</dependency>
</dependencies>

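For context on why failureaccess has to ride along with Guava: since Guava 27, AbstractFuture (the base class behind Guava's future implementations) extends InternalFutureFailureAccess, which ships in the separate com.google.guava:failureaccess artifact. A minimal sketch (illustrative, not part of this PR) that would fail with NoClassDefFoundError at runtime if failureaccess were missing from the classpath:

import com.google.common.util.concurrent.{Futures, SettableFuture}

object FailureAccessLinkageCheck {
  def main(args: Array[String]): Unit = {
    // SettableFuture extends AbstractFuture, which extends
    // InternalFutureFailureAccess from com.google.guava:failureaccess,
    // so merely creating a future requires that jar on the classpath.
    val f: SettableFuture[String] = SettableFuture.create()
    f.set("ok")
    // Futures.getDone returns the value of an already-completed future.
    println(Futures.getDone(f))
  }
}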
5 changes: 5 additions & 0 deletions common/network-common/pom.xml
@@ -123,6 +123,11 @@
<artifactId>guava</artifactId>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>failureaccess</artifactId>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-crypto</artifactId>
4 changes: 0 additions & 4 deletions dev/deps/spark-deps-hadoop-3-hive-2.3
@@ -32,7 +32,6 @@ breeze-macros_2.13/2.1.0//breeze-macros_2.13-2.1.0.jar
breeze_2.13/2.1.0//breeze_2.13-2.1.0.jar
bundle/2.29.52//bundle-2.29.52.jar
cats-kernel_2.13/2.8.0//cats-kernel_2.13-2.8.0.jar
checker-qual/3.43.0//checker-qual-3.43.0.jar
chill-java/0.10.0//chill-java-0.10.0.jar
chill_2.13/0.10.0//chill_2.13-0.10.0.jar
commons-cli/1.10.0//commons-cli-1.10.0.jar
@@ -61,7 +60,6 @@ derby/10.16.1.1//derby-10.16.1.1.jar
derbyshared/10.16.1.1//derbyshared-10.16.1.1.jar
derbytools/10.16.1.1//derbytools-10.16.1.1.jar
dropwizard-metrics-hadoop-metrics2-reporter/0.1.2//dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar
error_prone_annotations/2.36.0//error_prone_annotations-2.36.0.jar
esdk-obs-java/3.20.4.2//esdk-obs-java-3.20.4.2.jar
failureaccess/1.0.2//failureaccess-1.0.2.jar
flatbuffers-java/25.2.10//flatbuffers-java-25.2.10.jar
@@ -101,7 +99,6 @@ icu4j/77.1//icu4j-77.1.jar
ini4j/0.5.4//ini4j-0.5.4.jar
istack-commons-runtime/4.1.2//istack-commons-runtime-4.1.2.jar
ivy/2.5.3//ivy-2.5.3.jar
j2objc-annotations/3.0.0//j2objc-annotations-3.0.0.jar
jackson-annotations/2.20//jackson-annotations-2.20.jar
jackson-core/2.20.0//jackson-core-2.20.0.jar
jackson-databind/2.20.0//jackson-databind-2.20.0.jar
@@ -184,7 +181,6 @@ lapack/3.0.4//lapack-3.0.4.jar
leveldbjni-all/1.8//leveldbjni-all-1.8.jar
libfb303/0.9.3//libfb303-0.9.3.jar
libthrift/0.16.0//libthrift-0.16.0.jar
listenablefuture/9999.0-empty-to-avoid-conflict-with-guava//listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar
log4j-1.2-api/2.24.3//log4j-1.2-api-2.24.3.jar
log4j-api/2.24.3//log4j-api-2.24.3.jar
log4j-core/2.24.3//log4j-core-2.24.3.jar
413 changes: 0 additions & 413 deletions licenses-binary/LICENSE-check-qual.txt

This file was deleted.

9 changes: 7 additions & 2 deletions pom.xml
@@ -199,6 +199,7 @@
<commons-pool2.version>2.12.1</commons-pool2.version>
<datanucleus-core.version>4.1.17</datanucleus-core.version>
<guava.version>33.4.0-jre</guava.version>
<guava.failureaccess.version>1.0.2</guava.failureaccess.version>
<gson.version>2.11.0</gson.version>
<janino.version>3.1.9</janino.version>
<jersey.version>3.0.18</jersey.version>
@@ -303,8 +304,6 @@
<spark.test.docker.removePulledImage>true</spark.test.docker.removePulledImage>

<!-- Version used in Connect -->
<connect.guava.version>33.4.0-jre</connect.guava.version>
<guava.failureaccess.version>1.0.2</guava.failureaccess.version>
<io.grpc.version>1.76.0</io.grpc.version>
<mima.version>1.1.4</mima.version>

@@ -621,6 +620,12 @@
<version>${guava.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>failureaccess</artifactId>
<version>${guava.failureaccess.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.jpmml</groupId>
<artifactId>pmml-model</artifactId>
56 changes: 26 additions & 30 deletions project/SparkBuild.scala
@@ -673,7 +673,7 @@ object SparkConnectCommon {
libraryDependencies ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
val guavaFailureaccessVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"guava.failureaccess.version").asInstanceOf[String]
@@ -691,7 +691,7 @@ object SparkConnectCommon {
dependencyOverrides ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
val guavaFailureaccessVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"guava.failureaccess.version").asInstanceOf[String]
@@ -759,7 +759,7 @@ object SparkConnect {
libraryDependencies ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
val guavaFailureaccessVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"guava.failureaccess.version").asInstanceOf[String]
@@ -773,7 +773,7 @@
dependencyOverrides ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
val guavaFailureaccessVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"guava.failureaccess.version").asInstanceOf[String]
@@ -791,34 +791,26 @@
// Exclude `scala-library` from assembly.
(assembly / assemblyPackageScala / assembleArtifact) := false,

// SPARK-46733: Include `spark-connect-*.jar`, `unused-*.jar`,`guava-*.jar`,
// `failureaccess-*.jar`, `annotations-*.jar`, `grpc-*.jar`, `protobuf-*.jar`,
// `gson-*.jar`, `error_prone_annotations-*.jar`, `j2objc-annotations-*.jar`,
// `animal-sniffer-annotations-*.jar`, `perfmark-api-*.jar`,
// `proto-google-common-protos-*.jar` in assembly.
// SPARK-46733: Include `spark-connect-*.jar`, `unused-*.jar`, `annotations-*.jar`,
// `grpc-*.jar`, `protobuf-*.jar`, `gson-*.jar`, `animal-sniffer-annotations-*.jar`,
// `perfmark-api-*.jar`, `proto-google-common-protos-*.jar` in assembly.
// This needs to be consistent with the content of `maven-shade-plugin`.
(assembly / assemblyExcludedJars) := {
val cp = (assembly / fullClasspath).value
val validPrefixes = Set("spark-connect", "unused-", "guava-", "failureaccess-",
"annotations-", "grpc-", "protobuf-", "gson", "error_prone_annotations",
"j2objc-annotations", "animal-sniffer-annotations", "perfmark-api",
"proto-google-common-protos")
val validPrefixes = Set("spark-connect", "unused-", "annotations-",
"grpc-", "protobuf-", "gson", "animal-sniffer-annotations",
"perfmark-api", "proto-google-common-protos")
cp filterNot { v =>
validPrefixes.exists(v.data.getName.startsWith)
}
},

(assembly / assemblyShadeRules) := Seq(
ShadeRule.rename("io.grpc.**" -> "org.sparkproject.connect.grpc.@1").inAll,
ShadeRule.rename("com.google.common.**" -> "org.sparkproject.connect.guava.@1").inAll,
ShadeRule.rename("com.google.thirdparty.**" -> "org.sparkproject.connect.guava.@1").inAll,
ShadeRule.rename("com.google.protobuf.**" -> "org.sparkproject.connect.protobuf.@1").inAll,
ShadeRule.rename("android.annotation.**" -> "org.sparkproject.connect.android_annotation.@1").inAll,
ShadeRule.rename("io.perfmark.**" -> "org.sparkproject.connect.io_perfmark.@1").inAll,
ShadeRule.rename("org.codehaus.mojo.animal_sniffer.**" -> "org.sparkproject.connect.animal_sniffer.@1").inAll,
ShadeRule.rename("com.google.j2objc.annotations.**" -> "org.sparkproject.connect.j2objc_annotations.@1").inAll,
ShadeRule.rename("com.google.errorprone.annotations.**" -> "org.sparkproject.connect.errorprone_annotations.@1").inAll,
ShadeRule.rename("org.checkerframework.**" -> "org.sparkproject.connect.checkerframework.@1").inAll,
ShadeRule.rename("com.google.gson.**" -> "org.sparkproject.connect.gson.@1").inAll,
ShadeRule.rename("com.google.api.**" -> "org.sparkproject.connect.google_protos.api.@1").inAll,
ShadeRule.rename("com.google.apps.**" -> "org.sparkproject.connect.google_protos.apps.@1").inAll,
@@ -852,7 +844,7 @@ object SparkConnectJdbc {
libraryDependencies ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
Seq(
"com.google.guava" % "guava" % guavaVersion,
"com.google.protobuf" % "protobuf-java" % protoVersion % "protobuf"
@@ -861,7 +853,7 @@
dependencyOverrides ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
Seq(
"com.google.guava" % "guava" % guavaVersion,
"com.google.protobuf" % "protobuf-java" % protoVersion
@@ -889,14 +881,17 @@
// Exclude `scala-library` from assembly.
(assembly / assemblyPackageScala / assembleArtifact) := false,

// Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`,`jsr305-*.jar` and
// `netty-*.jar` and `unused-1.0.0.jar` from assembly.
// Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`, `jsr305-*.jar`,
// `error_prone_annotations-*.jar`, `listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar`,
// `j2objc-annotations-*.jar`, `checker-qual-*.jar` and `unused-1.0.0.jar` from assembly.
(assembly / assemblyExcludedJars) := {
val cp = (assembly / fullClasspath).value
cp filter { v =>
val name = v.data.getName
name.startsWith("pmml-model-") || name.startsWith("scala-collection-compat_") ||
name.startsWith("jsr305-") || name == "unused-1.0.0.jar"
name.startsWith("jsr305-") || name.startsWith("error_prone_annotations") ||
name.startsWith("listenablefuture") || name.startsWith("j2objc-annotations") ||
name.startsWith("checker-qual") || name == "unused-1.0.0.jar"
}
},
// Only include `spark-connect-client-jdbc-*.jar`
@@ -913,7 +908,6 @@
ShadeRule.rename("io.grpc.**" -> "org.sparkproject.connect.client.io.grpc.@1").inAll,
ShadeRule.rename("com.google.**" -> "org.sparkproject.connect.client.com.google.@1").inAll,
ShadeRule.rename("io.netty.**" -> "org.sparkproject.connect.client.io.netty.@1").inAll,
ShadeRule.rename("org.checkerframework.**" -> "org.sparkproject.connect.client.org.checkerframework.@1").inAll,
ShadeRule.rename("io.perfmark.**" -> "org.sparkproject.connect.client.io.perfmark.@1").inAll,
ShadeRule.rename("org.codehaus.**" -> "org.sparkproject.connect.client.org.codehaus.@1").inAll,
ShadeRule.rename("android.annotation.**" -> "org.sparkproject.connect.client.android.annotation.@1").inAll
@@ -940,7 +934,7 @@ object SparkConnectClient {
libraryDependencies ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
Seq(
"com.google.guava" % "guava" % guavaVersion,
"com.google.protobuf" % "protobuf-java" % protoVersion % "protobuf"
@@ -949,7 +943,7 @@
dependencyOverrides ++= {
val guavaVersion =
SbtPomKeys.effectivePom.value.getProperties.get(
"connect.guava.version").asInstanceOf[String]
"guava.version").asInstanceOf[String]
Seq(
"com.google.guava" % "guava" % guavaVersion,
"com.google.protobuf" % "protobuf-java" % protoVersion
@@ -977,22 +971,24 @@
// Exclude `scala-library` from assembly.
(assembly / assemblyPackageScala / assembleArtifact) := false,

// Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`,`jsr305-*.jar` and
// `netty-*.jar` and `unused-1.0.0.jar` from assembly.
// Exclude `pmml-model-*.jar`, `scala-collection-compat_*.jar`, `jsr305-*.jar`,
// `error_prone_annotations-*.jar`, `listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar`,
// `j2objc-annotations-*.jar`, `checker-qual-*.jar` and `unused-1.0.0.jar` from assembly.
(assembly / assemblyExcludedJars) := {
val cp = (assembly / fullClasspath).value
cp filter { v =>
val name = v.data.getName
name.startsWith("pmml-model-") || name.startsWith("scala-collection-compat_") ||
name.startsWith("jsr305-") || name == "unused-1.0.0.jar"
name.startsWith("jsr305-") || name.startsWith("error_prone_annotations") ||
name.startsWith("listenablefuture") || name.startsWith("j2objc-annotations") ||
name.startsWith("checker-qual") || name == "unused-1.0.0.jar"
}
},

(assembly / assemblyShadeRules) := Seq(
ShadeRule.rename("io.grpc.**" -> "org.sparkproject.connect.client.io.grpc.@1").inAll,
ShadeRule.rename("com.google.**" -> "org.sparkproject.connect.client.com.google.@1").inAll,
ShadeRule.rename("io.netty.**" -> "org.sparkproject.connect.client.io.netty.@1").inAll,
ShadeRule.rename("org.checkerframework.**" -> "org.sparkproject.connect.client.org.checkerframework.@1").inAll,
ShadeRule.rename("io.perfmark.**" -> "org.sparkproject.connect.client.io.perfmark.@1").inAll,
ShadeRule.rename("org.codehaus.**" -> "org.sparkproject.connect.client.org.codehaus.@1").inAll,
ShadeRule.rename("android.annotation.**" -> "org.sparkproject.connect.client.android.annotation.@1").inAll
6 changes: 1 addition & 5 deletions sql/connect/client/jdbc/pom.xml
@@ -77,7 +77,7 @@
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>${connect.guava.version}</version>
<version>${guava.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
@@ -177,10 +177,6 @@
<pattern>io.netty</pattern>
<shadedPattern>${spark.shade.packageName}.io.netty</shadedPattern>
</relocation>
<relocation>
<pattern>org.checkerframework</pattern>
<shadedPattern>${spark.shade.packageName}.org.checkerframework</shadedPattern>
</relocation>
<relocation>
<pattern>io.perfmark</pattern>
<shadedPattern>${spark.shade.packageName}.io.perfmark</shadedPattern>
9 changes: 0 additions & 9 deletions sql/connect/client/jvm/pom.xml
@@ -77,7 +77,6 @@
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>${connect.guava.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
@@ -178,18 +177,14 @@
<include>com.google.guava:*</include>
<include>com.google.android:*</include>
<include>com.google.api.grpc:*</include>
<include>com.google.code.findbugs:*</include>
<include>com.google.code.gson:*</include>
<include>com.google.errorprone:*</include>
<include>com.google.j2objc:*</include>
<include>com.google.protobuf:*</include>
<include>com.google.flatbuffers:*</include>
<include>io.grpc:*</include>
<include>io.netty:*</include>
<include>io.perfmark:*</include>
<include>org.apache.arrow:*</include>
<include>org.codehaus.mojo:*</include>
<include>org.checkerframework:*</include>
Contributor:
Are you sure that checkerframework won't be needed? It is only slated to be dropped from Guava after version 33.5.

Member Author:
BTW, currently spark-network-common only shades guava and failureaccess (a quick way to verify what actually ends up bundled is sketched after this diff).

<include>org.apache.spark:spark-connect-common_${scala.binary.version}</include>
<include>org.apache.spark:spark-sql-api_${scala.binary.version}</include>
</includes>
@@ -221,10 +216,6 @@
<pattern>io.netty</pattern>
<shadedPattern>${spark.shade.packageName}.io.netty</shadedPattern>
</relocation>
<relocation>
<pattern>org.checkerframework</pattern>
<shadedPattern>${spark.shade.packageName}.org.checkerframework</shadedPattern>
</relocation>
<relocation>
<pattern>io.perfmark</pattern>
<shadedPattern>${spark.shade.packageName}.io.perfmark</shadedPattern>
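To ground the review thread above: one quick, hypothetical way to check whether compile-only annotation classes such as org.checkerframework.* actually end up inside a shaded assembly jar (the jar path and object name below are illustrative, not from this PR):

import java.util.jar.JarFile
import scala.jdk.CollectionConverters._

object ShadedJarScan {
  def main(args: Array[String]): Unit = {
    // Usage (hypothetical): ShadedJarScan path/to/spark-connect-assembly.jar
    val jar = new JarFile(args(0))
    val leaked = jar.entries().asScala
      .map(_.getName)
      .filter(n => n.startsWith("org/checkerframework/") ||
                   n.startsWith("com/google/errorprone/") ||
                   n.startsWith("com/google/j2objc/"))
      .toList
    jar.close()
    if (leaked.isEmpty) println("no compile-only annotation classes bundled")
    else leaked.foreach(println)
  }
}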