[Spark] Upgrade latest Spark dependency to 3.5.3 #3647

Merged · 1 commit · Sep 24, 2024
2 changes: 1 addition & 1 deletion .github/workflows/spark_python_test.yaml
@@ -61,7 +61,7 @@ jobs:
 # `-SNAPSHOT` in version (e.g. `3.3.0-SNAPSHOT`) as the version is picked up from
 # the`version.sbt` file.
 pipenv run pip install pip==24.0 setuptools==69.5.1 wheel==0.43.0
-pipenv run pip install pyspark==3.5.0
+pipenv run pip install pyspark==3.5.3
 pipenv run pip install flake8==3.5.0 pypandoc==1.3.3
 pipenv run pip install black==23.9.1
 pipenv run pip install importlib_metadata==3.10.0
2 changes: 1 addition & 1 deletion .github/workflows/spark_test.yaml
@@ -65,7 +65,7 @@ jobs:
 # `-SNAPSHOT` in version (e.g. `3.3.0-SNAPSHOT`) as the version is picked up from
 # the`version.sbt` file.
 pipenv run pip install pip==24.0 setuptools==69.5.1 wheel==0.43.0
-pipenv run pip install pyspark==3.5.2
+pipenv run pip install pyspark==3.5.3
 pipenv run pip install flake8==3.5.0 pypandoc==1.3.3
 pipenv run pip install black==23.9.1
 pipenv run pip install importlib_metadata==3.10.0
2 changes: 1 addition & 1 deletion Dockerfile
@@ -38,7 +38,7 @@ RUN pip3 install --upgrade pip
 # the`version.sbt` file.
 RUN pip install pip==24.0 setuptools==69.5.1 wheel==0.43.0

-RUN pip3 install pyspark==3.5.2
+RUN pip3 install pyspark==3.5.3

 RUN pip3 install mypy==0.982
2 changes: 1 addition & 1 deletion benchmarks/build.sbt
@@ -20,7 +20,7 @@ scalaVersion := "2.12.18"
 lazy val root = (project in file("."))
   .settings(
     name := "benchmarks",
-    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.2" % "provided",
+    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.3" % "provided",
     libraryDependencies += "com.github.scopt" %% "scopt" % "4.0.1",
     libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.13.1",
2 changes: 1 addition & 1 deletion build.sbt
@@ -52,7 +52,7 @@ val all_scala_versions = Seq(scala212, scala213)
 val default_scala_version = settingKey[String]("Default Scala version")
 Global / default_scala_version := scala212

-val LATEST_RELEASED_SPARK_VERSION = "3.5.2"
+val LATEST_RELEASED_SPARK_VERSION = "3.5.3"
 val SPARK_MASTER_VERSION = "4.0.0-SNAPSHOT"
 val sparkVersion = settingKey[String]("Spark version")
 spark / sparkVersion := getSparkVersion()
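
For readers skimming the diff: build.sbt keeps two Spark targets side by side, the latest released line (LATEST_RELEASED_SPARK_VERSION, now "3.5.3") and the Spark master snapshot (SPARK_MASTER_VERSION), with getSparkVersion() selecting between them for the spark subproject. A minimal sketch of that kind of selection is below; the "sparkVersion" property name and the fallback logic are illustrative assumptions, not the actual body of getSparkVersion() in this repo.

// Illustrative sketch only: choosing between the released Spark line and the
// Spark master snapshot. The "sparkVersion" system property and the fallback
// behavior are assumptions for this example, not repo code.
val LATEST_RELEASED_SPARK_VERSION = "3.5.3"
val SPARK_MASTER_VERSION = "4.0.0-SNAPSHOT"

def getSparkVersion(): String =
  sys.props.get("sparkVersion") match {
    case Some("master") => SPARK_MASTER_VERSION          // build against Spark master
    case Some(explicit) => explicit                      // pin an explicit version
    case None           => LATEST_RELEASED_SPARK_VERSION // default: latest released Spark
  }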
2 changes: 1 addition & 1 deletion docs/environment.yml
@@ -33,7 +33,7 @@ dependencies:
   - packaging==23.2
   - py4j==0.10.9.7
   - pygments==2.16.1
-  - pyspark==3.5.2
+  - pyspark==3.5.3
   - pytz==2023.3.post1
   - requests==2.31.0
   - six==1.16.0
2 changes: 1 addition & 1 deletion examples/scala/build.sbt
@@ -46,7 +46,7 @@ val lookupSparkVersion: PartialFunction[(Int, Int), String] = {
   // version 4.0.0-preview1
   case (major, minor) if major >= 4 => "4.0.0-preview1"
   // versions 3.3.x+
-  case (major, minor) if major >= 3 && minor >=3 => "3.5.2"
+  case (major, minor) if major >= 3 && minor >=3 => "3.5.3"
   // versions 3.0.0 to 3.2.x
   case (major, minor) if major >= 3 && minor <=2 => "3.5.0"
   // versions 2.4.x
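
As context for the change above: lookupSparkVersion in examples/scala/build.sbt is a PartialFunction from the connector's (major, minor) version to the Spark version the examples compile against, so this bump routes the 3.3.x+ branch to Spark 3.5.3. A condensed sketch of how such a lookup can be applied is below; resolveSparkVersion and its fallback are hypothetical helpers for illustration, not part of the diff.

// Condensed from the cases visible in the hunk above; the 2.4.x branch is omitted.
val lookupSparkVersion: PartialFunction[(Int, Int), String] = {
  case (major, _) if major >= 4                   => "4.0.0-preview1"
  case (major, minor) if major >= 3 && minor >= 3 => "3.5.3"
  case (major, minor) if major >= 3 && minor <= 2 => "3.5.0"
}

// Hypothetical helper: parse "major.minor.patch", resolve via the partial
// function, and fall back to the latest released Spark for unmapped versions.
def resolveSparkVersion(deltaVersion: String): String = {
  val Array(major, minor) = deltaVersion.split("\\.").take(2).map(_.toInt)
  lookupSparkVersion.lift((major, minor)).getOrElse("3.5.3")
}

// e.g. resolveSparkVersion("3.3.0") == "3.5.3", resolveSparkVersion("3.2.1") == "3.5.0"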
2 changes: 1 addition & 1 deletion setup.py
@@ -65,7 +65,7 @@ def run(self):
         'delta': ['py.typed'],
     },
     install_requires=[
-        'pyspark>=3.5.2,<3.6.0',
+        'pyspark>=3.5.3,<3.6.0',
         'importlib_metadata>=1.0.0',
     ],
     python_requires='>=3.6',