Apache Spark Official Dockerfiles

What is Apache Spark?

Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python, and R, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools including Spark SQL for SQL and DataFrames, pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing.

https://spark.apache.org/

Create a new version

Step 1. Add Dockerfiles for a new version.

You can use the 3.4.0 PR as a reference.

Step 2. Publish apache/spark Images.

Click Publish (Java 21 only) or Publish (Java 17 only) (for 4.x versions), or Publish (for 3.x versions), to publish the images.

After this, the apache/spark docker images will be published.

Step 3. Publish spark Docker Official Images.

Submit the PR to docker-library/official-images; see [docker-library/official-images#15363] as a reference.

You can run tools/manifest.py manifest to generate the manifest content.
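A minimal sketch of how that step might look locally, assuming a checkout of apache/spark-docker and a Python 3 interpreter on PATH (the exact flags of tools/manifest.py beyond the manifest subcommand named above are not covered here):

```shell
# Clone the Dockerfile repository and generate the official-images
# manifest content to paste into the docker-library/official-images PR.
git clone https://github.com/apache/spark-docker.git
cd spark-docker
python3 tools/manifest.py manifest
```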

After this, the spark docker images will be published.

About images

|               | Apache Spark Image | Spark Docker Official Image |
|---------------|--------------------|-----------------------------|
| Name          | apache/spark | spark |
| Maintenance   | Reviewed and published by the Apache Spark community | Reviewed, published, and maintained by the Docker community |
| Update policy | Built and pushed once per version release | Actively rebuilt for updates and security fixes |
| Link          | https://hub.docker.com/r/apache/spark | https://hub.docker.com/_/spark |
| Source        | apache/spark-docker | apache/spark-docker and docker-library/official-images |

We recommend using the Spark Docker Official Image; the Apache Spark Image is provided in case of delays in the Docker community's review process.
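For illustration, pulling and trying either image might look like this (the /opt/spark path is the standard layout used by these images; adjust the tag to the version you need):

```shell
# Recommended: the Docker Official Image
docker pull spark

# Fallback: the Apache Spark community image
docker pull apache/spark

# Start an interactive Spark shell from the official image
docker run -it --rm spark /opt/spark/bin/spark-shell
```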

About this repository

This repository contains the Dockerfiles used to build the Apache Spark Docker images.

See more in SPARK-40513: SPIP: Support Docker Official Image for Spark.