
Commit 0f19ef3

Update spark version from 2.4.1 to 2.4.2 (#87)
1 parent: 4e68ee6

File tree

5 files changed (+11, -6 lines)


docs/applications/implementations/aggregators.md

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ def aggregate_spark(data, columns, args):
 The following packages have been pre-installed and can be used in your implementations:
 
 ```text
-pyspark==2.4.1
+pyspark==2.4.2
 boto3==1.9.78
 msgpack==0.6.1
 numpy>=1.13.3,<2

docs/applications/implementations/transformers.md

Lines changed: 1 addition & 1 deletion
@@ -86,7 +86,7 @@ def reverse_transform_python(transformed_value, args):
 The following packages have been pre-installed and can be used in your implementations:
 
 ```text
-pyspark==2.4.1
+pyspark==2.4.2
 boto3==1.9.78
 msgpack==0.6.1
 numpy>=1.13.3,<2

docs/applications/resources/environments.md

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ data:
 
 #### CSV Config
 
-To help ingest different styles of CSV files, Cortex supports the parameters listed below. All of these parameters are optional. A description and default values for each parameter can be found in the [PySpark CSV Documentation](https://spark.apache.org/docs/2.4.1/api/python/pyspark.sql.html#pyspark.sql.DataFrameReader.csv).
+To help ingest different styles of CSV files, Cortex supports the parameters listed below. All of these parameters are optional. A description and default values for each parameter can be found in the [PySpark CSV Documentation](https://spark.apache.org/docs/2.4.2/api/python/pyspark.sql.html#pyspark.sql.DataFrameReader.csv).
 
 ```yaml
 csv_config:
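The parameters in `csv_config` map onto options of PySpark's `DataFrameReader.csv`. As a rough sketch of what such a block might look like, the key names and values below are illustrative assumptions, not taken from the Cortex docs:

```yaml
# Illustrative csv_config sketch -- keys assumed to mirror
# DataFrameReader.csv options in snake_case, not confirmed
csv_config:
  header: true     # treat the first row as column names
  sep: ","         # field delimiter
  null_value: ""   # string to interpret as null
```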

images/spark-base/Dockerfile

Lines changed: 7 additions & 2 deletions
@@ -11,8 +11,12 @@ RUN apt-get update -qq && apt-get install -y -q \
 RUN mkdir -p /opt
 
 ARG HADOOP_VERSION="2.9.2"
-ARG SPARK_VERSION="2.4.1"
+ARG SPARK_VERSION="2.4.2"
 ARG TF_VERSION="1.12.0"
+# Required for building tensorflow spark connector
+ARG SCALA_VERSION="2.12"
+# Scalatest version from https://github.com/apache/spark/blob/v2.4.2/pom.xml
+ARG SCALATEST_VERSION="3.0.3"
 # Check aws-java-sdk-bundle dependency version: https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws/$HADOOP_VERSION
 ARG AWS_JAVA_SDK_VERSION="1.11.199"

@@ -34,7 +38,8 @@ RUN rm -rf ~/tf-ecosystem && git clone https://github.com/tensorflow/ecosystem.g
     mvn -f ~/tf-ecosystem/hadoop/pom.xml versions:set -DnewVersion=${TF_VERSION} -q && \
     mvn -f ~/tf-ecosystem/hadoop/pom.xml -Dmaven.test.skip=true clean install -q && \
     mvn -f ~/tf-ecosystem/spark/spark-tensorflow-connector/pom.xml versions:set -DnewVersion=${TF_VERSION} -q && \
-    mvn -f ~/tf-ecosystem/spark/spark-tensorflow-connector/pom.xml -Dmaven.test.skip=true clean install -Dspark.version=${SPARK_VERSION} -q && \
+    mvn -f ~/tf-ecosystem/spark/spark-tensorflow-connector/pom.xml -Dmaven.test.skip=true clean install \
+        -Dspark.version=${SPARK_VERSION} -Dscala.binary.version=${SCALA_VERSION} -Dscala.test.version=${SCALATEST_VERSION} -q && \
     mv ~/tf-ecosystem/spark/spark-tensorflow-connector/target/spark-tensorflow-connector_2.11-${TF_VERSION}.jar $SPARK_HOME/jars/
 
 # Hadoop AWS

pkg/workloads/lib/package.py

Lines changed: 1 addition & 1 deletion
@@ -39,7 +39,7 @@ def get_build_order(python_packages):
 
 
 def get_restricted_packages():
-    req_list = ["pyspark==2.4.1", "tensorflow==1.12.0"]
+    req_list = ["pyspark==2.4.2", "tensorflow==1.12.0"]
     req_files = glob.glob("/src/**/requirements.txt", recursive=True)
 
     for req_file in req_files:
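The `get_restricted_packages` helper above pins `pyspark` and `tensorflow` so user requirements cannot override the versions the platform ships with. A minimal sketch of how such pin enforcement could work — the merge logic and function name here are assumptions for illustration, not the repository's actual implementation:

```python
import re

# Mirrors the req_list in the diff above; the enforcement logic
# below is a hypothetical sketch, not Cortex's actual code.
RESTRICTED = {"pyspark": "2.4.2", "tensorflow": "1.12.0"}


def enforce_pins(user_requirements):
    """Replace any user-specified version of a restricted package
    with the platform's pinned version; pass others through."""
    result = []
    for req in user_requirements:
        # The package name is everything before the first version
        # operator or extras bracket (e.g. "pyspark==2.4.1" -> "pyspark").
        name = re.split(r"[=<>!~\[]", req, maxsplit=1)[0].strip().lower()
        if name in RESTRICTED:
            result.append(f"{name}=={RESTRICTED[name]}")
        else:
            result.append(req)
    return result
```

With this sketch, a user's `pyspark==2.4.1` line would be rewritten to `pyspark==2.4.2`, while unrestricted packages like `boto3` pass through untouched.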
