diff --git a/README.md b/README.md
index 7d7fc45f8..b6e17d188 100644
--- a/README.md
+++ b/README.md
@@ -74,13 +74,13 @@ You can link against this library in your program at the following coordinates:
-groupId: za.co.absa.cobrix artifactId: spark-cobol_2.11 version: 2.7.1
+groupId: za.co.absa.cobrix artifactId: spark-cobol_2.11 version: 2.7.2
|
-groupId: za.co.absa.cobrix artifactId: spark-cobol_2.12 version: 2.7.1
+groupId: za.co.absa.cobrix artifactId: spark-cobol_2.12 version: 2.7.2
|
-groupId: za.co.absa.cobrix artifactId: spark-cobol_2.13 version: 2.7.1
+groupId: za.co.absa.cobrix artifactId: spark-cobol_2.13 version: 2.7.2
|
@@ -91,17 +91,17 @@ This package can be added to Spark using the `--packages` command line option. F
### Spark compiled with Scala 2.11
```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.7.1
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.7.2
```
### Spark compiled with Scala 2.12
```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.1
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.2
```
### Spark compiled with Scala 2.13
```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.7.1
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.7.2
```
## Usage
@@ -238,8 +238,8 @@ to decode various binary formats.
The jars that you need to get are:
-* spark-cobol_2.12-2.7.1.jar
-* cobol-parser_2.12-2.7.1.jar
+* spark-cobol_2.12-2.7.2.jar
+* cobol-parser_2.12-2.7.2.jar
* scodec-core_2.12-1.10.3.jar
* scodec-bits_2.12-1.1.4.jar
@@ -247,9 +247,9 @@ The jars that you need to get are:
After that you can specify these jars in `spark-shell` command line. Here is an example:
```
-$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.1
+$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.7.2
or
-$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.7.1.jar,cobol-parser_2.12-2.7.1.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar
+$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.7.2.jar,cobol-parser_2.12-2.7.2.jar,scodec-core_2.12-1.10.3.jar,scodec-bits_2.12-1.1.4.jar
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
@@ -316,11 +316,11 @@ Creating an uber jar for Cobrix is very easy. Steps to build:
You can collect the uber jar of `spark-cobol` either at
`spark-cobol/target/scala-2.11/` or in `spark-cobol/target/scala-2.12/` depending on the Scala version you used.
-The fat jar will have '-bundle' suffix. You can also download pre-built bundles from https://github.com/AbsaOSS/cobrix/releases/tag/v2.7.1
+The fat jar will have a '-bundle' suffix. You can also download pre-built bundles from https://github.com/AbsaOSS/cobrix/releases/tag/v2.7.2
Then, run `spark-shell` or `spark-submit` adding the fat jar as the option.
```sh
-$ spark-shell --jars spark-cobol_2.12_3.3-2.7.2-SNAPSHOT-bundle.jar
+$ spark-shell --jars spark-cobol_2.12_3.3-2.7.3-SNAPSHOT-bundle.jar
```
>
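
For context, the dependency being bumped above is typically used from `spark-shell` like this. A minimal sketch: the copybook and data paths are placeholders, and an active `SparkSession` (`spark`) is assumed, as provided by `spark-shell`.

```scala
// Read a mainframe data file using the spark-cobol data source.
// "/path/to/copybook.cpy" and "/path/to/data" are hypothetical paths.
val df = spark.read
  .format("cobol")                               // data source registered by spark-cobol
  .option("copybook", "/path/to/copybook.cpy")   // COBOL copybook describing the record layout
  .load("/path/to/data")                         // binary data file(s) to decode

df.printSchema()
df.show()
```

The snippet requires a running Spark session with the `spark-cobol` package on the classpath (e.g. via the `--packages` or `--jars` options shown in the diff above), so it is not runnable standalone.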