#532 - Add code coverage support - Jacoco #533


Merged
merged 1 commit on Nov 23, 2022
19 changes: 19 additions & 0 deletions README.md
@@ -125,6 +125,15 @@ of that project for the detailed guide how to run the examples locally and on a
When running `mvn clean package` in `examples/spark-cobol-app` an uber jar will be created. It can be used to run
jobs via `spark-submit` or `spark-shell`.

## How to generate a code coverage report
```sbt
sbt ++{scala_version} jacoco
```
The code coverage report will be generated at:
```
{project-root}/cobrix/{module}/target/scala-{scala_version}/jacoco/report/html
```
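As a concrete illustration of how that path is composed (sbt names the output directory after the Scala *binary* version, not the full version), here is a small sketch. The `CoverageReportPath` helper is hypothetical and not part of the Cobrix codebase:

```scala
// Hypothetical helper: derives the Jacoco HTML report directory for a module,
// assuming the layout shown above. Not part of Cobrix itself.
object CoverageReportPath {
  // "2.12.17" -> "2.12" (sbt uses the binary version in target paths)
  def binaryVersion(fullScalaVersion: String): String =
    fullScalaVersion.split('.').take(2).mkString(".")

  def htmlReportDir(module: String, fullScalaVersion: String): String =
    s"cobrix/$module/target/scala-${binaryVersion(fullScalaVersion)}/jacoco/report/html"
}
```

For example, building `spark-cobol` against Scala 2.12.17 would place the HTML report under `cobrix/spark-cobol/target/scala-2.12/jacoco/report/html`.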

### Reading Cobol binary files from HDFS/local and querying them

1. Create a Spark ```SQLContext```
@@ -1423,6 +1432,16 @@ For multisegment variable lengths tests:
![](performance/images/exp3_multiseg_wide_time.svg) ![](performance/images/exp3_multiseg_wide_efficiency.svg)
![](performance/images/exp3_multiseg_wide_records_throughput.svg) ![](performance/images/exp3_multiseg_wide_mb_throughput.svg)


### How to generate a code coverage report
```sbt
sbt jacoco
```
The code coverage report will be generated at:
```
{local-path}\fixed-width\target\scala-2.XY\jacoco\report\html
```

Comment on lines +1436 to +1444
Collaborator

This might be extra

Contributor Author

Yes, I was in doubt whether it even needed to be placed in the README when I saw that file. Remove it?

## FAQ

This is a new section where we are going to post common questions and workarounds from GitHub issues raised by our users.
22 changes: 22 additions & 0 deletions build.sbt
@@ -17,6 +17,7 @@
import Dependencies._
import BuildInfoTemplateSettings._
import ScalacOptions._
import com.github.sbt.jacoco.report.JacocoReportSettings

lazy val scala211 = "2.11.12"
lazy val scala212 = "2.12.17"
@@ -38,6 +39,15 @@ ThisBuild / autoScalaLibrary := false

lazy val printSparkVersion = taskKey[Unit]("Print Spark version spark-cobol is building against.")

lazy val commonJacocoReportSettings: JacocoReportSettings = JacocoReportSettings(
formats = Seq(JacocoReportFormats.HTML, JacocoReportFormats.XML)
)

lazy val commonJacocoExcludes: Seq[String] = Seq(
// "za.co.absa.cobrix.spark.cobol.reader.FixedLenTextReader*", // class and related objects
// "za.co.absa.cobrix.spark.cobol.reader.RowHandler" // class only
)

lazy val cobrix = (project in file("."))
.disablePlugins(sbtassembly.AssemblyPlugin)
.settings(
@@ -56,6 +66,10 @@ lazy val cobolParser = (project in file("cobol-parser"))
libraryDependencies ++= CobolParserDependencies :+ getScalaDependency(scalaVersion.value),
releasePublishArtifactsAction := PgpKeys.publishSigned.value,
assemblySettings
)
.settings(
jacocoReportSettings := commonJacocoReportSettings.withTitle("cobrix:cobol-parser Jacoco Report"),
jacocoExcludes := commonJacocoExcludes
).enablePlugins(AutomateHeaderPlugin)

lazy val cobolConverters = (project in file("cobol-converters"))
@@ -64,6 +78,10 @@ lazy val cobolConverters = (project in file("cobol-converters"))
name := "cobol-converters",
libraryDependencies ++= CobolConvertersDependencies :+ getScalaDependency(scalaVersion.value),
releasePublishArtifactsAction := PgpKeys.publishSigned.value
)
.settings(
jacocoReportSettings := commonJacocoReportSettings.withTitle("cobrix:cobol-converters Jacoco Report"),
jacocoExcludes := commonJacocoExcludes
).dependsOn(cobolParser)
.enablePlugins(AutomateHeaderPlugin)

@@ -83,6 +101,10 @@ lazy val sparkCobol = (project in file("spark-cobol"))
releasePublishArtifactsAction := PgpKeys.publishSigned.value,
assemblySettings
)
.settings(
jacocoReportSettings := commonJacocoReportSettings.withTitle("cobrix:spark-cobol Jacoco Report"),
jacocoExcludes := commonJacocoExcludes
)
.dependsOn(cobolParser)
.enablePlugins(AutomateHeaderPlugin)
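Since the same two Jacoco settings are repeated verbatim for each module, they could be factored into a shared helper; a sketch only, the name `jacocoSettings` and its parameter are assumptions, not part of this PR:

```scala
// Sketch: a shared helper each module could mix in via .settings(...).
// Only the report title varies per module.
def jacocoSettings(moduleName: String): Seq[Setting[_]] = Seq(
  jacocoReportSettings := commonJacocoReportSettings.withTitle(s"cobrix:$moduleName Jacoco Report"),
  jacocoExcludes := commonJacocoExcludes
)
```

Each project block would then shrink to something like `.settings(jacocoSettings("cobol-parser"))`, keeping the three module definitions in sync automatically.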

1 change: 1 addition & 0 deletions project/plugins.sbt
@@ -3,3 +3,4 @@ addSbtPlugin("com.jsuereth" % "sbt-pgp" % "2.0.0")
addSbtPlugin("de.heikoseeberger" % "sbt-header" % "5.2.0")
addSbtPlugin("com.github.sbt" % "sbt-release" % "1.1.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")
addSbtPlugin("com.github.sbt" % "sbt-jacoco" % "3.4.0")
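With the plugin on the build classpath, Jacoco tasks become available in every module. Typical invocations might look like the following (a sketch; task and command names follow the sbt-jacoco plugin, and running against a single module via the `project` command is an assumption about how one would scope it here):

```sbt
sbt jacoco                          # run tests and generate reports for all modules
sbt "project spark-cobol" jacoco    # restrict coverage to a single module
```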