
Commit 96d0fed

elvaliuliuliu authored and imback82 committed
Typo fixes to README.md in deployment (#399)
1 parent e4cc338 commit 96d0fed

File tree

1 file changed: 2 additions & 2 deletions


deployment/README.md

Lines changed: 2 additions & 2 deletions
@@ -177,7 +177,7 @@ Databricks allows you to submit Spark .NET apps to an existing active cluster or
    databricks fs cp db-init.sh dbfs:/spark-dotnet/db-init.sh
    databricks fs cp install-worker.sh dbfs:/spark-dotnet/install-worker.sh
    ```
-6. Go to to your Databricks cluster homepage -> Clusters (on the left-side menu) -> Create Cluster
+6. Go to your Databricks cluster homepage -> Clusters (on the left-side menu) -> Create Cluster
 7. After configuring the cluster appropriately, set the init script (see the image below) and create the cluster.
 
 <img src="../docs/img/deployment-databricks-init-script.PNG" alt="ScriptActionImage" width="600"/>
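Aside (not part of the commit): after running the `databricks fs cp` commands shown in this hunk, a quick sanity check with the Databricks CLI can confirm that both scripts landed on DBFS; the `dbfs:/spark-dotnet/` path below simply mirrors the destination used above.

```shell
# Not part of the commit: list the DBFS folder used above to confirm
# that db-init.sh and install-worker.sh were uploaded successfully.
databricks fs ls dbfs:/spark-dotnet/
```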
@@ -231,5 +231,5 @@ Publishing your App & Running:
 1. [Create a Job](https://docs.databricks.com/user-guide/jobs.html) and select *Configure spark-submit*.
 2. Configure `spark-submit` with the following parameters:
    ```shell
-   ["--files","/dbfs/<path-to>/<app assembly/file to deploy to worker>","--class","org.apache.spark.deploy.dotnet.DotnetRunner","/dbfs/<path-to>/microsoft-spark-<spark_majorversion.spark_minorversion.x>-<spark_dotnet_version>.jar","/dbfs/<path-to>/<app name>.zip","<app bin name>","app arg1","app arg2"]
+   ["--files","/dbfs/<path-to>/<app assembly/file to deploy to worker>","--class","org.apache.spark.deploy.dotnet.DotnetRunner","/dbfs/<path-to>/microsoft-spark-<spark_majorversion.spark_minorversion.x>-<spark_dotnet_version>.jar","/dbfs/<path-to>/<app name>.zip","<app name>","app arg1","app arg2"]
    ```
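For illustration only (not part of the commit): with the corrected `<app name>` placeholder, a filled-in parameter array might look like the sketch below. It assumes a hypothetical app named `MySparkApp`, a worker-side dependency `MyUdfs.dll`, files uploaded under `dbfs:/apps/`, and an illustrative `microsoft-spark-2.4.x-0.4.0.jar`; substitute your own paths, jar version, app name, and arguments.

```shell
# Hypothetical values for illustration; replace the paths, jar version,
# app name, and arguments with your own before pasting them into the
# "Configure spark-submit" parameters box.
["--files","/dbfs/apps/MyUdfs.dll","--class","org.apache.spark.deploy.dotnet.DotnetRunner","/dbfs/apps/microsoft-spark-2.4.x-0.4.0.jar","/dbfs/apps/MySparkApp.zip","MySparkApp","app arg1","app arg2"]
```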
