Update readme for 0.10.0 release
srowen committed Aug 25, 2020
1 parent f28f1d2 commit 61d74ee
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions README.md
@@ -25,15 +25,15 @@ You can link against this library in your program at the following coordinates:
```
groupId: com.databricks
artifactId: spark-xml_2.11
-version: 0.9.0
+version: 0.10.0
```

### Scala 2.12

```
groupId: com.databricks
artifactId: spark-xml_2.12
-version: 0.9.0
+version: 0.10.0
```
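
For sbt builds, the coordinates above reduce to a single dependency line. A minimal sketch (an assumption, not part of this diff; the `%%` operator appends the build's Scala binary version, so it resolves to `spark-xml_2.11` or `spark-xml_2.12` depending on your configured `scalaVersion`):

```scala
// build.sbt — sketch: %% selects the artifact matching the
// build's Scala binary version (2.11 or 2.12)
libraryDependencies += "com.databricks" %% "spark-xml" % "0.10.0"
```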

## Using with Spark shell
@@ -42,12 +42,12 @@ This package can be added to Spark using the `--packages` command line option. F

### Spark compiled with Scala 2.11
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.9.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.10.0
```

### Spark compiled with Scala 2.12
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.9.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.10.0
```
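
With the package on the shell's classpath, XML can be read through the library's `xml` data source. A short sketch inside `spark-shell` (the `books.xml` file and its `<book>` row tag are illustrative, not part of this diff):

```scala
// Inside spark-shell: treat each <book> element as one row
val df = spark.read
  .format("xml")
  .option("rowTag", "book")
  .load("books.xml")

df.printSchema()
```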

## Features
@@ -400,7 +400,7 @@ Automatically infer schema (data types)
```R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.9.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))

df <- read.df("books.xml", source = "xml", rowTag = "book")

@@ -412,7 +412,7 @@ You can manually specify schema:
```R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.9.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
customSchema <- structType(
structField("_id", "string"),
structField("author", "string"),
