Commit e9a31a6

Release 0.2.7 (#93)
* less points
* bump release version
1 parent: 18fea7d

File tree: 1 file changed, +5 −5 lines


README.md

Lines changed: 5 additions & 5 deletions
@@ -48,7 +48,7 @@ Additionally, if you want to run unit tests for python, you need the following d
 Assuming that `SPARK_HOME` is set, you can use PySpark like any other Spark package.
 
 ```bash
-$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.6-rc1-s_2.11
+$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.7-s_2.11
 ```
 
 Here is a small program that uses Tensorflow to add 3 to an existing column.
@@ -146,7 +146,7 @@ The scala support is a bit more limited than python. In scala, operations can be
 You simply use the published package:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.6-rc1
+$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.7
 ```
 
 Here is the same program as before:
@@ -185,14 +185,14 @@ build/sbt distribution/spDist
 Assuming that SPARK_HOME is set and that you are in the root directory of the project:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.6-rc1.jar
+$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.7.jar
 ```
 
 If you want to run the python version:
 
 ```bash
-PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.6-rc1.jar \
-$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.6-rc1.jar
+PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.7.jar \
+$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.7.jar
 ```
 
 ## Acknowledgements
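Every change in this diff rewrites the same thing: a Spark Packages coordinate, moving from the `0.2.6-rc1` pre-release to `0.2.7`. These coordinates follow the `groupId:artifactId:version` pattern, with an optional `-s_<scalaVersion>` suffix pinning the Scala binary version (as in `databricks:tensorframes:0.2.7-s_2.11`). As a minimal illustrative sketch, assuming this coordinate shape, one could split such a string like this (the `parse_coordinate` helper is ours, not part of TensorFrames or Spark):

```python
import re

def parse_coordinate(coord):
    """Split a Spark Packages coordinate such as
    'databricks:tensorframes:0.2.7-s_2.11' into its parts.
    The optional '-s_<scala>' suffix pins the Scala binary version.
    Illustrative helper only; not part of the TensorFrames project."""
    group, artifact, version = coord.split(":")
    # Lazily match the version, then peel off an optional -s_<scala> suffix.
    m = re.match(r"(?P<ver>.+?)(?:-s_(?P<scala>[\d.]+))?$", version)
    return {
        "group": group,
        "artifact": artifact,
        "version": m.group("ver"),
        "scala": m.group("scala"),  # None when no suffix is present
    }

print(parse_coordinate("databricks:tensorframes:0.2.7-s_2.11"))
# {'group': 'databricks', 'artifact': 'tensorframes', 'version': '0.2.7', 'scala': '2.11'}
```

The Scala-side commands in the diff omit the suffix (`databricks:tensorframes:0.2.7`), which the same helper handles by returning `scala=None`.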
