Commit 99e6d88

Add back tensorflow source code, update tensorframes version to RC3 (#126)
* Shade google.protobuf classes in addition to com.google.protobuf classes
* Update version number to RC2
* Revert deletion of tensorflow source files, update version to 0.2.9-rc3
1 parent 9c94d99 commit 99e6d88

File tree

79 files changed: +45959, -9 lines


README.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -54,7 +54,7 @@ Additionally, if you want to run unit tests for python, you need the following d
 Assuming that `SPARK_HOME` is set, you can use PySpark like any other Spark package.
 
 ```bash
-$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc2-s_2.11
+$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc3-s_2.11
 ```
 
 Here is a small program that uses Tensorflow to add 3 to an existing column.
@@ -152,7 +152,7 @@ The scala support is a bit more limited than python. In scala, operations can be
 You simply use the published package:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc2
+$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc3
 ```
 
 Here is the same program as before:
@@ -202,14 +202,14 @@ build/sbt distribution/spDist
 Assuming that SPARK_HOME is set and that you are in the root directory of the project:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc2.jar
+$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc3.jar
 ```
 
 If you want to run the python version:
 
 ```bash
-PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc2.jar \
-$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc2.jar
+PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc3.jar \
+$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc3.jar
 ```
 
 ## Acknowledgements
````

project/Build.scala

Lines changed: 2 additions & 4 deletions

```diff
@@ -11,7 +11,7 @@ object Shading extends Build {
 
 
   lazy val commonSettings = Seq(
-    version := "0.2.9-rc2",
+    version := "0.2.9-rc3",
     name := "tensorframes",
     scalaVersion := sys.props.getOrElse("scala.version", "2.11.8"),
     organization := "databricks",
@@ -51,9 +51,7 @@ object Shading extends Build {
     "com.typesafe.scala-logging" %% "scala-logging-api" % "2.1.2",
     "com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2",
     // TensorFlow dependencies
-    "org.tensorflow" % "tensorflow" % targetTensorFlowVersion,
-    "org.tensorflow" % "proto" % targetTensorFlowVersion,
-    "org.tensorflow" % "libtensorflow" % targetTensorFlowVersion
+    "org.tensorflow" % "tensorflow" % targetTensorFlowVersion
   )
 
   lazy val testDependencies = Seq(
```
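The commit message mentions shading `google.protobuf` classes in addition to `com.google.protobuf` classes. With sbt-assembly (which this build uses for its fat jar), such shading is normally expressed as package-rename rules. A minimal sketch, assuming sbt-assembly's `ShadeRule` API; the `org.tensorframes.shaded` target package name is a hypothetical illustration, not taken from this commit:

```scala
// Sketch of sbt-assembly shade rules covering both protobuf package roots,
// so tensorframes' bundled protobuf cannot clash with the protobuf version
// already on the Spark classpath. Target package name is hypothetical.
assemblyShadeRules in assembly := Seq(
  // The usual Java protobuf runtime classes live under com.google.protobuf.
  ShadeRule.rename("com.google.protobuf.**" -> "org.tensorframes.shaded.com.google.protobuf.@1").inAll,
  // Some generated classes live under the bare google.protobuf root and must
  // be renamed as well, which is the addition this commit describes.
  ShadeRule.rename("google.protobuf.**" -> "org.tensorframes.shaded.google.protobuf.@1").inAll
)
```

Shading both roots matters because renaming only `com.google.protobuf` would leave the bare `google.protobuf` classes unrelocated, still able to conflict at runtime.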

0 commit comments
