
Updated versions of some tools in the main script, since some mirrors have removed old versions #21

Open · wants to merge 1 commit into master

Conversation

@mmozum commented Nov 4, 2016

The setup fails for some tools, needed this minimal change to get it going.
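
For context, a minimal sketch of the kind of change involved, assuming the version pins sit near the top of stream-bench.sh as overridable shell variables that also drive the mirror download URLs (variable names and version numbers below are illustrative, not the actual diff):

    # Illustrative excerpt, not the actual patch: each tool version is pinned
    # in an overridable variable, and the same value is used to build the
    # mirror download URL, so when a mirror drops an old release the pin must
    # be bumped to a version the mirror still hosts or SETUP fails mid-download.
    KAFKA_VERSION=${KAFKA_VERSION:-"0.8.2.2"}       # e.g. bumped from 0.8.2.1
    REDIS_VERSION=${REDIS_VERSION:-"3.0.5"}
    SCALA_BIN_VERSION=${SCALA_BIN_VERSION:-"2.11"}  # see the discussion below
    STORM_VERSION=${STORM_VERSION:-"0.10.0"}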

@yahoocla commented Nov 4, 2016

Thank you for submitting this pull request, however I do not see a valid CLA on file for you. Before we can merge this request please visit https://yahoocla.herokuapp.com/ and agree to the terms. Thanks! 😄

@yahoocla commented Nov 4, 2016

CLA is valid!

@HassebJ (Contributor) commented Nov 18, 2016

Changing SCALA_BIN_VERSION from 2.10 to 2.11 causes the code to break, and consequently the benchmark will not execute properly even on a single node. The error is presumably because of Kafka; my guess is that bumping its version as well might fix the issue (see the note after the trace for why the Scala version matters).
Attaching the stack trace for the SPARK benchmark executed on a single machine for reference; it is similar for STORM and FLINK as well.

16/11/17 22:27:37 INFO SparkDeploySchedulerBackend: Granted executor ID app-20161117222737-0000/0 on hostPort 192.168.140.52:59604 with 8 cores, 1024.0 MB RAM
16/11/17 22:27:37 INFO AppClient$ClientEndpoint: Executor updated: app-20161117222737-0000/0 is now RUNNING
16/11/17 22:27:37 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
    at spark.benchmark.KafkaRedisAdvertisingStream$.main(AdvertisingSpark.scala:64)
    at spark.benchmark.KafkaRedisAdvertisingStream.main(AdvertisingSpark.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/11/17 22:27:37 INFO SparkContext: Invoking stop() from shutdown hook
16/11/17 22:27:37 INFO SparkUI: Stopped Spark web UI at http://192.168.140.52:4040
16/11/17 22:27:37 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/11/17 22:27:37 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
16/11/17 22:27:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/11/17 22:27:37 INFO MemoryStore: MemoryStore cleared
16/11/17 22:27:37 INFO BlockManager: BlockManager stopped
16/11/17 22:27:37 INFO BlockManagerMaster: BlockManagerMaster stopped
16/11/17 22:27:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/11/17 22:27:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/11/17 22:27:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/11/17 22:27:37 INFO SparkContext: Successfully stopped SparkContext
16/11/17 22:27:37 INFO ShutdownHookManager: Shutdown hook called
16/11/17 22:27:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-569e7add-4144-4589-83a9-b1eb130bfbc9
16/11/17 22:27:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-569e7add-4144-4589-83a9-b1eb130bfbc9/httpd-c323c571-2188-430b-9ae7-6365e1dba32e
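
A note on the trace above: a NoSuchMethodError on scala.Predef$.ArrowAssoc is a classic symptom of a Scala binary-version mismatch, since Scala 2.10 and 2.11 are not binary compatible and a jar compiled against one cannot safely run on a scala-library from the other. Bumping SCALA_BIN_VERSION changes which Kafka build the script downloads (the Scala version is embedded in Kafka's artifact name), but every component on the classpath, including the Spark distribution and the benchmark jars, has to agree on the same Scala binary version. A sketch of that coupling, assuming the download logic in stream-bench.sh looks roughly like this (the fetch helper and mirror variable names are assumptions, not verified against the script):

    # Illustrative only: Kafka's tarball name embeds the Scala binary version,
    # so SCALA_BIN_VERSION selects which build gets fetched. If Spark or the
    # benchmark jars were built for a different Scala version, the run dies
    # with a NoSuchMethodError like the one in the trace above.
    SCALA_BIN_VERSION="2.11"
    KAFKA_VERSION="0.8.2.2"    # example value
    KAFKA_DIR="kafka_$SCALA_BIN_VERSION-$KAFKA_VERSION"
    fetch_untar_file "$KAFKA_DIR.tgz" "$APACHE_MIRROR/kafka/$KAFKA_VERSION/$KAFKA_DIR.tgz"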

@ivanliu commented Nov 22, 2016

I had the same issue; this change did work for me. Please merge it.

@revans2 (Collaborator) commented Nov 22, 2016

When I run the tests with this patch I still see an empty seen.txt on Spark. @ivanliu, what platform are you running on where this all works?

@ivanliu commented Nov 22, 2016

@revans2 I'm running the script on Mac OS (10.11.2).

Without this patch, some components cannot be installed with stream-bench.sh SETUP. I tried STORM_TEST and it seemed to work, but SPARK_TEST gave me an empty seen.txt.
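
For anyone reproducing the report above, a minimal sketch of the check being described, assuming the benchmark writes its per-window results to seen.txt and updated.txt under a data/ directory in the checkout (paths assumed, adjust to your setup):

    # Illustrative repro: run the Spark variant end-to-end, then check whether
    # any windows were actually recorded; zero lines in seen.txt means the
    # benchmark ran but processed no events.
    ./stream-bench.sh SPARK_TEST
    wc -l data/seen.txt data/updated.txt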

@revans2 (Collaborator) commented Nov 22, 2016

@ivanliu that is the real problem: we can run the benchmark, but it does not really do anything, and we are still trying to debug why that is.

@HassebJ (Contributor) commented Nov 22, 2016

@revans2 @ivanliu Kindly check out this PR; it is tested to work on Linux, and the issue of empty updated.txt and seen.txt is also fixed.
