java.lang.OutOfMemoryError: GC overhead limit exceeded #115
Description
I am using:
Spark 1.6.0
spark-rabbitmq 0.4.0
My program ran for 24 hours before this error was reported. Please help me.
My error log:

```
17/09/13 17:15:43 WARN spark.HeartbeatReceiver: Removing executor 6 with no recent heartbeats: 178850 ms exceeds timeout 120000 ms
17/09/13 17:15:43 ERROR cluster.YarnScheduler: Lost executor 6 on spark1: Executor heartbeat timed out after 178850 ms
17/09/13 17:15:43 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 7.0 (TID 75, spark1): ExecutorLostFailure (executor 6 exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 178850 ms
17/09/13 17:15:43 INFO scheduler.DAGScheduler: Executor lost: 6 (epoch 544)
17/09/13 17:15:43 INFO storage.BlockManagerMasterEndpoint: Trying to remove executor 6 from BlockManagerMaster.
17/09/13 17:15:45 INFO cluster.YarnClientSchedulerBackend: Requesting to kill executor(s) 6
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark2/192.168.155.3:50701
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark3/192.168.155.4:37321
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark3/192.168.155.4:37317
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 WARN server.TransportChannelHandler: Exception in connection from spark2/192.168.155.3:57252
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:52 INFO storage.BlockManagerMasterEndpoint: Removing block manager BlockManagerId(6, spark1, 54732)
17/09/13 17:15:52 INFO storage.BlockManagerMaster: Removed 6 successfully in removeExecutor
17/09/13 17:15:52 INFO storage.BlockManagerInfo: Added input-2-1505294133600 in memory on spark2:53361 (size: 9.1 KB, free: 1047.9 MB)
17/09/13 17:15:52 ERROR server.TransportRequestHandler: Error sending result RpcResponse{requestId=6104341188587013298, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=353 cap=353]}} to spark3/192.168.155.4:37317; closing connection
java.nio.channels.ClosedChannelException
17/09/13 17:15:52 INFO storage.BlockManagerInfo: Added input-3-1505294135000 in memory on spark3:34916 (size: 1568.0 B, free: 819.1 MB)
17/09/13 17:15:52 ERROR server.TransportRequestHandler: Error sending result RpcResponse{requestId=7579983330511639701, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=47 cap=47]}} to spark2/192.168.155.3:57252; closing connection
java.nio.channels.ClosedChannelException
17/09/13 17:15:52 ERROR server.TransportRequestHandler: Error sending result RpcResponse{requestId=5632111837789001813, body=NioManagedBuffer{buf=java.nio.HeapByteBuffer[pos=0 lim=47 cap=47]}} to spark3/192.168.155.4:37321; closing connection
java.nio.channels.ClosedChannelException
17/09/13 17:15:56 WARN server.TransportChannelHandler: Exception in connection from spark1/192.168.155.2:41920
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:15:56 ERROR cluster.YarnClientSchedulerBackend: Yarn application has already exited with state FINISHED!
17/09/13 17:15:56 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.155.2:4040
17/09/13 17:16:00 WARN server.TransportChannelHandler: Exception in connection from spark2/192.168.155.3:57212
java.lang.OutOfMemoryError: GC overhead limit exceeded
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 4 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 1508 (start at SparkStremingReadRabbitMQ.scala:77) failed in 6207.798 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@1ab59f28)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 5 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.231 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@772c8704)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 3
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 2 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.642 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@4b86d4b2)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ShuffleMapStage 1599 (text at SparkStremingReadRabbitMQ.scala:72) failed in 1592.898 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@1ae3ac4)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 6 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.065 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@37d4a906)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 7 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83399.925 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@24531c5f)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 4 (start at SparkStremingReadRabbitMQ.scala:77) failed in 83400.364 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@590d6f23)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: ResultStage 1601 (start at SparkStremingReadRabbitMQ.scala:77) failed in 698.533 s
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerStageCompleted(org.apache.spark.scheduler.StageInfo@290516dc)
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(4,1505294160255,JobFailed(org.apache.spark.SparkException: Job 4 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(6,1505294160255,JobFailed(org.apache.spark.SparkException: Job 6 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 6 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 5
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 995 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 6
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(995,1505294160257,JobFailed(org.apache.spark.SparkException: Job 995 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:830)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:829)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.makeRDD(SparkContext.scala:829)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint.org$apache$spark$streaming$scheduler$ReceiverTracker$ReceiverTrackerEndpoint$$startReceiver(ReceiverTracker.scala:588)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$receive$1.applyOrElse(ReceiverTracker.scala:477)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 3 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 2
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(3,1505294160258,JobFailed(org.apache.spark.SparkException: Job 3 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR scheduler.ReceiverTracker: Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job 5 cancelled because SparkContext was shut down
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:806)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:804)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:804)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1581)
at org.apache.spark.SparkContext$$anonfun$stop$9.apply$mcV$sp(SparkContext.scala:1751)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1230)
at org.apache.spark.SparkContext.stop(SparkContext.scala:1750)
at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:147)
17/09/13 17:16:00 INFO scheduler.ReceiverTracker: Restarting Receiver 4
17/09/13 17:16:00 ERROR scheduler.LiveListenerBus: SparkListenerBus has already stopped! Dropping event SparkListenerJobEnd(5,1505294160261,JobFailed(org.apache.spark.SparkException: Job 5 cancelled because SparkContext was shut down))
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:830)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:829)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.makeRDD(SparkContext.scala:829)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint.org$apache$spark$streaming$scheduler$ReceiverTracker$ReceiverTrackerEndpoint$$startReceiver(ReceiverTracker.scala:588)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$receive$1.applyOrElse(ReceiverTracker.scala:477)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:830)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:829)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.makeRDD(SparkContext.scala:829)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint.org$apache$spark$streaming$scheduler$ReceiverTracker$ReceiverTrackerEndpoint$$startReceiver(ReceiverTracker.scala:588)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$receive$1.applyOrElse(ReceiverTracker.scala:477)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The currently active SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:107)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:830)
at org.apache.spark.SparkContext$$anonfun$makeRDD$2.apply(SparkContext.scala:829)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:725)
at org.apache.spark.SparkContext.makeRDD(SparkContext.scala:829)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint.org$apache$spark$streaming$scheduler$ReceiverTracker$ReceiverTrackerEndpoint$$startReceiver(ReceiverTracker.scala:588)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverTrackerEndpoint$$anonfun$receive$1.applyOrElse(ReceiverTracker.scala:477)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:116)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:204)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:215)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/09/13 17:16:00 INFO scheduler.DAGScheduler: Job 1055 failed: text at SparkStremingReadRabbitMQ.scala:72, took 1597.324995 s
17/09/13 17:16:00 ERROR netty.Inbox: Ignoring error
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:83)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ$.main(SparkStremingReadRabbitMQ.scala:20)
com.MainProject.DongHuan.SparkStremingReadRabbitMQ.main(SparkStremingReadRabbitMQ.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
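
For context, "GC overhead limit exceeded" means the JVM is spending nearly all of its time in garbage collection because the heap is almost full; in a long-running Streaming job this usually points to received blocks or batches accumulating faster than they are processed. Below is a minimal Scala sketch of the settings that are often tuned in this situation on Spark 1.6. The object name, batch interval, and all values are illustrative assumptions, not taken from the actual job.

```scala
// Minimal sketch (assumed values, not from the original job): Spark 1.6 settings
// commonly adjusted when a long-running Streaming job fails with
// "GC overhead limit exceeded".
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MemoryTuningSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SparkStremingReadRabbitMQ")
      // Larger executor heaps reduce GC pressure; in yarn-client mode the driver
      // heap must instead be raised via spark-submit's --driver-memory flag,
      // because the driver JVM is already running when this code executes.
      .set("spark.executor.memory", "4g")
      // Backpressure caps the receiver ingest rate so unprocessed blocks do not
      // keep piling up in memory while batches fall behind.
      .set("spark.streaming.backpressure.enabled", "true")
      .set("spark.streaming.receiver.maxRate", "1000")

    // Batch interval is an assumed placeholder.
    val ssc = new StreamingContext(conf, Seconds(10))
    // ... create the RabbitMQ receiver streams and transformations here ...

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Whether these particular settings help depends on where the heap is actually filling up (driver vs. executors), which the executor GC logs or a heap dump would show.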