This error, which I produced using `datadr::makeDisplay()` on an `hdfsConn` ddo,
```
There were R errors, showing 30:
Warning message:
Autokill is true and terminating job_1441994449703_0070
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
java.io.FileNotFoundException: Cannot access /user/d3p423/tmp/tmp_output-6ae2a6d4f0966b3244a47a8f588f6c36: No such file or directory.
Calls: makeDisplay ... <Anonymous> -> .jrcall -> .jcall -> .jcheck -> .Call
In addition: Warning message:
In Rhipe:::rhwatch.runner(job = job, mon.sec = mon.sec, readback = readback, :
Job failure, deleting output: /user/d3p423/tmp/tmp_output-6ae2a6d4f0966b3244a47a8f588f6c36:
Execution halted
Warning message:
system call failed: Cannot allocate memory
```
is apparently a red herring when there are R errors in the Hadoop job, but the only error described in any detail is the Java error about not being able to access a file. It would be very helpful to get a sense of what the R errors actually are.
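For reference, a minimal sketch of the call pattern that triggers this failure. All paths, display names, and the panel function below are illustrative placeholders, not taken from the original job:

```r
library(datadr)
library(trelliscope)

# Hypothetical HDFS-backed ddo; the path is a placeholder
conn <- hdfsConn("/user/d3p423/data", autoYes = TRUE)
d    <- ddo(conn)

# Hypothetical local VDB to hold the display
vdbConn("my_vdb", autoYes = TRUE)

# makeDisplay() launches the RHIPE/Hadoop job that fails above;
# when a per-task R error occurs, only the downstream Java
# FileNotFoundException is surfaced, not the R error itself
makeDisplay(d,
  name    = "example_display",
  panelFn = function(x) plot(x))  # placeholder panel function
```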