How to shutdown/kill a Spark Streaming application when one of the jobs fails
I am running a Spark Streaming application. Occasionally one of the jobs fails due to a runtime exception. Spark marks the job as failed and continues to process the next streaming batch. Is there a parameter I can set to tell Spark to kill the application (rather than process the next streaming batch) if one of the jobs fails? I am using Spark 1.4.1 in standalone cluster mode.
Inside the program, you could:
- throw the exception and let it bubble up to main, surrounding the main body in a try/catch
- put
System.exit(0)
wherever you want. If the stopGracefully flag is set to true, Spark will finish whatever it is currently processing, shut down gracefully, and end the program. Here is a reference.
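A minimal sketch of that approach, assuming a standard StreamingContext setup; the socket source, batch interval, and process helper below are placeholder examples, not part of the original answer. Exceptions from a failed batch job are re-thrown by awaitTermination() on the driver, so catching there lets you stop the context instead of letting the next batch run:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object FailFastStreamingApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FailFastStreamingApp")
    val ssc = new StreamingContext(conf, Seconds(10)) // example batch interval

    // Example input; replace with your real DStream source.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.foreachRDD { rdd =>
      // A runtime exception here fails the batch job; the error then
      // surfaces on the driver through awaitTermination().
      rdd.foreach(record => process(record))
    }

    try {
      ssc.start()
      ssc.awaitTermination()
    } catch {
      case e: Throwable =>
        // A job failed: stop the streaming context (and the SparkContext)
        // instead of moving on to the next streaming batch.
        ssc.stop(stopSparkContext = true, stopGracefully = true)
        System.exit(1) // or rethrow e if you prefer
    }
  }

  // Hypothetical per-record work that may throw at runtime.
  def process(record: String): Unit = {
    if (record.isEmpty) throw new RuntimeException("bad record")
  }
}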