How to shut down/kill a Spark Streaming application when one of the jobs fails


I am running a Spark Streaming application. A few times, one of the jobs has failed due to a runtime exception. Spark marks the job as failed and continues to process the next streaming batch. Is there a parameter I can set that tells Spark to kill the application (and not process the next streaming batch) if one of the jobs fails? I am using Spark 1.4.1 in standalone cluster mode.

Inside your program, you could:

  1. Throw an exception and let it bubble up to main, surrounding the main body in a try/catch (see the sketch after this list).
  2. Put a System.exit(0) wherever you want. If the graceful-shutdown property (spark.streaming.stopGracefullyOnShutdown) is set to true, it will close the Spark processes gracefully and end the program. Here is a reference.
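
For illustration, here is a minimal sketch in Scala of how both suggestions could look in a Spark 1.x streaming driver. The socket source, batch interval, app name, and process() helper are placeholders I made up, not from the question; the point is only where the try/catch and the exit sit so the application stops instead of moving on to the next batch.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object FailFastStreamingApp {

      // Placeholder for the real per-record work that may throw at runtime.
      def process(line: String): Unit =
        if (line.contains("bad")) throw new RuntimeException(s"bad record: $line")

      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("FailFastStreamingApp")
        val ssc  = new StreamingContext(conf, Seconds(10))

        val lines = ssc.socketTextStream("localhost", 9999)

        lines.foreachRDD { rdd =>
          try {
            rdd.foreach(process) // a failed job surfaces here as a SparkException
          } catch {
            case _: Exception =>
              // Option 2 from the answer: end the whole application right away
              // instead of letting Spark schedule the next batch. A non-zero
              // exit code makes the failure visible to the cluster manager.
              System.exit(1)
          }
        }

        ssc.start()

        // Option 1 from the answer: let the error bubble up and catch it around
        // the main body, then stop the context so no further batches run.
        try {
          ssc.awaitTermination()
        } catch {
          case e: Exception =>
            ssc.stop(stopSparkContext = true, stopGracefully = false)
            throw e
        }
      }
    }

Whether you prefer the exit inside foreachRDD or the try/catch around awaitTermination is mostly a matter of where you want the cleanup logic to live; both prevent the next streaming batch from being processed after a failure.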

