Spark Streaming gets killed after 2 hours




My requirement is to run Spark Streaming 24x7. I am trying to accomplish this using both standalone mode and YARN cluster mode. The problem is that the streaming session gets killed after a period of time.

I stream about 300,000 records in half an hour through Kafka, apply action and transformation logic, and use updateStateByKey to keep historical data. In this case the streaming app gets killed within 1 hour.
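For context, updateStateByKey maintains per-key state across batches by applying an update function to each key's new values and its previous state, and the state keeps growing unless the function returns None for stale keys (it also requires checkpointing to be enabled). A minimal pure-Python sketch of that per-batch semantics, with illustrative function names that are not Spark's API:

```python
def update_count(new_values, prev_state):
    """Mimics an updateStateByKey update function: folds the
    batch's new values into the running per-key total.
    Returning None drops the key from the state."""
    if not new_values and prev_state is None:
        return None
    return sum(new_values) + (prev_state or 0)

def run_batch(state, batch):
    """Apply the update function to every key that appears in
    the current batch or is already present in the state, the
    way Spark invokes the update function each interval."""
    keys = set(state) | set(batch)
    new_state = {}
    for key in keys:
        result = update_count(batch.get(key, []), state.get(key))
        if result is not None:
            new_state[key] = result
    return new_state

# Two simulated micro-batches of (key, values) records:
state = {}
state = run_batch(state, {"a": [1, 2], "b": [3]})
state = run_batch(state, {"a": [4]})
# state is now {"a": 7, "b": 3} -- note "b" is still carried
# even though it received no new records in the second batch
```

Because every key ever seen is revisited each batch, state (and memory use) grows over a long-running job unless keys are explicitly expired, which is one thing worth checking in a 24x7 deployment like this.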

I get an array-out-of-bounds exception and the task is killed.

I also started the streaming app without streaming any data, to test whether Spark itself would keep running 24x7. It again got killed after 2 hours, with no log traces; I only knew the job had been killed from the UI at localhost:8080.

The workers were no longer available, and then I lost the workers:

14/10/29 13:52:03 ERROR TaskSchedulerImpl: Lost executor 1 on 172.18.152.36: worker lost
14/10/29 13:52:04 ERROR TaskSchedulerImpl: Lost executor 0 on 172.18.152.36: worker lost

In both of the above scenarios I am not able to keep Spark Streaming running 24/7. Is this possible? If so, how is it done? Please share your thoughts.

apache-spark spark-streaming
