
Reply To: Spark Pi Example FAILED exitCode 11 : Max number of executor failures reached

Hi Vinay,
No, it's not the sandbox. I installed HDP 2.2 on CentOS 6.6 (64-bit) as a single node, then installed and configured Apache Spark 1.2.0 as per http://hortonworks.com/hadoop-tutorial/using-apache-spark-hdp/ .
The property “max.executor.failures / (spark.yarn.max.executor.failures)” is not in the config file “spark-defaults.conf” located in “/spark-1.2.0.2.2.0.0-82-bin-2.6.0.2.2.0.0-2041/conf”. It has only these two properties: “spark.driver.extraJavaOptions -Dhdp.version” and “spark.yarn.am.extraJavaOptions -Dhdp.version”.
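For context, the whole file currently amounts to just those two lines, roughly like this (the hdp.version value shown is only my guess based on the build directory name 2.2.0.0-2041; use whatever matches your HDP install):

    spark.driver.extraJavaOptions   -Dhdp.version=2.2.0.0-2041
    spark.yarn.am.extraJavaOptions  -Dhdp.version=2.2.0.0-2041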

However, while waiting for a response from the forum, I managed to resolve this issue in two ways (full command shown after this list):
(1.) Added the option “--conf spark.eventLog.overwrite=true” to “./bin/spark-submit” and it ran without any errors.
(2.) Rebooted my VM and ran the same program with the same command as per http://hortonworks.com/hadoop-tutorial/using-apache-spark-hdp/, without “--conf spark.eventLog.overwrite=true”, and it also ran without any errors.
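For the record, the command for (1.) was roughly the SparkPi submit from the tutorial with the extra flag added; the class, master, memory, and executor values below are the tutorial's example settings, so adjust them to your own setup:

    ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
        --master yarn-cluster \
        --num-executors 3 \
        --driver-memory 512m \
        --executor-memory 512m \
        --executor-cores 1 \
        --conf spark.eventLog.overwrite=true \
        lib/spark-examples*.jar 10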

Please advise whether I should add some of the properties mentioned in https://spark.apache.org/docs/1.2.0/running-on-yarn.html to “spark-defaults.conf”.
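For example, I was thinking of adding something along these lines to “spark-defaults.conf” if it turns out to be needed (the property names are from the running-on-yarn page, but the values here are just placeholders I picked, not recommendations):

    spark.yarn.max.executor.failures    6
    spark.yarn.executor.memoryOverhead  384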

thanks

