
Required executor memory is above the max threshold of this cluster


Hello Everyone,

We would like to know how to change the executor memory threshold. We installed the SAP HANA Spark Controller on our HDP 2.3.2 cluster (a requirement for SAP HANA VORA) and tried to start it:

$ ./hanaes start
Starting HANA Spark Controller … Class path is /usr/sap/spark/controller/bin/../conf:/usr/hdp/2.3.2.0-2950/hadoop/conf:/etc/hive/conf:../*:../lib/*:/usr/hdp/2.3.2.0-2950/hadoop/*:/usr/hdp/2.3.2.0-2950/hadoop/lib/*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/*:/usr/hdp/2.3.2.0-2950/hadoop-hdfs/lib/*
STARTED

We then checked the logs to confirm that it started successfully, but found the following error:

2015-11-26 16:52:46,734 [ERROR] Error initializing SparkContext.
java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (512 MB) of this cluster!
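
From the arithmetic in the message, the requested executor memory (1024 MB) plus the YARN memory overhead (384 MB), i.e. 1408 MB in total, exceeds the cluster's maximum container size of 512 MB. We suspect this threshold is YARN's yarn.scheduler.maximum-allocation-mb. A sketch of the yarn-site.xml change we are considering (the 2048 MB values are only an assumption for illustration, not a tested recommendation):

<!-- yarn-site.xml: raise the largest container YARN will grant.
     The values below are illustrative assumptions; they must be at
     least 1024 + 384 = 1408 MB to satisfy the executor request. -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
<property>
  <!-- Total memory a NodeManager may hand out; should be >= the value above. -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>2048</value>
</property>

Alternatively, lowering the controller's spark.executor.memory below the current 512 MB maximum might avoid the error. On HDP these YARN properties are normally managed through Ambari rather than by editing the file directly, so please correct us if a different setting governs this threshold.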

Kindly advise. Thank you!

Regards,
Rebella

