You need to pass multiple -D: params in this case, e.g. -D:spark.driver.memory=8g -D:spark.executor.memory=4g. Re: the NullReferenceException: because of the way Mahout spins up its Spark context, spark.master=yarn-cluster isn’t supported. As of Spark 1.4 this gives a more useful error message – https://issues.apache.org/jira/browse/SPARK-7504.
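For concreteness, a full invocation might look like the sketch below (the spark-itemsimilarity driver, the master URL, and the input/output paths are placeholders – substitute your own driver and values):

    mahout spark-itemsimilarity \
      --input /path/to/input.csv \
      --output /path/to/output \
      --master spark://your-master-host:7077 \
      -D:spark.driver.memory=8g \
      -D:spark.executor.memory=4g

Each Spark property gets its own -D: flag rather than being combined into one.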