Hi All,
I have the same issue on a Hadoop cluster set up on EC2 using Ambari. I SSH into one of the nodes where the Spark client is installed (as shown by the Ambari Web UI) and then try:
[ec2-user@ip-xx-xx-xx-xx ~] su spark
Password:
Without switching to user spark, the spark-submit command (even with sudo) fails with a permission error:
Error: application failed with exception
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.sparkStaging/application_1439970529432_0027":hdfs:hdfs:drwxr-xr-x
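From the error message, spark-submit is trying to create a .sparkStaging directory under the submitting user's HDFS home directory (/user/root when run via sudo), and that write is being denied. A quick way to inspect this, using only the paths that appear in the error above:

```shell
# Illustrative checks, assuming the paths from the error message:
hdfs dfs -ls /user          # does /user/root (or /user/ec2-user) exist, and who owns it?
hdfs dfs -ls /user/root     # spark-submit creates .sparkStaging here when run as root
```

If /user/root (or /user/ec2-user) is missing or owned by hdfs:hdfs with drwxr-xr-x, any user other than hdfs will fail this write, which would explain why switching to a user that has an HDFS home directory behaves differently.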
The cluster has HDP 2.3 installed using Ambari 2.1.
spark-shell works fine as user ec2-user. Why is switching to user spark required to submit Spark jobs?
Any pointers on this would help a lot.
Regards,
Ram