Channel: Hortonworks » All Replies

wrong file permissions on /etc/hadoop


Hi,
I'm doing an Ambari-based multi-node install. For some reason the file permissions are not set properly and do not allow hdfs and the other service users to read and write in their folders.

$ su hdfs -c 'ls /etc/hadoop/conf/'
ls: cannot open directory /etc/hadoop/conf/: Permission denied
$ chown -hR hdfs:hadoop /etc/hadoop/
$ su hdfs -c 'ls /etc/hadoop/conf/'
capacity-scheduler.xml      dfs_data_dir_mount.hist  hadoop-metrics2.properties  health_check      mapred-site.xml     task-log4j.properties  yarn-site.xml
commons-logging.properties  dfs.exclude              hadoop-policy.xml           log4j.properties  slaves              yarn-env.sh
core-site.xml               hadoop-env.sh            hdfs-site.xml               mapred-env.sh     taskcontroller.cfg  yarn.exclude
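It can help to record the exact mode and ownership before and after a fix like the `chown` above; `stat` shows both in one line. A minimal sketch on a scratch directory (the temp path is only an illustration; on the cluster the real target is /etc/hadoop/conf):

```shell
# Inspect mode and ownership of a directory, then flip the group/other
# bits the way a chmod/chown fix would on /etc/hadoop/conf.
tmp=$(mktemp -d)
chmod 700 "$tmp"                 # owner-only: mimics the broken state
stat -c '%a %U:%G %n' "$tmp"     # prints e.g. "700 root:root /tmp/..."
chmod 755 "$tmp"                 # group and other get read + execute
stat -c '%a %U:%G %n' "$tmp"     # prints e.g. "755 root:root /tmp/..."
rm -rf "$tmp"
```

Note that a plain `ls` needs both the read bit (to list entries) and the execute bit (to traverse) on the directory, which is why a 700 directory gives "Permission denied" to every user except the owner.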

When I start the NameNode service from Ambari, it changes the folder permissions back:


$ su hdfs -c 'ls /etc/hadoop/conf/'
ls: cannot open directory /etc/hadoop/conf/: Permission denied
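One possible cause worth ruling out (an assumption, not something the post confirms): a restrictive umask for root on the hosts. The Ambari installation docs call for umask 0022, and a hardened 077 umask makes every directory Ambari recreates on service start come out owner-only, reproducing the symptom after each restart. A small sketch of how the umask shapes the mode of newly created directories:

```shell
# Demonstrate how the process umask determines the mode of new directories.
# A 077 umask yields 700 (owner-only), which would explain the recurring
# "Permission denied" for non-owner users; 022 yields the expected 755.
tmp=$(mktemp -d)
(
  cd "$tmp"
  umask 077 && mkdir strict && stat -c '%a' strict   # prints 700
  umask 022 && mkdir open   && stat -c '%a' open     # prints 755
)
rm -rf "$tmp"
```

If `umask` on the affected hosts prints 0077 (check the shell Ambari agents run under, not just your login shell), setting it to 0022 and restarting the services may make the fix stick.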
