Channel: Hortonworks » All Replies

Reply To: Unable to start any components on cluster


I resolved the issue. In the .log file I kept seeing:
(DataNode.java:2402)
2015-05-14 11:49:51,657 WARN datanode.DataNode (DataNode.java:checkStorageLocations(2284)) - Invalid dfs.datanode.data.dir /mnt/Data3/hadoop/hdfs :
EPERM: Operation not permitted
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:726)

First off, the hadoop/hdfs directory had never been created. So I created it, ran a chown -R hdfs:hadoop on the hdfs directory, and now the DataNodes start.

So it seems Ambari does not set this up during the install, as I had mistakenly assumed it would. I'll have to go through every component, find where the configurations don't line up, and apply the fixes manually.

Thanks for the guidance.

