Hi Nigel,
I have installed HDP 2.2 via Ambari 1.7 on a single-node VM, and I am facing the same error you mentioned in this post for the HDFS DataNode.
The error message in the Hadoop DataNode log file is:
FATAL datanode.DataNode (BPServiceActor.java:run(878)) - Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to hdp-hw-sandbox.cloudapp.net/1xx.11x.xx.xx:xxxx. Exiting.
org.apache.hadoop.util.DiskChecker$DiskErrorException: Invalid volume failure config value: 1
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.<init>(FsDatasetImpl.java:257)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetFactory.newInstance(FsDatasetFactory.java:34)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetFactory.newInstance(FsDatasetFactory.java:30)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1349)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1301)
at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:314)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:866)
at java.lang.Thread.run(Thread.java:745)
I did a NameNode format and tried again, but no luck.
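In case it helps, I believe the "1" in the exception corresponds to dfs.datanode.failed.volumes.tolerated in my hdfs-site.xml, which (as far as I understand) the FsDatasetImpl constructor compares against the number of configured data directories. Below is a rough sketch of the two properties I assume are involved on my single-node setup; the data directory path is just a placeholder, not copied from my cluster:

<!-- Sketch of the hdfs-site.xml properties I assume are relevant; values are illustrative -->
<property>
  <!-- Only one data directory on this single-node VM (path is a placeholder) -->
  <name>dfs.datanode.data.dir</name>
  <value>/hadoop/hdfs/data</value>
</property>
<property>
  <!-- My understanding: if this is >= the number of data directories (here 1 >= 1),
       the DataNode fails to start with "Invalid volume failure config value" -->
  <name>dfs.datanode.failed.volumes.tolerated</name>
  <value>1</value>
</property>

I am not certain this is the actual cause, so please correct me if I am reading the exception wrong.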
Any help is appreciated.
Thanks a lot