
DataNode not starting after changing the DataNode directories parameter


Hello, I added a new disk to the Hortonworks OracleVM, following this example:

https://muffinresearch.co.uk/adding-more-disk-space-to-a-linux-virtual-machine/

I set the owner of the mounted disk directory to hdfs:hadoop recursively and gave it 777 permissions.
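A sketch of the ownership and permission steps I ran; the demo path below stands in for the real mount point (hypothetically something like /mnt/extra on the Sandbox):

```shell
# Demo directory standing in for the new disk's data directory.
DATADIR="${TMPDIR:-/tmp}/hdfs-data-demo"
mkdir -p "$DATADIR"

# On the Sandbox itself this step would be:
#   chown -R hdfs:hadoop "$DATADIR"
# (omitted here because the hdfs user only exists on the Sandbox)

chmod -R 777 "$DATADIR"

# Verify the permission bits on the directory.
stat -c '%a' "$DATADIR"
```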

I added the mounted disk folder to the DataNode directories (comma-separated) using Ambari. I also tried editing the XML directly.
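For reference, a sketch of what the resulting hdfs-site.xml entry would look like, assuming the default HDP data directory and a new disk mounted at /mnt/extra (hypothetical path):

```xml
<property>
  <name>dfs.datanode.data.dir</name>
  <!-- existing directory plus the new mount, comma-separated -->
  <value>/hadoop/hdfs/data,/mnt/extra/hadoop/hdfs/data</value>
</property>
```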

After the restart, the DataNode always crashes with DiskErrorException: Too many failed volumes.
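For context, the number of data volumes that may fail before the DataNode aborts startup is controlled by dfs.datanode.failed.volumes.tolerated in hdfs-site.xml, which defaults to 0, so a single volume the DataNode cannot use produces exactly this exception. A fragment showing the default:

```xml
<property>
  <name>dfs.datanode.failed.volumes.tolerated</name>
  <!-- default 0: any single failed volume stops the DataNode -->
  <value>0</value>
</property>
```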

Is it possible to add a disk to the Hortonworks Sandbox, and if so, what am I doing wrong?

