Channel: Hortonworks » All Replies

ambari 1.6.1 — install of hdfs fails

My stack: CentOS 6, EPEL, Postgres, Ambari 1.6.1, HDP 2.1, etc. Iptables and SELinux are disabled.
Configuration: 3 nodes; a standalone Postgres instance for Ambari/Hive/Oozie.
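Before retrying, it may be worth double-checking on every node that SELinux and iptables really are off. A minimal sketch using standard CentOS 6 commands (nothing Ambari-specific):

```shell
#!/bin/sh
# Sketch: sanity-check that SELinux and iptables are actually off on this
# node. Ambari installs commonly fail when either is still active.
selinux_mode() {
  # getenforce prints Enforcing/Permissive/Disabled; treat a missing
  # binary as "unknown" rather than failing.
  if command -v getenforce >/dev/null 2>&1; then
    getenforce
  else
    echo unknown
  fi
}

echo "SELinux: $(selinux_mode)"   # want: Disabled (or at least Permissive)
service iptables status 2>/dev/null || echo "iptables: not running"
```

Run it on each of the three nodes; anything other than `Disabled` / `iptables: not running` is worth fixing before re-running the install.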

For ambari-server setup I use the defaults, except for:

  • create the daemon user ‘ambari’ (the same problem occurs when I use root)
  • use a standalone Postgres DB

During cluster create I use all defaults except these:

  • unselect Hive, HBase, Storm
  • slaves – nodes 2 & 3 run the DataNode and NodeManager; all nodes run the clients
  • customize – provide the required config info for Oozie and Nagios

The cluster create (via the Ambari web UI) succeeds with warnings. Some services (Ganglia, ZooKeeper) are installed and started successfully, but HDFS fails with this error:

2015-06-29 14:14:06,513 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS/package/scripts/namenode.py", line 39, in start
    namenode(action="start")
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS/package/scripts/hdfs_namenode.py", line 45, in namenode
    create_log_dir=True
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS/package/scripts/utils.py", line 62, in service
    not_if=service_is_up
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 239, in action_run
    raise ex
Fail: Execution of 'ulimit -c unlimited; export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start namenode' returned 127. -bash: /usr/lib/hadoop/sbin/hadoop-daemon.sh: No such file or directory

The directory ‘/usr/lib/hadoop/sbin’ does not even exist on any node!
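One way to narrow this down is to search each node for wherever hadoop-daemon.sh actually landed. A rough sketch; the candidate directories are only guesses at common HDP package layouts, so adjust them to your install:

```shell
#!/bin/sh
# Sketch: locate hadoop-daemon.sh, since the path the start command
# expects (/usr/lib/hadoop/sbin) is missing. The directories passed in
# below are assumptions about typical HDP 2.1 locations.
find_script() {
  name=$1; shift
  for d in "$@"; do
    if [ -e "$d/$name" ]; then
      echo "$d/$name"
      return 0
    fi
  done
  return 1
}

find_script hadoop-daemon.sh \
  /usr/lib/hadoop/sbin /usr/lib/hadoop/bin /usr/lib/hadoop-hdfs/bin \
  || echo "hadoop-daemon.sh not found in the usual locations"
```

If nothing turns up, `rpm -ql hadoop | grep hadoop-daemon.sh` on each node would show whether the package that provides the script was installed at all.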

Anyone got ideas about how to fix this?

Also, is Ambari 1.6.1 solid, or should I start over with Ambari 1.7 and HDP 2.2?

