
DataNode installation problem with Ambari Server 1.7

I've installed Ambari Server 1.7 with two slave nodes. Everything goes well until the Install, Start and Test step, where I get the error below for my DataNode.

I've searched a lot but couldn't find a solution. All my nodes are VirtualBox VMs. Any comments will be appreciated.
Thank you :)


stderr:
2015-11-28 18:40:03,985 - Error while executing command 'install':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/PHD/2.0.6/services/HDFS/package/scripts/datanode.py", line 29, in install
self.install_packages(env, params.exclude_packages)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 188, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
self.install_package(package_name)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
shell.checked_call(cmd)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 36, in checked_call
return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_3_0_*' returned 1. Error: Package: zookeeper_3_0_0_0_249-3.4.6.3.0.0.0-249.noarch (PHD-3.0)
Requires: update-alternatives
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: insserv
Error: Package: hadoop_3_0_0_0_249-hdfs-fuse-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: libfuse2
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: netcat-openbsd
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
stdout:
2015-11-28 18:40:01,141 - Could not verify stack version by calling '/usr/bin/distro-select versions > /tmp/tmpu_RGe8'. Return Code: 1, Output: .
2015-11-28 18:40:01,146 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10 http://ambari.localdomain:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-11-28 18:40:01,152 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10 http://ambari.localdomain:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2015-11-28 18:40:01,152 - Group['hadoop'] {'ignore_failures': False}
2015-11-28 18:40:01,153 - Modifying group hadoop
2015-11-28 18:40:01,168 - Group['users'] {'ignore_failures': False}
2015-11-28 18:40:01,168 - Modifying group users
2015-11-28 18:40:01,189 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-11-28 18:40:01,189 - Modifying user ambari-qa
2015-11-28 18:40:01,200 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-11-28 18:40:01,200 - Modifying user zookeeper
2015-11-28 18:40:01,207 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-11-28 18:40:01,208 - Modifying user hdfs
2015-11-28 18:40:01,218 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-11-28 18:40:01,219 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-11-28 18:40:01,226 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2015-11-28 18:40:01,226 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-11-28 18:40:01,227 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-11-28 18:40:01,235 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-11-28 18:40:01,249 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2015-11-28 18:40:01,264 - Repository['PHD-3.0'] {'base_url': 'http://ambari.localdomain/PHD-3.0.0.0', 'action': ['create'], 'components': [u'PHD', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'PHD', 'mirror_list': None}
2015-11-28 18:40:01,271 - File['/etc/yum.repos.d/PHD.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-11-28 18:40:01,272 - Repository['PHD-UTILS-1.1.0.20'] {'base_url': 'http://ambari.localdomain/PHD-UTILS-1.1.0.20', 'action': ['create'], 'components': [u'PHD-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'PHD-UTILS', 'mirror_list': None}
2015-11-28 18:40:01,274 - File['/etc/yum.repos.d/PHD-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-11-28 18:40:01,275 - Repository['PADS-1.3.0.0'] {'base_url': 'http://ambari.localdomain/PADS-1.3.0.0', 'action': ['create'], 'components': [u'PADS-1.3.0.0', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'PADS-1.3.0.0', 'mirror_list': None}
2015-11-28 18:40:01,279 - File['/etc/yum.repos.d/PADS-1.3.0.0.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-11-28 18:40:01,280 - Package['unzip'] {}
2015-11-28 18:40:01,485 - Skipping installing existent package unzip
2015-11-28 18:40:01,485 - Package['curl'] {}
2015-11-28 18:40:01,693 - Skipping installing existent package curl
2015-11-28 18:40:01,693 - Package['distro-select'] {}
2015-11-28 18:40:01,924 - Skipping installing existent package distro-select
2015-11-28 18:40:01,928 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry 10 http://ambari.localdomain:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-11-28 18:40:01,935 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; curl -kf -x "" --retry 10 http://ambari.localdomain:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] due to not_if
2015-11-28 18:40:01,935 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
2015-11-28 18:40:01,941 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
2015-11-28 18:40:02,102 - Could not verify stack version by calling '/usr/bin/distro-select versions > /tmp/tmpyC6N0e'. Return Code: 1, Output: .
2015-11-28 18:40:02,106 - Package['hadoop_3_0_*'] {}
2015-11-28 18:40:02,307 - Installing package hadoop_3_0_* ('/usr/bin/yum -d 0 -e 0 -y install hadoop_3_0_*')
2015-11-28 18:40:03,985 - Error while executing command 'install':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/PHD/2.0.6/services/HDFS/package/scripts/datanode.py", line 29, in install
self.install_packages(env, params.exclude_packages)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 188, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
self.install_package(package_name)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
shell.checked_call(cmd)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 36, in checked_call
return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_3_0_*' returned 1. Error: Package: zookeeper_3_0_0_0_249-3.4.6.3.0.0.0-249.noarch (PHD-3.0)
Requires: update-alternatives
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: insserv
Error: Package: hadoop_3_0_0_0_249-hdfs-fuse-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: libfuse2
Error: Package: hadoop_3_0_0_0_249-2.6.0.3.0.0.0-249.x86_64 (PHD-3.0)
Requires: netcat-openbsd
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
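For reference, the unresolved dependencies can be pulled straight out of the yum error text. A minimal sketch (the /tmp path is illustrative, and the log excerpt is inlined here rather than read from the real Ambari output):

```shell
# Save the "Requires:" lines from the yum failure above (inlined for
# illustration; in practice, paste the stderr shown in the post).
cat > /tmp/yum_err.txt <<'EOF'
Requires: update-alternatives
Requires: insserv
Requires: libfuse2
Requires: netcat-openbsd
EOF

# List the unique missing dependencies.
grep 'Requires:' /tmp/yum_err.txt | awk '{print $2}' | sort -u
# → insserv
#   libfuse2
#   netcat-openbsd
#   update-alternatives
```

Notably, all four are SUSE/Debian-style package names; on CentOS/RHEL the corresponding capabilities are usually provided by chkconfig (the alternatives tool), initscripts, fuse-libs, and nmap-ncat/nc. That suggests it may be worth checking whether the PHD repo configured in /etc/yum.repos.d/PHD.repo points at the SLES build of the stack rather than the RHEL one.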
