Problem installing Hadoop components using Ambari

I'm trying to install a 3-node HDP 2.2 cluster using Ambari. The hosts file, proxy configuration, and ambari-server setup are all done, but I'm stuck at the final install step: when I click Next and the deployment starts installing the components, the nodes fail with the following error:
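
For reference, /etc/hosts on every node follows the usual pattern, with the Ambari server on smt-bi3. The IPs and the other two hostnames below are placeholders, anonymized the same way as the logs:

    # placeholder addresses and names; each node carries the same three entries
    192.168.0.101   smt-bi1.example.com   smt-bi1
    192.168.0.102   smt-bi2.example.com   smt-bi2
    192.168.0.103   smt-bi3.example.com   smt-bi3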

stderr: /var/lib/ambari-agent/data/errors-590.txt

2015-05-04 10:46:46,820 - Error while executing command 'install':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 36, in install
self.install_packages(env, params.exclude_packages)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 289, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 43, in action_install
self.install_package(package_name, self.resource.use_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
return function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 82, in checked_call
return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path, sudo, on_new_line)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 199, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_2_*'' returned 1. Error: Package: hadoop_2_2_4_2_2-hdfs-fuse-2.6.0.2.2.4.2-2.el6.x86_64 (HDP-2.2)
Requires: fuse
Error: Package: hadoop_2_2_4_2_2-2.6.0.2.2.4.2-2.el6.x86_64 (HDP-2.2)
Requires: nc
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
stdout: /var/lib/ambari-agent/data/output-590.txt

2015-05-04 10:46:35,627 - u"Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/']" {'recursive': True}
2015-05-04 10:46:35,814 - u"File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']" {'content': DownloadSource('http://smt-bi3.example.com:8080/resources//UnlimitedJCEPolicyJDK7.zip')}
2015-05-04 10:46:35,915 - Not downloading the file from http://smt-bi3.example.com:8080/resources//UnlimitedJCEPolicyJDK7.zip, because /var/lib/ambari-agent/data/tmp/UnlimitedJCEPolicyJDK7.zip already exists
2015-05-04 10:46:36,083 - u"Group['hadoop']" {'ignore_failures': False}
2015-05-04 10:46:36,084 - Modifying group hadoop
2015-05-04 10:46:36,135 - u"Group['users']" {'ignore_failures': False}
2015-05-04 10:46:36,135 - Modifying group users
2015-05-04 10:46:36,185 - u"Group['knox']" {'ignore_failures': False}
2015-05-04 10:46:36,185 - Modifying group knox
2015-05-04 10:46:36,236 - u"Group['spark']" {'ignore_failures': False}
2015-05-04 10:46:36,236 - Modifying group spark
2015-05-04 10:46:36,286 - u"User['oozie']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:46:36,286 - Modifying user oozie
2015-05-04 10:46:36,338 - u"User['hive']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,338 - Modifying user hive
2015-05-04 10:46:36,390 - u"User['ambari-qa']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:46:36,391 - Modifying user ambari-qa
2015-05-04 10:46:36,442 - u"User['flume']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,443 - Modifying user flume
2015-05-04 10:46:36,497 - u"User['hdfs']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,498 - Modifying user hdfs
2015-05-04 10:46:36,553 - u"User['knox']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,554 - Modifying user knox
2015-05-04 10:46:36,616 - u"User['storm']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,616 - Modifying user storm
2015-05-04 10:46:36,677 - u"User['spark']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,678 - Modifying user spark
2015-05-04 10:46:36,738 - u"User['mapred']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,739 - Modifying user mapred
2015-05-04 10:46:36,801 - u"User['hbase']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,801 - Modifying user hbase
2015-05-04 10:46:36,862 - u"User['tez']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:46:36,863 - Modifying user tez
2015-05-04 10:46:36,925 - u"User['zookeeper']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,925 - Modifying user zookeeper
2015-05-04 10:46:36,985 - u"User['kafka']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:36,985 - Modifying user kafka
2015-05-04 10:46:37,036 - u"User['falcon']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:37,036 - Modifying user falcon
2015-05-04 10:46:37,087 - u"User['sqoop']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:37,087 - Modifying user sqoop
2015-05-04 10:46:37,139 - u"User['yarn']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:37,139 - Modifying user yarn
2015-05-04 10:46:37,191 - u"User['hcat']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:37,191 - Modifying user hcat
2015-05-04 10:46:37,243 - u"User['ams']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:46:37,244 - Modifying user ams
2015-05-04 10:46:37,296 - u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-05-04 10:46:37,612 - u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']" {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-05-04 10:46:37,663 - Skipping u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']" due to not_if
2015-05-04 10:46:37,664 - u"Directory['/hadoop/hbase']" {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2015-05-04 10:46:38,056 - u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-05-04 10:46:38,432 - u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase']" {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2015-05-04 10:46:38,482 - Skipping u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase']" due to not_if
2015-05-04 10:46:38,482 - u"Group['hdfs']" {'ignore_failures': False}
2015-05-04 10:46:38,483 - Modifying group hdfs
2015-05-04 10:46:38,533 - u"User['hdfs']" {'ignore_failures': False, 'groups': [u'hadoop', 'hadoop', 'hdfs', u'hdfs']}
2015-05-04 10:46:38,533 - Modifying user hdfs
2015-05-04 10:46:38,585 - u"Directory['/etc/hadoop']" {'mode': 0755}
2015-05-04 10:46:38,751 - u"Directory['/etc/hadoop/conf.empty']" {'owner': 'root', 'group': 'hadoop', 'recursive': True}
2015-05-04 10:46:38,920 - u"Link['/etc/hadoop/conf']" {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-05-04 10:46:38,972 - Skipping u"Link['/etc/hadoop/conf']" due to not_if
2015-05-04 10:46:38,985 - u"File['/etc/hadoop/conf/hadoop-env.sh']" {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2015-05-04 10:46:39,333 - u"Repository['HDP-2.2']" {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/GA/2.2.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-05-04 10:46:39,347 - u"File['/etc/yum.repos.d/HDP.repo']" {'content': Template('repo_suse_rhel.j2')}
2015-05-04 10:46:39,719 - u"Repository['HDP-UTILS-1.1.0.20']" {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-05-04 10:46:39,724 - u"File['/etc/yum.repos.d/HDP-UTILS.repo']" {'content': Template('repo_suse_rhel.j2')}
2015-05-04 10:46:40,033 - u"Package['unzip']" {}
2015-05-04 10:46:40,453 - Skipping installing existent package unzip
2015-05-04 10:46:40,454 - u"Package['curl']" {}
2015-05-04 10:46:40,866 - Skipping installing existent package curl
2015-05-04 10:46:40,867 - u"Package['hdp-select']" {}
2015-05-04 10:46:41,278 - Skipping installing existent package hdp-select
2015-05-04 10:46:41,279 - u"Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/']" {'recursive': True}
2015-05-04 10:46:41,443 - u"File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz']" {'content': DownloadSource('http://smt-bi3.example.com:8080/resources//jdk-7u67-linux-x64.tar.gz')}
2015-05-04 10:46:41,548 - Not downloading the file from http://smt-bi3.example.com:8080/resources//jdk-7u67-linux-x64.tar.gz, because /var/lib/ambari-agent/data/tmp/jdk-7u67-linux-x64.tar.gz already exists
2015-05-04 10:46:42,340 - u"Directory['/usr/jdk64']" {}
2015-05-04 10:46:42,520 - u"Execute['('chmod', 'a+x', u'/usr/jdk64')']" {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'sudo': True}
2015-05-04 10:46:42,576 - Skipping u"Execute['('chmod', 'a+x', u'/usr/jdk64')']" due to not_if
2015-05-04 10:46:42,577 - u"Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64']" {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java'}
2015-05-04 10:46:42,633 - Skipping u"Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64']" due to not_if
2015-05-04 10:46:42,633 - u"Execute['('chgrp', '-R', u'hadoop', u'/usr/jdk64/jdk1.7.0_67')']" {'sudo': True}
2015-05-04 10:46:42,689 - u"Execute['('chown', '-R', 'root', u'/usr/jdk64/jdk1.7.0_67')']" {'sudo': True}
2015-05-04 10:46:43,058 - u"Package['hadoop_2_2_*']" {}
2015-05-04 10:46:43,491 - Installing package hadoop_2_2_* ('/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_2_*'')
2015-05-04 10:46:46,820 - Error while executing command 'install':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 36, in install
self.install_packages(env, params.exclude_packages)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 289, in install_packages
Package(name)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 43, in action_install
self.install_package(package_name, self.resource.use_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
return function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 82, in checked_call
return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout, path, sudo, on_new_line)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 199, in _call
raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install 'hadoop_2_2_*'' returned 1. Error: Package: hadoop_2_2_4_2_2-hdfs-fuse-2.6.0.2.2.4.2-2.el6.x86_64 (HDP-2.2)
Requires: fuse
Error: Package: hadoop_2_2_4_2_2-2.6.0.2.2.4.2-2.el6.x86_64 (HDP-2.2)
Requires: nc
You could try using --skip-broken to work around the problem
You could try running: rpm -Va --nofiles --nodigest
2015-05-04 10:46:46,872 - Command: /usr/bin/hdp-select status hadoop-hdfs-datanode > /tmp/tmpR384ET
Output: hadoop-hdfs-datanode - None

What is the reason for this issue? Oh, and all the nodes are connected to the internet.
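
If I read the yum output right, the install dies because yum cannot resolve the fuse and nc dependencies of the hadoop_2_2_* packages, which would point at the CentOS base repos rather than the HDP ones. A quick check I plan to run on each node (untested sketch):

    # the CentOS base repo should be listed alongside HDP-2.2 and HDP-UTILS
    yum repolist enabled

    # try installing the two dependencies yum flagged as missing
    yum -y install fuse nc

If fuse and nc cannot be installed on their own either, the problem is repo reachability (or the proxy), not anything HDP-specific.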

