
Installation of Spark component fails


Hi,

I have an 8-node Ambari cluster (Ambari 2.1.0, HDP-2.3.0.0-2557):

1 NameNode, 1 SecondaryNameNode and 6 DataNodes.

The Spark service is installed on the 6 DataNodes.

The next step was to install Spark (1.3.1.2.3) on the NameNode host as well. For that I used the Ambari Web UI: Hosts -> NameNode host -> Add component.

The installation did not progress past 35%, so I stopped it.

Now I want to reinstall the Spark component, but it is not possible to remove the Spark component from the NameNode host using the Web UI.
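
From what I have read, a host component that is stuck like this can usually be removed through the Ambari REST API rather than the Web UI. Below is only a sketch of what I am planning to try; the cluster name (MYCLUSTER), the NameNode hostname (namenode-host) and the SPARK_CLIENT component name are my assumptions, not values taken from the failed run:

# try to put the component into the INSTALLED (stopped) state first
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"HostRoles": {"state": "INSTALLED"}}' \
  http://tolkien1.ielc:8080/api/v1/clusters/MYCLUSTER/hosts/namenode-host/host_components/SPARK_CLIENT

# then delete it from that host so it can be re-added cleanly
curl -u admin:admin -H 'X-Requested-By: ambari' -X DELETE \
  http://tolkien1.ielc:8080/api/v1/clusters/MYCLUSTER/hosts/namenode-host/host_components/SPARK_CLIENT

Is that the right way to get rid of the half-installed component, or is there a cleaner option I am missing?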

 

The log from the failed installation:
stdout:   /var/lib/ambari-agent/data/output-1245.txt

2015-11-11 09:15:12,699 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-11-11 09:15:12,700 - File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-8.zip'] {'content': DownloadSource('http://tolkien1.ielc:8080/resources//jce_policy-8.zip')}
2015-11-11 09:15:12,701 - Not downloading the file from http://tolkien1.ielc:8080/resources//jce_policy-8.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-8.zip already exists
2015-11-11 09:15:12,701 - Group['spark'] {'ignore_failures': False}
2015-11-11 09:15:12,701 - Group['hadoop'] {'ignore_failures': False}
2015-11-11 09:15:12,701 - Group['users'] {'ignore_failures': False}
2015-11-11 09:15:12,701 - Group['knox'] {'ignore_failures': False}
2015-11-11 09:15:12,702 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,702 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,702 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,703 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-11-11 09:15:12,703 - User['atlas'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,703 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,704 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-11-11 09:15:12,704 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-11-11 09:15:12,704 - User['accumulo'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,705 - User['mahout'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,705 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,705 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['users']}
2015-11-11 09:15:12,706 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,706 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,706 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,707 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,707 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,708 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,708 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,708 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,709 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': ['hadoop']}
2015-11-11 09:15:12,709 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-11-11 09:15:12,710 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-11-11 09:15:12,712 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2015-11-11 09:15:12,713 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2015-11-11 09:15:12,713 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-11-11 09:15:12,714 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2015-11-11 09:15:12,716 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2015-11-11 09:15:12,717 - Group['hdfs'] {'ignore_failures': False}
2015-11-11 09:15:12,717 - User['hdfs'] {'ignore_failures': False, 'groups': ['hadoop', 'hdfs']}
2015-11-11 09:15:12,717 - Directory['/etc/hadoop'] {'mode': 0755}
2015-11-11 09:15:12,727 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2015-11-11 09:15:12,736 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.3.0.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-11-11 09:15:12,764 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2015-11-11 09:15:12,783 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-11-11 09:15:12,785 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
2015-11-11 09:15:12,786 - Package['unzip'] {}
2015-11-11 09:15:12,845 - Skipping installation of existing package unzip
2015-11-11 09:15:12,845 - Package['curl'] {}
2015-11-11 09:15:12,850 - Skipping installation of existing package curl
2015-11-11 09:15:12,850 - Package['hdp-select'] {}
2015-11-11 09:15:12,856 - Skipping installation of existing package hdp-select
2015-11-11 09:15:12,856 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-11-11 09:15:12,856 - File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] {'content': DownloadSource('http://tolkien1.ielc:8080/resources//jdk-8u40-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'}
2015-11-11 09:15:12,858 - Skipping File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] due to not_if
2015-11-11 09:15:12,859 - Directory['/usr/jdk64'] {}
2015-11-11 09:15:12,859 - Execute['('chmod', 'a+x', '/usr/jdk64')'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java', 'sudo': True}
2015-11-11 09:15:12,861 - Skipping Execute['('chmod', 'a+x', '/usr/jdk64')'] due to not_if
2015-11-11 09:15:12,861 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java'}
2015-11-11 09:15:12,863 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] due to not_if
2015-11-11 09:15:12,863 - File['/usr/jdk64/jdk1.8.0_40/bin/java'] {'mode': 0755, 'cd_access': 'a'}
2015-11-11 09:15:12,864 - Execute['('chgrp', '-R', 'hadoop', '/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2015-11-11 09:15:13,168 - Execute['('chown', '-R', 'root', '/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2015-11-11 09:15:13,253 - Package['spark_2_3_*'] {}
2015-11-11 09:15:13,317 - Installing package spark_2_3_* ('/usr/bin/yum -d 0 -e 0 -y install 'spark_2_3_*'')
Command aborted. Aborted by user
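
Since the install was aborted in the middle of the yum transaction, I assume I should also check on the NameNode whether a partial spark package was left behind before retrying. The commands below are just my guess at the usual cleanup, not something Ambari told me to run:

# check whether any spark_2_3_* package got (partially) installed
yum list installed 'spark_2_3_*'

# if so, remove it and clear the yum caches before retrying from Ambari
yum -y remove 'spark_2_3_*'
yum clean all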

 

