<pre class="stderr">Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 185, in <module>
    HiveServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 83, in start
    self.configure(env) # FOR SECURITY
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 54, in configure
    hive(name='hiveserver2')
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 127, in hive
    mode=params.webhcat_hdfs_user_mode
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 391, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 388, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 244, in action_delayed
    self._assert_valid()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 228, in _assert_valid
    self.target_status = self._get_file_status(target)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 284, in _get_file_status
    list_status = self.util.run_command(target, 'GETFILESTATUS', method='GET', ignore_status_codes=['404'], assertable_result=False)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 189, in run_command
    _, out, err = get_user_call_output(cmd, user=self.run_user, logoutput=self.logoutput, quiet=False)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_user_call_output.py", line 49, in get_user_call_output
    func_result = func(shell.as_user(command_string, user), **call_kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://namenode1.internal.dezyre.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpLqxvZK 2>/tmp/tmpDnkSA1'' returned 7.
</pre>
<h5><span id="i18n-187">stdout</span>: <span class="muted"> /var/lib/ambari-agent/data/output-1107.txt </span></h5>
<pre class="stdout">2015-10-14 12:45:25,375 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2015-10-14 12:45:25,378 - File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-8.zip'] {'content': DownloadSource('http://gateway1.internal.dezyre.com:8080/resources//jce_policy-8.zip')}
2015-10-14 12:45:25,378 - Not downloading the file from http://gateway1.internal.dezyre.com:8080/resources//jce_policy-8.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-8.zip already exists
2015-10-14 12:45:25,378 - Group['hadoop'] {'ignore_failures': False}
2015-10-14 12:45:25,380 - Group['users'] {'ignore_failures': False}
2015-10-14 12:45:25,380 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,381 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,381 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,382 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-10-14 12:45:25,383 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,383 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-10-14 12:45:25,384 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,385 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,385 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,386 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,387 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-10-14 12:45:25,389 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-10-14 12:45:25,390 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-10-14 12:45:25,415 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2015-10-14 12:45:25,416 - Group['hdfs'] {'ignore_failures': False}
2015-10-14 12:45:25,417 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2015-10-14 12:45:25,417 - Directory['/etc/hadoop'] {'mode': 0755}
2015-10-14 12:45:25,438 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2015-10-14 12:45:25,454 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2015-10-14 12:45:25,484 - Skipping Execute[('setenforce', '0')] due to not_if
2015-10-14 12:45:25,485 - Directory['/var/log/hadoop'] {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
2015-10-14 12:45:25,489 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
2015-10-14 12:45:25,490 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
2015-10-14 12:45:25,499 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2015-10-14 12:45:25,502 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2015-10-14 12:45:25,503 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2015-10-14 12:45:25,514 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2015-10-14 12:45:25,515 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2015-10-14 12:45:25,516 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2015-10-14 12:45:25,523 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2015-10-14 12:45:25,545 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2015-10-14 12:45:25,910 - HdfsResource['/user/hcat'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://namenode1.internal.dezyre.com:8020', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 'hdfs', 'owner': 'hcat', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'mode': 0755}
2015-10-14 12:45:25,913 - checked_call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://namenode1.internal.dezyre.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmpLqxvZK 2>/tmp/tmpDnkSA1''] {'logoutput': None, 'quiet': False}
</pre>
Error in Running Hive Server in Ambari
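The failure above is not a Hive problem per se: "returned 7" is curl's exit code, and curl exit code 7 means "failed to connect to host". Ambari's WebHDFS check could not reach the NameNode HTTP endpoint at namenode1.internal.dezyre.com:50070, which usually means the NameNode is down, still starting up, or port 50070 is blocked. A minimal diagnostic sketch (the `explain_curl_exit` helper is illustrative, not part of Ambari; the hostnames come from the log above):

```shell
#!/bin/sh
# Map the curl exit codes most often seen in Ambari WebHDFS checks to their
# meaning. The code 7 in the error above indicates a connection failure,
# not an HTTP error (an HTTP 4xx/5xx would still give curl exit code 0).
explain_curl_exit() {
  case "$1" in
    0)  echo "success (inspect the HTTP status code instead)" ;;
    6)  echo "could not resolve host (DNS problem)" ;;
    7)  echo "failed to connect to host (service down or port blocked)" ;;
    28) echo "operation timed out" ;;
    *)  echo "see the EXIT CODES section of 'man curl' for code $1" ;;
  esac
}

explain_curl_exit 7
# prints: failed to connect to host (service down or port blocked)

# On the cluster itself, the checks would look like (not run here):
#   nc -z -w 5 namenode1.internal.dezyre.com 50070   # is the port open?
#   curl -sS -w '%{http_code}\n' \
#     "http://namenode1.internal.dezyre.com:50070/webhdfs/v1/user/hcat?op=GETFILESTATUS&user.name=hdfs"
```

If the port probe fails, start the NameNode (and wait for it to leave safe mode) before retrying the HiveServer2 start; if it succeeds but the WebHDFS call still fails, check that `dfs.webhdfs.enabled` is true and that no firewall sits between the HiveServer2 host and the NameNode.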