
History Server and HiveServer2 will not start

I am running HDP 2.3 with Ambari 2.1 on six servers running CentOS 7.1. Everything is installed, but when I try to start the History Server and HiveServer2, each fails with an almost identical error message:

resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T /usr/hdp/2.3.0.0-2557/hadoop/mapreduce.tar.gz '"'"'http://cshadoop.boisestate.edu:50070/webhdfs/v1/hdp/apps/2.3.0.0-2557/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'"'"' 1>/tmp/tmpgSzsFx 2>/tmp/tmpwZ_9y9'' returned 55.

The only difference between the two is the names of the temporary files in /tmp. I am fairly new to this, so please let me know what additional information would be useful.
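For reference, curl exit code 55 is CURLE_SEND_ERROR ("failed sending network data"), so the upload breaks while the file body is being sent rather than on the initial connection. Below is a minimal sketch of how the same WebHDFS upload can be attempted by hand, reusing the host, path, and query parameters from the error above; the <datanode-location> in step 2 is a placeholder for whatever URL the NameNode returns in its Location header:

# Step 1: ask the NameNode where to write. WebHDFS CREATE is a two-step
# protocol, so this should answer with a 307 redirect whose Location
# header points at a DataNode.
curl -i -X PUT "http://cshadoop.boisestate.edu:50070/webhdfs/v1/hdp/apps/2.3.0.0-2557/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444"

# Step 2: send the tarball to the DataNode URL from that Location header.
# <datanode-location> is a placeholder, not a real address.
curl -i -X PUT -T /usr/hdp/2.3.0.0-2557/hadoop/mapreduce.tar.gz "<datanode-location>"

If step 1 succeeds but step 2 hangs or fails, the NameNode side is fine and the DataNode named in the redirect is the hop to investigate (DNS resolution and firewall rules to its WebHDFS port, typically 50075).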

Below is the full dump from the History Server error:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 168, in <module>
    HistoryServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 95, in start
    resource_created = copy_to_hdfs("mapreduce", params.user_group, params.hdfs_user)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/copy_tarball.py", line 193, in copy_to_hdfs
    mode=0444
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 391, in action_create_on_execute
    self.action_delayed("create")
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 388, in action_delayed
    self.get_hdfs_resource_executor().action_delayed(action_name, self)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 247, in action_delayed
    self._create_resource()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 261, in _create_resource
    self._create_file(self.main_resource.resource.target, source=self.main_resource.resource.source, mode=self.mode)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 311, in _create_file
    self.util.run_command(target, 'CREATE', method='PUT', overwrite=True, assertable_result=False, file_to_put=source, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 189, in run_command
    _, out, err = get_user_call_output(cmd, user=self.run_user, logoutput=self.logoutput, quiet=False)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_user_call_output.py", line 49, in get_user_call_output
    func_result = func(shell.as_user(command_string, user), **call_kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T /usr/hdp/2.3.0.0-2557/hadoop/mapreduce.tar.gz '"'"'http://cshadoop.boisestate.edu:50070/webhdfs/v1/hdp/apps/2.3.0.0-2557/mapreduce/mapreduce.tar.gz?op=CREATE&user.name=hdfs&overwrite=True&permission=444'"'"' 1>/tmp/tmpironHk 2>/tmp/tmpscaJPq'' returned 55.

