Reply To: Upgrade Hive to 0.13.1?
Yes, I’ve worked around it. I ultimately had to implement the FileSplit and associated interfaces anyway due to other Hive dependencies. While I’ve not yet deployed to the sandbox, I have successfully...
Reply To: Install/Startup Guide for VirtualBox
Hi, I just installed VirtualBox and started the sandbox. I can access 127.0.0.1:8888, but accessing the sandbox HDP web interface via http://127.0.0.1:8000 fails with “connection failed / Fehler:... (“Fehler” is German for “error”)
Reply To: HDP 2.1.3 Accumulo fails to install on Ubuntu 14.04
Just to share it with others, here are the workaround steps: 1) wget http://launchpadlibrarian.net/36005381/chkconfig_11.0-79.1-2_all.deb 2) dpkg --install chkconfig-11.0-79.1 3) install accumulo
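Spelled out as commands, assuming a root shell and that accumulo is installable from a configured HDP apt repository (the full .deb filename is taken from the launchpad URL above):

```shell
# Workaround sketch: Accumulo’s init scripts expect chkconfig, which
# Ubuntu 14.04 apparently no longer ships, so install the old package
# from launchpad first, then retry the Accumulo install.
wget http://launchpadlibrarian.net/36005381/chkconfig_11.0-79.1-2_all.deb
dpkg --install chkconfig_11.0-79.1-2_all.deb
apt-get install accumulo
```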
Browser times out
I have installed the Sandbox into VMPlayer. It appears to boot and I can connect to the VM with putty. When I try to connect with the Firefox browser by using “http://192.168.136.128”, the browser...
Reply To: How can I install HDP on Ubuntu?
Look at the Ubuntu Juju deployment of the HDP 2.1.x stack on Ubuntu 14.04 or 12.04. A few examples: https://jujucharms.com/trusty/hdp-zookeeper-1/...
Browser Connection Reset
Installed VMware Player, set the Network Adapter to Host-Only; device status is connected / connect at power-on. Tried three different browsers using the provided IP address http://192.168.85.128. Also...
HBase not running
It appears HBase isn’t running. From Ambari: sandbox:60010 returns “This page can’t be displayed”. HBase errors: HBase Master process down, connection refused, percent RegionServers down. I’ve looked at...
Reply To: Browser Connection Reset
Here is the solution that worked for me. I used the VMware Network Editor and clicked the button at the bottom left to “restore default”. I had to power off the VM. When I powered the VM back up, the IP address...
Reply To: Hive Mysql Server stopping without error message
I used Ambari to set up a single-node cluster, so everything is on the same host ^^. I have already run a big test with Hive, including the interface, and a load test with about 100 GB of data. I can’t see any...
Reply To: ClusterID mismatch for namenode and datanodes in fully distributed...
This can happen if you’re testing out one of the examples, or somehow accidentally ended up with a different clusterID, and have not set a specific directory to store the data. Hadoop by default puts the data and...
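As a hedged sketch of the usual cleanup: the paths below are the defaults under hadoop.tmp.dir in /tmp and are assumptions; check dfs.namenode.name.dir and dfs.datanode.data.dir in hdfs-site.xml first.

```shell
# Option 1: wipe the datanode’s storage so it re-registers with the
# namenode’s current clusterID. This DESTROYS the blocks on that node.
rm -rf /tmp/hadoop-*/dfs/data/current
# Option 2: keep the blocks and fix the ID instead. Read the clusterID
# from the namenode’s VERSION file…
grep clusterID /tmp/hadoop-*/dfs/name/current/VERSION
# …then edit the datanode’s VERSION file on each datanode to match.
vi /tmp/hadoop-*/dfs/data/current/VERSION
```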
multiple networks problem
Hello, I currently have a 7-node cluster running successfully with Ambari. The only hiccup I have run into is the network. I have all 7 nodes connected to a 10 Gb/s switch with IPs in the 10.0.0.0...
Reply To: multiple networks problem
Also, my /etc/hosts file has all the hostnames listed against their respective 10-network IPs.
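For illustration, such a hosts file might look like the following (hostnames and addresses here are made up; the same entries would appear on all 7 nodes):

```shell
# Hypothetical /etc/hosts entries for the 10-network; keeping them
# identical on every node lets each daemon resolve its peers over
# the 10 Gb switch rather than the other interface.
cat >> /etc/hosts <<'EOF'
10.0.0.1  hdpmaster1.example.com  hdpmaster1
10.0.0.2  hdpworker1.example.com  hdpworker1
10.0.0.3  hdpworker2.example.com  hdpworker2
EOF
```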
Hue 2.3.1-695 Internal error processing fetch
Hello, I have installed a new cluster with HDP 2.1.5. All works fine, except that every now and then, when I click on a table in the Hue Beeswax app to view its structure, I get an error “Internal error...
Mahout Log4j Configuration
Hi, I am unable to configure log4j for Mahout jobs. I tried placing a log4j.properties file in the mahout-0.9.0.2.1.1.0-1621\bin folder, in the mahout-0.9.0.2.1.1.0-1621\src\conf folder, and in the root folder of Mahout....
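Two hedged options to try, depending on how the bin/mahout launcher script in your build resolves configuration (MAHOUT_CONF_DIR and MAHOUT_OPTS are the environment variables it usually reads; the paths and the seq2sparse job below are only examples):

```shell
# a) put log4j.properties in a conf dir the launcher prepends to the
#    classpath, so log4j picks it up automatically:
export MAHOUT_CONF_DIR=/home/user/mahout-conf
# b) or point the JVM at the file explicitly via extra JVM options
#    (-Dlog4j.configuration is the standard log4j 1.x override):
export MAHOUT_OPTS="-Dlog4j.configuration=file:///home/user/mahout-conf/log4j.properties"
mahout seq2sparse -i input-seq -o output-vectors
```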
Name node out of space: need forced rebalance or disk quota
My HDFS set-up: I have one server (call it server_home), which is used to interface with HDFS via the name node machine (call it server0). Server0 is also a data node. Whenever I upload data into HDFS...
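The two options in the title map onto standard HDFS commands; a minimal sketch, assuming the hdfs superuser and illustrative numbers and paths:

```shell
# Forced rebalance: move blocks until every datanode is within 5% of
# the cluster-average disk usage, so server0 drains toward the average.
sudo -u hdfs hdfs balancer -threshold 5
# Directory space quota: cap the raw HDFS space a directory may consume
# (path and size are examples only).
sudo -u hdfs hdfs dfsadmin -setSpaceQuota 500g /user/someuser
```

Note that a space quota limits a directory, not a disk; to keep server0’s local disk from filling, the dfs.datanode.du.reserved property (bytes reserved for non-HDFS use on each datanode) may be the more direct knob.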
what is the daemon for a passive name node
I have installed Hadoop 2.4.1. When I start HDFS and list the running daemons, it shows NameNode, SecondaryNameNode, DataNode. Does the SecondaryNameNode daemon refer to the passive node which is used for...
how to convert bytearray to an object in HBase
I’m learning HBase by practising. I have figured out how to get a byte array from a customWritableObject. I created the byte array using byte[] byteArray = WritableUtils.toByteArray(customWritableObject);...
hadoop-daemon.sh start datanode
Hello everyone, I installed my Hadoop cluster the “Manual RPMs” way. But when I start the cluster, I can only run “hadoop-daemon.sh start datanode” on each node, one by one. Is there a script that can start them in batch? And the...
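Stock Hadoop ships sbin/hadoop-daemons.sh (note the plural), which runs hadoop-daemon.sh over every host in the slaves file via ssh. A minimal hand-rolled sketch of the same idea, assuming a plain-text file of DataNode hostnames, passwordless ssh, and the HDP RPM path for hadoop-daemon.sh (set RUN=echo for a dry run that only prints the commands):

```shell
# Start the DataNode daemon on every host listed in a slaves file.
# The hadoop-daemon.sh path below is an assumption; adjust as needed.
start_datanodes() {
  slaves_file=$1
  run=${RUN:-ssh}          # RUN=echo -> dry run, just print commands
  while read -r host; do   # one hostname per line; skip blank lines
    [ -n "$host" ] && $run "$host" "/usr/lib/hadoop/sbin/hadoop-daemon.sh start datanode"
  done < "$slaves_file"
}
# Example: start_datanodes /etc/hadoop/conf/slaves
```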