Channel: Hortonworks » All Replies

Reply To: Unable to mount HDFS file system on remote Linux server


I am following the config guide you sent, but unfortunately I am stuck at the first step.

Inside /var/lib/ambari-agent/cache/stacks/HDP I see multiple versions, and most of them contain a 'core-site.xml' file.
Which version should I select?


The guide says: "The user running the NFS gateway must be able to proxy all of the users using the NFS mounts. For instance, if user 'nfsserver' is running the gateway, and users belonging to the groups 'nfs-users1' and 'nfs-users2' use the NFS mounts, then in the core-site.xml file on the NameNode, the following must be set (NOTE: replace 'nfsserver' with the user name starting the gateway in your cluster)."

I am using the root user. I don't see any 'nfsserver' user under /etc/passwd, and there is no 'nfs' group. Should I create an nfs group and make root a member of that group?
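For reference, my understanding of what that proxy-user setting looks like in core-site.xml on the NameNode, assuming the gateway is started as root the way I do it (the wildcard values are just the most permissive example, not necessarily what you would recommend):

<property>
  <name>hadoop.proxyuser.root.groups</name>
  <!-- groups whose members the gateway user is allowed to proxy; '*' means all groups -->
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <!-- hosts from which the gateway user is allowed to proxy; '*' means all hosts -->
  <value>*</value>
</property>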

Please suggest.

Thanks,
Amey.


Reply To: Unable to mount HDFS file system on remote Linux server


Also, I am not able to find the 'core-site.xml' file under /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/HDFS/.

HBase ZooKeeper connection


Hi,
I have installed HDP on a cluster, but when I run a MapReduce job to load data into HBase it gives me the following error. Please check this error.

INFO client.ZooKeeperRegistry: ClusterId read in ZooKeeper is null
Exception in thread "main" org.apache.hadoop.hbase.client.NoServerForRegionException: Unable to find region for after 35 tries.
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1134)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1054)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1011)
at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:326)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:192)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:150)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
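For context, the failure happens as soon as the job opens the table; a stripped-down sketch of roughly what that part of my client code does (the quorum hosts and table name here are placeholders, not my real ones):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class HBaseConnCheck {
    public static void main(String[] args) throws Exception {
        // Picks up hbase-site.xml from the classpath, then overrides the ZooKeeper settings explicitly.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "zk1.example.com,zk2.example.com,zk3.example.com");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        // HDP normally registers HBase under /hbase-unsecure (plain Apache HBase uses /hbase).
        conf.set("zookeeper.znode.parent", "/hbase-unsecure");
        // Opening the table triggers the region lookup that fails with NoServerForRegionException above.
        HTable table = new HTable(conf, "my_table");
        System.out.println("Connected to HBase and located the table.");
        table.close();
    }
}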

Reply To: Hive Mysql Server stopping without error message


Hmm, interesting. After finishing most of my tests I actually reset Ambari, reinstalled the cluster, and put the Hive MySQL server on another host. Guess what: I got the same thing again. It starts and then, without any error message, it stops after a few seconds. Now I am going to test whether Hive works anyway.

But I found this in my hiveserver2 log, which I think is strange. I googled it and it says this was a bug that was fixed in 1.6.0, and I am using 1.6.1:

2014-09-26 15:24:22,538 INFO [Thread-6]: thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(88)) - ThriftBinaryCLIService listening on 0.0.0.0/0.0.0.0:10000
2014-09-26 15:24:28,836 ERROR [pool-5-thread-1]: server.TThreadPoolServer (TThreadPoolServer.java:run(215)) - Error occurred during processing of message.
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:189)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:178)
at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
… 4 more
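
To test whether Hive works anyway, I will probably just try a direct connection from the sandbox shell, something along these lines (host and port are taken from the listening line in the log above; the user name is only a guess):

beeline -u jdbc:hive2://localhost:10000 -n hive -e "show databases;"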

Edit the HCat metadata record with HUE?


Using Hortonworks Sandbox 2.1.

How do I edit HCatalog table metadata with HUE? I want to add a definition of the data element in the "Comment" field that HUE displays when viewing column metadata for a table.
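
If there is no way to edit it in HUE itself, I assume the comment could also be set from Hive and would then show up in HUE's column metadata view; something along these lines, where the table, column, and type are placeholders:

ALTER TABLE my_table CHANGE COLUMN my_col my_col STRING COMMENT 'definition of the data element';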

Thank you.

path for HADOOP_STREAMING jar file


What is the correct folder location for the HADOOP_STREAMING jar file?

My hadoop-streaming.jar file is in /usr/lib/hadoop-mapreduce, and I am getting an error while trying to execute a simple MapReduce command from RStudio with the rmr2 library: "Please make sure that the env. variable HADOOP_STREAMING is set".
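
From the wording of the error it sounds like rmr2 wants the environment variable itself rather than the jar sitting in any particular folder; what I plan to try in RStudio before loading the library, assuming the usual HDP location of the hadoop launcher:

Sys.setenv(HADOOP_CMD = "/usr/bin/hadoop")                                        # hadoop binary used by rmr2
Sys.setenv(HADOOP_STREAMING = "/usr/lib/hadoop-mapreduce/hadoop-streaming.jar")   # the jar location mentioned above
library(rmr2)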


Hortonworks Sandbox on Linux…


Is there any way to run this on Linux, or is there some other Hadoop distribution that anyone knows of that does run on Linux? Thanks!


Reply To: Load xls file into Hive Table


Hi all, same issue. I uploaded an XLS file using HCatalog -> Create table using Excel file -> XLS file.
The table preview does not show NULL values, but once the table is created and I check it, it shows NULL values.
The Excel file has only 500 rows, but the table shows 6000 rows with many NULL values, even though there are no NULLs in the Excel file. Does anybody know what I should do?

Unable to download Hadoop Sandbox


Reply To: Parallel Import fails going to Oracle RAC


No, it's not that simple. We have the right details, and we do indeed get connections to the DB using the IP. We can see the metadata connections pass through just fine, for example the query with WHERE (1=2), or whatever it is, that gets the column mapping info. The problem happens when it goes back to do the actual work of getting the data.

We may have found a workaround: using the physical IP address of one node (of 8) rather than the RAC IPs or virtual IPs (this is Oracle Exadata). It's not great, since we lose the load-balancing aspects of RAC.
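
For anyone trying the same workaround, the only change is in the JDBC connect string; a sketch of the shape of such a command, with the host, service name, table, credentials, and paths as placeholders:

sqoop import \
  --connect jdbc:oracle:thin:@//node3-physical.example.com:1521/ORCLSVC \
  --username scott \
  --password-file /user/etl/.ora_pass \
  --table SALES.ORDERS \
  --num-mappers 8 \
  --target-dir /data/orders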

dfs.support.append


Hello
I am using the HDP Sandbox 2.1 (Linux).
I am trying to append data to a file.
[root@sandbox conf]# pwd
/etc/hadoop/conf

The hdfs-site.xml value for this parameter seems to be true, which is the default value:
<property>
<name>dfs.support.append</name>
<value>true</value>
</property>
but the following calls always return false/false:

if (this._hdfsFile == null)
    this._hdfsFile = FileSystem.get(URI.create(this._directory), new Configuration());

boolean flag1 = Boolean.getBoolean(this._hdfsFile.getConf().get("dfs.support.append"));
System.out.println("dfs.support.append is set to be " + flag1);

boolean flag2 = this._hdfsFile.getConf().getBoolean("hdfs.append.support", false);
System.out.println("hdfs.append.support is set to be " + flag2);
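
One thing I am not sure about: I read that Boolean.getBoolean(String) looks up a JVM system property with that name rather than parsing the string passed to it, and that "hdfs.append.support" is not an actual HDFS key, so maybe my check itself is misleading. Reading the value straight from the configuration would presumably look like this:

// read the actual configured value; defaults to true when the property is absent
boolean appendEnabled = this._hdfsFile.getConf().getBoolean("dfs.support.append", true);
System.out.println("dfs.support.append is set to be " + appendEnabled);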

Unless I set _hdfsFile.setReplication(path, (short) 1);
I cannot append to the file.

What am I doing wrong?

Can somebody help?
Thanks

Reply To: Hortonworks Connector for Teradata compatibility with Apache Hadoop 2.x


The Hortonworks Hadoop version is mostly what is shipped in Apache, with a few patches on top for issues that are in progress, with the patches posted to the JIRA if any (although I have not tested it with a plain Apache installation). TDCH has three versions available: one for the HDP 1.3.x versions, which is for the Apache Hadoop 1 line; one for HDP 2.x, which is for the Hadoop 2.2.0 version; and one for HDP 2.1, which is based on Hadoop 2.4.0.

For the Sqoop interface, you need to get the Sqoop connector that Hortonworks built on top of TDCH. The documentation that ships with the Hortonworks Teradata connector describes how to implement incremental imports.

Venkat


Reply To: Sandbox 2.1 ODBC Not Working


Resolved for me by adding a new GRANT permission (as others said in this thread):

1) Logged into the Sandbox VM (ssh root@127.0.0.1 -p 2222)
2) Launched Hive (hive)
3) Added a new GRANT permission for hue (grant SELECT on table tweetsbi to user hue;)
4) Re-tried access to the tweetsbi table from Excel: it works!

Reply To: Sandbox 2.1 ODBC Not Working


Thanks for the responses, it looks like the issue is resolved.

Uninstall zookeeper on a node


I need to move my current ZooKeeper instances to my new nodes, and I can do that with Ambari. But I don't see an option to remove ZooKeeper from the old nodes, which are used for other services. In my test environment I tried to uninstall the rpm (CentOS), and it also uninstalls some of the Hadoop packages. What is the best way to uninstall the ZooKeeper service without removing other services/rpms?
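
One route I am considering, but have not tried yet, is removing just the ZooKeeper host component through the Ambari REST API instead of touching the rpms; a sketch, assuming a cluster named 'mycluster', an Ambari server at ambari.example.com with default admin credentials, and 'oldnode.example.com' as the node to clean up:

# stop the ZooKeeper server component on the old node
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"HostRoles": {"state": "INSTALLED"}}' \
  http://ambari.example.com:8080/api/v1/clusters/mycluster/hosts/oldnode.example.com/host_components/ZOOKEEPER_SERVER

# then delete the component from that host
curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE \
  http://ambari.example.com:8080/api/v1/clusters/mycluster/hosts/oldnode.example.com/host_components/ZOOKEEPER_SERVER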

Thanks in advance for all your assistance.

Reply To: Uninstall zookeeper on a node


Hi Elvin,

I haven't tried it myself, so I can't speak to the outcome, but you can try rpm -e zookeeper-3.4.5.2.1.5.0-695.el6.noarch.

Reply To: Hortonworks Connector for Teradata compatibility with Apache Hadoop 2.x


Thanks a lot, Venkat, for the information.

Now I will use Apache Hadoop 2.4.0 and the Hortonworks Connector for Teradata v1.3 for HDP 2.1.
I will refer to the incremental import section of the documentation.

Just out of curiosity, I wanted to know whether the standard Sqoop command-line options for incremental import are supported as well in this case, or whether I need to follow the approach described in the Hortonworks Connector documentation.
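
By the standard options I mean the usual incremental-import flags; roughly the kind of command I have in mind, with the connection details, table, check column, and paths as placeholders:

sqoop import \
  --connect jdbc:teradata://td.example.com/DATABASE=sales \
  --username dbc \
  --password-file /user/etl/.td_pass \
  --table ORDERS \
  --incremental append \
  --check-column ORDER_ID \
  --last-value 100000 \
  --target-dir /data/orders_increment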

-Nirmal
