Hey there.
I have HDP 2.2 installed via Ambari on CentOS 6.5.
Now I have a problem with my Spark Hive Thrift Server. I started the service with ./spark-class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
It starts and listens on port 10000, but after about 10 seconds I get the following error message:
Exception in thread "pool-10-thread-1" java.lang.OutOfMemoryError: Java heap space
at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:181)
at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)
at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:189)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
I get this error message every 20 seconds.
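For reference, a rough sketch of how the Thrift Server could be started with a larger heap, assuming the bundled sbin/start-thriftserver.sh script is present in this Spark build and forwards the usual spark-submit options (the paths and the memory value below are assumptions, not something I have verified):

# assumed Spark client location on an HDP node
cd /usr/hdp/current/spark-client
# start-thriftserver.sh forwards spark-submit options, so --driver-memory
# should raise the heap of the JVM that runs HiveThriftServer2
./sbin/start-thriftserver.sh \
  --driver-memory 2g \
  --hiveconf hive.server2.thrift.port=10000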
After this error message, port 10000 is still listening, but when I connect to the Thrift Server with Beeline, I get some very confusing results:
Example: I have two tables in Hive, let's say test_1 and test_2. If I try to list these tables in Beeline with show tables,
I get no result. If I create a new table through Beeline, it works, and I can then find this table with Beeline, but not with Hive. I have no clue how this can happen. I also have two databases in Hive, but I can only see one of them in Beeline. I hope all of this is caused by the error message above.
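Maybe related: I am not sure whether the Thrift Server is even pointed at the same metastore as Hive. As far as I know, Spark SQL falls back to a local embedded Derby metastore when it cannot find hive-site.xml on its classpath, which would explain why Beeline and Hive see different tables and databases. A sketch of what could be checked, with assumed HDP paths:

# copy the cluster's Hive config into Spark's conf dir (paths assumed)
cp /etc/hive/conf/hive-site.xml /usr/hdp/current/spark-client/conf/
# then restart the Thrift Server so it picks up the real Hive metastore
/usr/hdp/current/spark-client/sbin/start-thriftserver.sh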
Can anyone please help me?