Channel: Hortonworks » All Replies

Storm-Kafka Hortonworks tutorial for real-time data streaming


Any ideas are welcome after reading the problem statement below.

Background: publishing messages using Apache Kafka. The Kafka broker is running. Kafka producers are the applications that create messages and publish them to the Kafka broker for further consumption. Therefore, for the Kafka consumer to consume data, the Kafka topic needs to be created before the producer and consumer start publishing and consuming messages.
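For reference, the topic can be created from the Kafka CLI before either side starts. This is a sketch: the install path below assumes a typical HDP layout, and the topic name truckevent is taken from the error logs quoted later in this post.

```shell
# Create the topic before starting the producer, consumer, or Storm topology.
# The path below assumes a standard HDP install; adjust for your environment.
KAFKA_BIN=/usr/hdp/current/kafka-broker/bin

$KAFKA_BIN/kafka-topics.sh --create \
    --zookeeper localhost:2181 \
    --replication-factor 1 \
    --partitions 1 \
    --topic truckevent

# Verify the topic and its partition metadata were registered in Zookeeper.
$KAFKA_BIN/kafka-topics.sh --describe \
    --zookeeper localhost:2181 \
    --topic truckevent
```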

Kafka was tested successfully: the Kafka consumer was able to consume data from the Kafka topic and display the results.

Before starting the Storm topology, stop the Kafka consumer so that the Storm spout can work on the source of data streams from the Kafka topics.

Real-time processing of the data using Apache Storm: with the Storm topology created, the Storm spout works on the source of the data streams, which means the spout reads data from the Kafka topics. At the other end, the spout passes the streams of data to the Storm bolts, which process and write the data into HDFS (file format) and HBase (DB format) for storage.
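As a sketch of the deployment step described above, the packaged topology would be submitted with the storm CLI. The jar path, main class, and topology name here are hypothetical placeholders, not the tutorial's actual names:

```shell
# Submit the topology (jar path and main class are hypothetical placeholders).
storm jar target/truck-event-topology.jar \
    com.example.TruckEventTopology truckevent-topology

# List running topologies to confirm the submission succeeded.
storm list
```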

The Zookeeper znode is missing its last child znode. From the log file: 2015-05-20 04:22:43 b.s.util [ERROR] Async loop died! java.lang.RuntimeException: java.lang.RuntimeException: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /brokers/topics/truckevent/partitions
Zookeeper is the coordination service for distributed applications. From the Zookeeper client we can always see /brokers/topics/truckevent, but the last child znode is always missing when Storm runs. I managed to solve this issue once by creating the znode manually; however, the same method no longer works in subsequent tests.
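The /partitions child is normally written by the Kafka broker when the topic is registered, so it is worth inspecting the broker metadata directly rather than recreating znodes by hand. A sketch, assuming a typical HDP path for the Zookeeper client:

```shell
# Inspect the topic metadata that the Storm Kafka spout reads from Zookeeper.
# The zkCli.sh path assumes a standard HDP install.
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server localhost:2181 <<'EOF'
ls /brokers/ids
ls /brokers/topics
get /brokers/topics/truckevent
ls /brokers/topics/truckevent/partitions
EOF
```

If /brokers/ids is empty, no broker is currently registered and the partition znodes will never appear; in that case restart the broker and recreate the topic instead of creating /partitions manually.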

Storm (TruckHBaseBolt is the Java class) failed to get a connection to the HBase tables. From the log file: 2015-05-20 04:22:51 c.h.t.t.TruckHBaseBolt [ERROR] Error retrievinging connection and access to HBase Tables
I had manually created the HBase tables in the data format the bolt expects. However, retrieving the connection to HBase still failed.
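A quick sanity check from the HBase shell confirms that HBase itself is reachable and that the expected tables exist. The table name here is an assumption; substitute whatever the bolt is configured with:

```shell
# Confirm HBase is reachable and the expected tables exist.
# 'truck_events' is a placeholder table name; use the bolt's configured table.
echo "status
list
describe 'truck_events'" | hbase shell
```

If this works but the bolt still fails, a common cause is that hbase-site.xml (with the correct hbase.zookeeper.quorum) is not on the Storm worker classpath, so the bolt falls back to default connection settings.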

Storm (HdfsBolt is the Java class) reported permission denied when the storm user writes data into HDFS. From the log file: 2015-05-20 04:22:43 b.s.util [ERROR] Async loop died! java.lang.RuntimeException: Error preparing HdfsBolt: Permission denied: user=storm, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x
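The error shows the bolt trying to write under /, which is owned by hdfs:hdfs and not writable by the storm user. One fix, sketched below, is to create a dedicated directory owned by storm and point the HdfsBolt's file path at it; the directory name is an assumption:

```shell
# "/" is owned by hdfs:hdfs (drwxr-xr-x), so the storm user cannot write there.
# Create a directory the storm user owns; '/truck-events' is a placeholder.
sudo -u hdfs hdfs dfs -mkdir -p /truck-events
sudo -u hdfs hdfs dfs -chown storm:hdfs /truck-events

# Verify the ownership, then configure the HdfsBolt path to /truck-events.
hdfs dfs -ls /
```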
Can anyone help with this?

