
ERROR while loading BigDecimal column from Hive to HDFS using Pig


I am getting an error while storing data in Avro format using a Pig script. Is there a way to circumvent this problem?

I am reading from a Hive table which has multiple BigDecimal columns:

raw_data = LOAD 'db1.tab1' USING org.apache.hive.hcatalog.pig.HCatLoader();
selected_column_rec = FOREACH raw_data GENERATE id, f_date;

STORE selected_column_rec INTO '/xyz/abcc/ll' USING org.apache.pig.piggybank.storage.avro.AvroStorage('{"schema":{"type":"record","name":"selected_column_rec","fields":[{"name":"id","type":"int"},{"name":"f_date","type":"string"}]}}');

Error message:
Message: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: serious problem
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
370 [ORC_GET_SPLITS #7] ERROR org.apache.hadoop.hive.ql.io.orc.OrcInputFormat -
Unexpected Exception java.lang.NullPointerException at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.setIncludedColumns(OrcInputFormat.java:251)
at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$SplitGenerator.run(OrcInputFormat.java:756)
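
In case it helps, here is a minimal sketch of a workaround I am considering (untested; the column name dec_col is hypothetical, standing in for one of the actual BigDecimal columns in db1.tab1): cast the decimal columns to chararray in the FOREACH so that AvroStorage only has to handle string fields. Since the NPE comes from OrcInputFormat.setIncludedColumns during split generation, I am not sure the cast alone fixes it, but it at least keeps decimal types out of the Avro schema.

-- Sketch of a possible workaround (untested): cast the BigDecimal column to
-- chararray before the STORE so the Avro schema can declare it as a plain string.
-- dec_col is a hypothetical name standing in for a real decimal column in db1.tab1.
raw_data = LOAD 'db1.tab1' USING org.apache.hive.hcatalog.pig.HCatLoader();
selected_column_rec = FOREACH raw_data GENERATE id, f_date, (chararray)dec_col AS dec_col;
STORE selected_column_rec INTO '/xyz/abcc/ll' USING org.apache.pig.piggybank.storage.avro.AvroStorage('{"schema":{"type":"record","name":"selected_column_rec","fields":[{"name":"id","type":"int"},{"name":"f_date","type":"string"},{"name":"dec_col","type":"string"}]}}');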

