
Sqoop 1.4.5 import into a Hive 1.0.0 transactional table using ORC format


Hi,

Specs:
=====
Hive 1.0.0
Sqoop 1.4.5
Hadoop 2.6.0

I created an HCat table that meets the requirements for transactional tables:
- The table must be declared with the transactional property
- The table must be stored in ORC format
- The table must be bucketed

hcat -e "create table if not exists test (id int, val string) partitioned by (yearofstart string, monthofstart string, dayodfatart string) clustered by (id) into 7 buckets stored as orc TBLPROPERTIES('transactional'='true');"

Table is created as:
hive> show create table test;
OK
CREATE TABLE test(
id int,
val string)
PARTITIONED BY (
yearofstart string,
monthofstart string,
dayodfatart string)
CLUSTERED BY (
id)
INTO 7 BUCKETS
ROW FORMAT SERDE
'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
'hdfs://xxxx-D912.xxxx.co.in:9000/user/hive/warehouse/test'
TBLPROPERTIES (
'transactional'='true',
'transient_lastDdlTime'='1426168607')
Time taken: 0.164 seconds, Fetched: 21 row(s)
hive>
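As a quick sanity check (assuming the table is in the default database), the transactional setting also shows up under Table Parameters in:

hive -e "describe formatted test;"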

Next, I am trying to do a Sqoop import using:
sqoop import --connect jdbc:teradata://xxx.xxx.xxx.xxx/testdb --driver com.teradata.jdbc.TeraDriver --table test --username user1 --password pwd1 --hcatalog-table test --hcatalog-storage-stanza "stored as orc" -m 1

Getting this error:

15/03/12 19:33:22 ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hive.hcatalog.common.HCatException : 2016 : Error operation not supported : Store into a partition with bucket definition from Pig/Mapreduce is not supported
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:109)
at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.setOutput(HCatOutputFormat.java:70)
at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:339)
at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:753)
at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:240)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:665)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

Referring to http://mail-archives.apache.org/mod_mbox/pig-user/201312.mbox/%3CF1CBA1B8-DF38-444C-BFD3-5DC42690ED12@hortonworks.com%3E

Does HCatalog not support bucketing (which is a prerequisite for transactional tables in Hive 1.0.0)?
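For context, since HCatOutputFormat rejects writes into bucketed partitions, one workaround that is sometimes suggested is to land the data in a plain, non-transactional staging table with Sqoop, then move it into the transactional table from Hive (which does handle the bucketing itself). A minimal sketch, assuming a hypothetical staging table test_staging and placeholder partition values; the ACID settings (e.g. hive.txn.manager=org.apache.hadoop.hive.ql.lockmgr.DbTxnManager) would also need to be configured:

hcat -e "create table if not exists test_staging (id int, val string) stored as orc;"

sqoop import --connect jdbc:teradata://xxx.xxx.xxx.xxx/testdb --driver com.teradata.jdbc.TeraDriver --table test --username user1 --password pwd1 --hcatalog-table test_staging --hcatalog-storage-stanza "stored as orc" -m 1

hive -e "insert into table test partition (yearofstart='2015', monthofstart='03', dayodfatart='12') select id, val from test_staging;"

Here '2015', '03', '12' are placeholder partition values; in practice the staging table would need to carry (or the query compute) the real partition columns.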

Thanks,
-Nirmal

