Hello Experts,
We are upgrading HDP from version 2.0 to 2.1 and are facing a problem running an Oozie Hive action. The script contains a simple INSERT operation and works fine through the Hive shell or via hive -f. But when I run the same script through an Oozie workflow, I get the following error:
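For reference, the Hive action is configured roughly like this (a minimal sketch only — the element names follow the standard Oozie hive-action schema, but the action name, script name, and properties here are placeholders, not our actual workflow.xml):

```xml
<!-- Sketch of the failing Hive action; names and paths are placeholders -->
<action name="hive-insert">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <!-- insert.hql holds the same INSERT statement that succeeds via hive -f -->
        <script>insert.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>
```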
4991 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
5103 [main] ERROR org.apache.hadoop.hive.ql.Driver - FAILED: IllegalArgumentException Error: ',', ':', or ';' expected at position 33 from 'varchar(100):varchar(100):decimal(15,5):timestamp:int:varchar(20):varchar(50):varchar(20):varchar(20):varchar(100)' [0:varchar, 7:(, 8:100, 11:), 12::, 13:varchar, 20:(, 21:100, 24:), 25::, 26:decimal, 33:(, 34:15, 36:,, 37:5, 38:), 39::, 40:timestamp, 49::, 50:int, 53::, 54:varchar, 61:(, 62:20, 64:), 65::, 66:varchar, 73:(, 74:50, 76:), 77::, 78:varchar, 85:(, 86:20, 88:), 89::, 90:varchar, 97:(, 98:20, 100:), 101::, 102:varchar, 109:(, 110:100, 113:)]
java.lang.IllegalArgumentException: Error: ',', ':', or ';' expected at position 33 from 'varchar(100):varchar(100):decimal(15,5):timestamp:int:varchar(20):varchar(50):varchar(20):varchar(20):varchar(100)' [0:varchar, 7:(, 8:100, 11:), 12::, 13:varchar, 20:(, 21:100, 24:), 25::, 26:decimal, 33:(, 34:15, 36:,, 37:5, 38:), 39::, 40:timestamp, 49::, 50:int, 53::, 54:varchar, 61:(, 62:20, 64:), 65::, 66:varchar, 73:(, 74:50, 76:), 77::, 78:varchar, 85:(, 86:20, 88:), 89::, 90:varchar, 97:(, 98:20, 100:), 101::, 102:varchar, 109:(, 110:100, 113:)]
at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseTypeInfos(TypeInfoUtils.java:312)
at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils.getTypeInfosFromTypeString(TypeInfoUtils.java:716)
at org.apache.hadoop.hive.serde2.lazy.LazyUtils.extractColumnInfo(LazyUtils.java:364)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initSerdeParams(LazySimpleSerDe.java:288)
at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:187)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:218)
at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:272)
at org.apache.hadoop.hive.ql.metadata.Table.checkValidity(Table.java:175)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:991)
Note that position 33 in the type string above is the opening parenthesis of decimal(15,5). Is this because of the changes to the Hive DECIMAL data type (it now takes precision and scale parameters)? Is there any workaround? Please advise, as we are stuck mid-way through a production upgrade.
Thanks,
Gaurav