
Decimal data type not supported in SPARK->HiveQL?


I’m running the Spark “technical preview” (http://hortonworks.com/hadoop-tutorial/using-apache-spark-hdp/) on a cluster running HDP 2.1.7.

I can run basic Hive queries via a HiveContext in pyspark or spark-shell, but when I query a table that has ‘decimal’ columns (even a simple SELECT * FROM table), I get an error like this:

14/12/04 11:32:55 INFO parse.ParseDriver: Parse Completed
java.lang.RuntimeException: Unsupported dataType: decimal(19,6)
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.hive.HiveMetastoreTypes$.toDataType(HiveMetastoreCatalog.scala:233)
at org.apache.spark.sql.hive.MetastoreRelation$SchemaAttribute.toAttribute(HiveMetastoreCatalog.scala:308)
at org.apache.spark.sql.hive.MetastoreRelation$$anonfun$9.apply(HiveMetastoreCatalog.scala:318)
at org.apache.spark.sql.hive.MetastoreRelation$$anonfun$9.apply(HiveMetastoreCatalog.scala:318)
…..
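
For reference, here is roughly how I’m triggering it from pyspark; the table and column names below are placeholders, but the real column is declared as decimal(19,6):

from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="decimal-test")
hive = HiveContext(sc)

# Queries against tables with no decimal columns work fine:
hive.sql("SELECT * FROM plain_table").take(5)

# Any table with a decimal column fails while Spark is converting the
# metastore schema (HiveMetastoreTypes.toDataType in the trace above),
# before a single row is read:
hive.sql("SELECT * FROM table_with_decimal").take(5)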

Is decimal not a supported data type for this scenario?

Can we not use Spark with our application unless we change the data types?
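
One workaround we’re considering (it gives up exact decimal semantics, and all names here are placeholders) is to materialize a copy of the table with the decimal columns cast to DOUBLE, created from the Hive CLI so that only Hive itself ever reads the decimal schema, then point Spark at the copy:

-- In the hive shell, not through Spark:
CREATE TABLE my_table_dbl AS
SELECT id, CAST(amount AS DOUBLE) AS amount
FROM my_table;

# Then in pyspark the copy has no decimal columns, so the
# HiveContext can build its schema without error:
hive.sql("SELECT * FROM my_table_dbl").take(5)

Obviously DOUBLE can’t represent every decimal(19,6) value exactly, so this only helps where approximate values are acceptable.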


