Channel: Hortonworks » All Replies

Sandbox Pig Tutorial


Hi, I am running into the following error while trying out the introductory Pig tutorial:

ls: cannot access /hadoop/yarn/local/usercache/hue/appcache/application_1420342244376_0002/container_1420342244376_0002_01_000002/hive.tar.gz/hive/lib/slf4j-api-*.jar: No such file or directory

It would be really helpful if I could have a step-by-step explanation of how to resolve this. Note that I tried to implement the solution provided here:

http://idavit.blogspot.mx/2014/12/como-no-morir-en-el-intento-primer.html

but the copyToLocal command reports that there is no such file as /apps/webhcat/hive.tar.gz.

When I run hadoop fs -ls from the command line, nothing is displayed. I do not know what is wrong.
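For reference, here is a sketch of the checks involved. The HDFS path is taken from the error message and the linked blog post; the local destination /tmp/hive.tar.gz is just an example, and the hadoop invocations are guarded so the snippet is safe to paste even on a box without Hadoop on the PATH:

```shell
# Path the blog post's fix expects on HDFS (taken from the error above);
# adjust if your Sandbox stores the Hive archive somewhere else.
HIVE_TARBALL=/apps/webhcat/hive.tar.gz

if command -v hadoop >/dev/null 2>&1; then
    # Confirm the archive actually exists on HDFS by listing its parent dir.
    # Note: a bare `hadoop fs -ls` with no path lists /user/<you>, which may
    # simply be empty -- that alone does not mean HDFS is broken.
    hadoop fs -ls "$(dirname "$HIVE_TARBALL")"

    # If the archive is present, the blog's copyToLocal step would look like:
    hadoop fs -copyToLocal "$HIVE_TARBALL" /tmp/hive.tar.gz
else
    echo "hadoop command not found; run this on the Sandbox itself" >&2
fi
```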

The Hive part of the tutorial worked just fine. Also, here is my Pig script:

a = LOAD 'default.stocks' USING org.apache.hive.hcatalog.pig.HCatLoader();
b = GROUP a BY stock_symbol;
c = GROUP b ALL;
d = FOREACH c GENERATE stock_symbol, AVG(c.stock_volume);
DUMP d;

I am also passing -useHCatalog as a Pig argument in the arguments field.

