Failed with exception copyFiles

Hi everyone,

I created a Hive table stored as ORC files, which I populate with INSERT INTO TABLE ... SELECT ... queries.
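
For context, here is roughly what the setup looks like. This is only a sketch: the database, table, and partition values come from the error below, while the column names and the source table are placeholders for my real schema.

-- Sketch only: seram_offload.ds_frequency and the date=20150421 partition
-- match the error below; the columns and the source table are placeholders.
CREATE TABLE seram_offload.ds_frequency (
  device_id STRING,   -- placeholder column
  frequency DOUBLE    -- placeholder column
)
PARTITIONED BY (`date` STRING)
STORED AS ORC;

INSERT INTO TABLE seram_offload.ds_frequency PARTITION (`date`='20150421')
SELECT device_id, frequency
FROM seram_offload.staging_events;  -- placeholder source table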

Since last week, the query has been failing with this message: Failed with exception copyFiles

I double-checked, and this is not a permissions issue.
Does anyone have another idea about what could cause this exception?
Here is the complete stack trace; I can’t see any extra information in it.

Error: java.io.IOException: File copy failed: hdfs://savid-bigdata1.tecteo.intra:8020/user/hive/tmp/hive/0e77f03a-f491-4f05-88e4-7788d3f91243/hive_2015-05-04_16-16-52_140_7224958539274766118-1/-ext-10000/date=20150421/000000_0 --> hdfs://<SERVER_URL>:8020/apps/hive/warehouse/seram_offload.db/ds_frequency/date=20150421/000000_0
at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:284)
at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:252)
at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:50)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hdfs://savid-bigdata1.tecteo.intra:8020/user/hive/tmp/hive/0e77f03a-f491-4f05-88e4-7788d3f91243/hive_2015-05-04_16-16-52_140_7224958539274766118-1/-ext-10000/date=20150421/000000_0 to hdfs://<SERVER_URL>:8020/apps/hive/warehouse/seram_offload.db/ds_frequency/date=20150421/000000_0
at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:280)
... 10 more
Caused by: java.io.IOException: Check-sum mismatch between hdfs://savid-bigdata1.tecteo.intra:8020/user/hive/tmp/hive/0e77f03a-f491-4f05-88e4-7788d3f91243/hive_2015-05-04_16-16-52_140_7224958539274766118-1/-ext-10000/date=20150421/000000_0 and hdfs://savid-bigdata1.tecteo.intra:8020/apps/hive/warehouse/seram_offload.db/ds_frequency/date=20150421/.distcp.tmp.attempt_1430743058863_0006_m_000000_0. Source and target differ in block-size. Use -pb to preserve block-sizes during copy. Alternatively, skip checksum-checks altogether, using -skipCrc. (NOTE: By skipping checksums, one runs the risk of masking data-corruption during file-transfer.)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.compareCheckSums(RetriableFileCopyCommand.java:211)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:131)
at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:100)
at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
... 11 more

When I run an INSERT INTO ... SELECT ... LIMIT 3 it works, but it fails with more data (without the LIMIT).
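
Concretely, with the same placeholder names as in the sketch above:

-- Works: only 3 rows, so only a tiny file is written.
INSERT INTO TABLE seram_offload.ds_frequency PARTITION (`date`='20150421')
SELECT device_id, frequency
FROM seram_offload.staging_events
LIMIT 3;

-- Fails with "Failed with exception copyFiles":
INSERT INTO TABLE seram_offload.ds_frequency PARTITION (`date`='20150421')
SELECT device_id, frequency
FROM seram_offload.staging_events;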

Finally, I just enabled NameNode HA. Could this be related?

Thank you in advance, guys.

Cheers,
Orlando

