Hello Himanshu,
In the example you gave, the hadoop fs -cp command interprets both the source and destination paths as belonging to the "default file system". The default file system is configured in core-site.xml through the fs.defaultFS property, and in most installations it is a URL pointing to an HDFS cluster. I assume there is no /path/file.txt in the HDFS cluster, so the command fails with a file-not-found error.
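For reference, a typical core-site.xml entry looks like the sketch below. The hostname and port are placeholders, not values from your cluster; check your own core-site.xml to see what fs.defaultFS is actually set to.

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- Placeholder NameNode address; substitute your cluster's host and port -->
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```

Any path without an explicit scheme (like /path/file.txt) is resolved against this URL.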
If you want to run the command with a local source path, there are two options. One is to use copyFromLocal instead of cp. This treats the source path as belonging to the local file system:
hadoop fs -copyFromLocal /path/file.txt /tmp
The other option is to keep using cp, but with an explicit file: URI to indicate that the source comes from the local file system:
hadoop fs -cp file:/path/file.txt /tmp
I hope this helps.