Okay, I fixed the "permission denied" issue using the hostname command. Vagrant had provisioned the hostnames, but they were wrong; I corrected them so that /etc/hosts now matches what the hostname command reports, and that got rid of the permission denied error. Following the same steps from my first post, I still get the same "Warnings Encountered", but now it's a completely different problem:
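For reference, this is roughly the sanity check I ran to confirm the fix, a minimal sketch assuming the Hadoop daemons resolve their own name through /etc/hosts:

```shell
# Sketch: verify that what `hostname` reports actually appears in /etc/hosts.
# Assumption: Hadoop services resolve the node's own name via /etc/hosts,
# so a mismatch here is what caused the earlier permission denied error.
name=$(hostname)
echo "hostname reports: $name"
if grep -q "$name" /etc/hosts; then
  echo "$name is present in /etc/hosts"
else
  echo "WARNING: $name is missing from /etc/hosts"
fi
```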
...
Fail: Execution of 'hadoop --config /etc/hadoop/conf fs -rm /tmp/id007f0100_date020814; hadoop fs -put /etc/passwd /tmp/id007f0100_date020814' returned 1. rm: `/tmp/id007f0100_date020814': No such file or directory
14/09/08 13:04:06 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /tmp/id007f0100_date020814._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
...
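The "could only be replicated to 0 nodes" message says the NameNode sees no live DataNodes, so the put fails before replication can happen. A hedged sketch of how I'd check that (assuming the hdfs CLI is installed on the node; on machines without it the script just reports that):

```shell
# Sketch: confirm how many DataNodes the NameNode can see.
# Assumption: the `hdfs` CLI is on PATH on a cluster node; if not,
# fall back to a placeholder message instead of failing.
if command -v hdfs >/dev/null 2>&1; then
  report=$(hdfs dfsadmin -report)
  echo "$report" | grep -i "datanodes"
else
  report="hdfs CLI not available on this machine"
  echo "$report"
fi
```

If the report shows 0 live DataNodes, the next step is to look at the DataNode logs on the worker hosts; a stale hostname in the DataNode's registration is a common cause after renaming nodes.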