
I want to upload and download files in Hadoop, and to store the files on a server or in a multi-node cluster.

user11236

2 Answers


To put files onto HDFS, use:

 hadoop fs -put /<local machine path> /<hdfs path>

and to get files from HDFS, use:

 hadoop fs -get /<hdfs path> /<local machine path>

For more information, see the Hadoop FileSystem Shell documentation: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html
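The two commands above can be combined into a short session. This is a sketch; the directory `/data` and the file `/tmp/report.txt` are hypothetical examples, and the commands assume a configured Hadoop client on the PATH:

```shell
# Create a target directory on HDFS (-p creates parent directories, like mkdir -p)
hadoop fs -mkdir -p /data

# Upload a local file to HDFS
hadoop fs -put /tmp/report.txt /data/report.txt

# Verify that the file arrived
hadoop fs -ls /data

# Download the file back to the local filesystem under a new name
hadoop fs -get /data/report.txt /tmp/report-copy.txt
```

Note that `-put` fails if the destination file already exists; pass `-f` to overwrite it.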

Vikas Hardia
  • Provided link is broken. Nice example why it is recommended to post at least a tl;dr version of the link. – Dror Jul 14 '15 at 15:19
  • @Dror link is updated now thanks for the feedback – Vikas Hardia Jul 20 '15 at 14:15
  • This is the same link as in the answer, but pointing to "current" instead of the specific version "r2.7.0": http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html – michael Aug 20 '16 at 12:53

To copy a file from the local filesystem to HDFS, use this command:

hadoop fs -copyFromLocal /temporaryfile.txt hdfs://DFCMSUSEHDSL03.pbi.global.pvt/sunil/temporaryfile.txt

OR

hadoop fs -copyFromLocal /temporaryfile.txt hdfs://DFCMSUSEHDSL03.pbi.global.pvt:8020/sunil/temporaryfile.txt

OR

hadoop fs -put /temporaryfile.txt /sunil/temporaryfile.txt
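For the download direction, the counterparts of `-copyFromLocal` and `-put` are `-copyToLocal` and `-get`. A sketch, reusing the same example paths from the answer (the local destination `/tmp` is a hypothetical choice):

```shell
# Download from HDFS to the local filesystem
hadoop fs -copyToLocal /sunil/temporaryfile.txt /tmp/temporaryfile.txt

# OR

hadoop fs -get /sunil/temporaryfile.txt /tmp/temporaryfile.txt
```

The fully qualified form `hdfs://<namenode host>:8020/...` works here too, just as in the upload examples above.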
user271418