HDFS get
In the Hadoop Distributed File System (HDFS), the get command copies files or directories from HDFS to the local file system of a machine in your Hadoop cluster. It lets you retrieve data stored in HDFS and save it to a local directory for further analysis or processing. Here is how to use the get command:
hadoop fs -get <source> <destination>
<source>: The HDFS path of the file or directory you want to retrieve.
<destination>: The local path where you want to save the file or directory from HDFS.
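In recent Hadoop releases, -get also accepts a few optional flags, such as -f to overwrite an existing local file and -p to preserve timestamps and permissions. The sketch below only echoes the command it would run; the paths are the hypothetical ones from this tutorial, and executing the real command requires a running HDFS cluster:

```shell
# Hypothetical source and destination paths from the examples in this post.
SRC=/user/hadoop/input/myfile.txt
DEST=/home/user/local/

# Show the full command; -f overwrites, -p preserves metadata.
echo "hadoop fs -get -f -p $SRC $DEST"
# Uncomment to run against a real cluster:
# hadoop fs -get -f -p "$SRC" "$DEST"
```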
Here is an example of using the get command to copy a file from HDFS to the local file system:
hadoop fs -get /user/hadoop/input/myfile.txt /home/user/local/
In this example, /user/hadoop/input/myfile.txt is the source file in HDFS, and /home/user/local/ is the destination directory on the local file system where the file will be copied.
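After the copy, it is worth sanity-checking that the file actually arrived. A minimal sketch, assuming the hypothetical destination path from the example above (on a machine where the copy has not run, it simply reports the file as missing):

```shell
# Hypothetical local path produced by the hadoop fs -get example above.
LOCAL=/home/user/local/myfile.txt

# Report whether the file landed, and its size if it did.
if [ -f "$LOCAL" ]; then
  echo "copied: $LOCAL ($(wc -c < "$LOCAL") bytes)"
else
  echo "missing: $LOCAL"
fi
```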
You can also use the get command to copy entire directories from HDFS to the local file system. For example:
hadoop fs -get /user/hadoop/input/data/ /home/user/local/
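Hadoop also provides -copyToLocal, which behaves like -get except that the destination must be a local file reference. The sketch below, reusing the hypothetical paths from the directory example above, only echoes the equivalent command rather than executing it:

```shell
# -copyToLocal is equivalent to -get for a local destination.
# Paths are the hypothetical ones from the example above.
echo "hadoop fs -copyToLocal /user/hadoop/input/data/ /home/user/local/"
# Afterwards, inspect the copied tree with:
# ls -R /home/user/local/data
```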
Conclusion:
Unogeeks is the No.1 IT Training Institute for Hadoop Training. Anyone disagree? Please drop a comment.
You can check out our other latest blogs on Hadoop Training here – Hadoop Blogs
Please check out our Best In Class Hadoop Training Details here – Hadoop Training
Follow & Connect with us:
———————————-
For Training inquiries:
Call/Whatsapp: +91 73960 33555
Mail us at: info@unogeeks.com
Our Website ➜ https://unogeeks.com
Follow us:
Instagram: https://www.instagram.com/unogeeks
Facebook: https://www.facebook.com/UnogeeksSoftwareTrainingInstitute
Twitter: https://twitter.com/unogeeks