Hadoop Copy to Local
In Hadoop, you can use the hadoop fs -copyToLocal command to copy files or directories from HDFS (Hadoop Distributed File System) to your local file system. This command is particularly useful when you want to retrieve data from HDFS and work with it on your local machine. Here’s how you can use it:
Syntax:
hadoop fs -copyToLocal <source> <destination>
<source>: The HDFS path of the file or directory you want to copy.
<destination>: The local file system path where you want to copy the data.
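Note that -copyToLocal behaves like hadoop fs -get with the destination restricted to the local file system. By default the command fails if the destination file already exists locally; below is a minimal sketch of the overwrite and preserve options (flag support can vary by Hadoop version):

# -f overwrites an existing local copy; -p preserves timestamps,
# ownership, and mode
hadoop fs -copyToLocal -f -p /user/hadoop/input/sample.txt /home/user/local/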
Here are two examples of how to use the hadoop fs -copyToLocal command:
1. To copy a file from HDFS to your local file system:
hadoop fs -copyToLocal /user/hadoop/input/sample.txt /home/user/local/sample.txt
In this example, the file sample.txt is copied from the HDFS path /user/hadoop/input/ to the local path /home/user/local/sample.txt.
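To confirm the copy, you can list the file on both sides (a quick check using the paths from the example above):

hadoop fs -ls /user/hadoop/input/sample.txt   # source file in HDFS
ls -l /home/user/local/sample.txt             # local copy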
2. To copy a directory from HDFS to your local file system:
hadoop fs -copyToLocal /user/hadoop/input/data /home/user/local/
In this example, it copies the entire data directory and its contents from HDFS to the local directory /home/user/local/.

After executing the hadoop fs -copyToLocal command, the specified file or directory from HDFS will be copied to your local file system. You can then access and work with the data on your local machine as needed.
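Like other hadoop fs subcommands, -copyToLocal returns a zero exit code on success and a non-zero code on error, so it is straightforward to script around. A minimal sketch, reusing the directory example above:

if hadoop fs -copyToLocal /user/hadoop/input/data /home/user/local/; then
    echo "Copy completed"
else
    echo "Copy failed" >&2
fi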
Conclusion:
Unogeeks is the No.1 IT Training Institute for Hadoop Training. Does anyone disagree? Please drop a comment.
You can check out our other latest blogs on Hadoop Training here – Hadoop Blogs
Please check out our Best In Class Hadoop Training Details here – Hadoop Training
Follow & Connect with us:
———————————-
For Training inquiries:
Call/Whatsapp: +91 73960 33555
Mail us at: info@unogeeks.com
Our Website ➜ https://unogeeks.com
Follow us:
Instagram: https://www.instagram.com/unogeeks
Facebook: https://www.facebook.com/UnogeeksSoftwareTrainingInstitute
Twitter: https://twitter.com/unogeeks