Hadoop Copy From Local to HDFS
To copy files or directories from your local file system to the Hadoop Distributed File System (HDFS), you can use the hadoop fs command with the -copyFromLocal option. This command uploads files from your local machine to an HDFS directory. Here’s the basic syntax:
hadoop fs -copyFromLocal <local-source> <hdfs-destination>
<local-source>: Specifies the path to the file or directory on your local file system that you want to copy to HDFS.
<hdfs-destination>: Specifies the destination directory in HDFS where you want to store the file or directory.
For example, to copy a local file named example.txt from your local machine to the /user/hadoop/data/ directory in HDFS, you would use the following command:
hadoop fs -copyFromLocal example.txt /user/hadoop/data/
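To confirm the upload succeeded, you can list the destination directory; this sketch assumes the /user/hadoop/data/ path from the example above:
hadoop fs -ls /user/hadoop/data/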
Make sure that you have the necessary permissions to write to the specified destination directory in HDFS. Note that -copyFromLocal does not create missing directories along the destination path; if the destination directory doesn’t exist, create it first with hadoop fs -mkdir -p.
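For instance, assuming the /user/hadoop/data directory does not yet exist in HDFS, you could create it and then copy the file:
hadoop fs -mkdir -p /user/hadoop/data
hadoop fs -copyFromLocal example.txt /user/hadoop/data/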
You can also copy entire directories from your local file system to HDFS by specifying the directory’s path as the <local-source>. For example:
hadoop fs -copyFromLocal /path/to/local/directory /user/hadoop/data/
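If a file with the same name already exists at the destination, the copy fails by default; the -f option overwrites it. A minimal sketch, reusing the example.txt file from above:
hadoop fs -copyFromLocal -f example.txt /user/hadoop/data/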
Conclusion:
Unogeeks is the No.1 IT Training Institute for Hadoop Training. Anyone disagree? Please drop a comment.
You can check out our other latest blogs on Hadoop Training here – Hadoop Blogs
Please check out our Best In Class Hadoop Training Details here – Hadoop Training