
CLI Commands in Hadoop with Syntaxes

Hadoop CLI Commands

1) hadoop fs: Every Hadoop file system CLI command begins with hadoop fs.

Syntax:- root@localhost# hadoop fs -<command>
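For instance (an illustrative addition, not part of the original list), the built-in -help option prints usage information for the file system commands covered below:

Example:- root@localhost# hadoop fs -help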

2) hadoop fs -ls /: To see all the directories and files at the root of the Hadoop file system.

Syntax:- root@localhost# hadoop fs -ls /

3) hadoop fs -ls /user: To see all the directories and files under /user in the Hadoop file system.

Syntax:- root@localhost# hadoop fs -ls /user

4) hadoop fs -ls /user/root: To see the list of directories and files of the /user/root directory in the Hadoop file system.

Syntax:- root@localhost# hadoop fs -ls /user/root

5) hadoop fs -mkdir: To create a new directory in the Hadoop file system.

a) Syntax:- root@localhost# hadoop fs -mkdir test

The directory test will be created in the default (home) directory of the Hadoop file system, i.e. /user/root.

To check, the command is:

root@localhost# hadoop fs -ls /user/root

Output will be:

drwxr-xr-x - root supergroup 0 2011-07-29 12:25 /user/root/test

b) Syntax:- root@localhost# hadoop fs -mkdir /user/root/test1

The new directory will be created in the specified path.

6) hadoop fs -ls /user/root | grep <dir name>: Used to return only the listing lines that contain the name specified after grep.

Example:- root@localhost# hadoop fs -ls /user/root | grep test

Output: drwxr-xr-x - root supergroup 0 2011-07-09 12:25 /user/root/test

drwxr-xr-x - root supergroup 0 2011-07-09 12:32 /user/root/test1

7) hadoop fs -put: To copy a file from the local file system to the Hadoop file system.

Syntax:- root@localhost# hadoop fs -put <source path> <destination path>

Example:- root@localhost# hadoop fs -put input.txt /user/test

8) hadoop fs -cat: To display the contents of a file.

Syntax:- root@localhost# hadoop fs -cat <file name>
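For instance, assuming input.txt was copied into /user/test with the -put command above (the exact path is illustrative), its contents can be printed with:

Example:- root@localhost# hadoop fs -cat /user/test/input.txt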

9) hadoop fs -lsr: To see the recursive list of files and directories.

Syntax:- root@localhost# hadoop fs -lsr /user

10) hadoop fs -du: To see the size of files.

Syntax:- root@localhost# hadoop fs -du <file name>
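As an illustrative example (the path is assumed from the earlier mkdir step), this reports the size, in bytes, of each item under the test directory:

Example:- root@localhost# hadoop fs -du /user/root/test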

11) hadoop fs -chmod 777: To give full permissions (read, write and execute for owner, group and others) to a file.

Syntax:- root@localhost# hadoop fs -chmod 777 <file name>
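For example, assuming the input.txt file copied earlier (the path is chosen for illustration), this grants full permissions to everyone:

Example:- root@localhost# hadoop fs -chmod 777 /user/test/input.txt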

12) hadoop fs -copyFromLocal: To copy a file from the local file system to HDFS.

Syntax:- root@localhost# hadoop fs -copyFromLocal input.txt /user/root/test

13) hadoop fs -get: To get files from HDFS to the local file system.

Syntax:- root@localhost# hadoop fs -get <source path> <destination path>
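A sketch of the usage, with paths assumed from the -copyFromLocal example above: copy input.txt from the HDFS test directory into the current local directory:

Example:- root@localhost# hadoop fs -get /user/root/test/input.txt .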

14) hadoop fs -copyToLocal: To copy files from HDFS to the local file system.

Syntax:- root@localhost# hadoop fs -copyToLocal <HDFS path> <local path>
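Similarly, as an assumed example, the same file can be copied to /tmp on the local machine:

Example:- root@localhost# hadoop fs -copyToLocal /user/root/test/input.txt /tmp/input.txt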

15) hadoop fs -moveFromLocal: To move files from the local file system to HDFS (the local copy is removed after the move).

Ex: 1. #hadoop fs -moveFromLocal test/file1.txt hdfstest

2. #hadoop fs -moveFromLocal test/* hdfstest

16) hadoop fs -moveToLocal: Not supported, because many replicas of the file are present on HDFS, so moving is not possible.

17) hadoop fs -rm: To remove files from a directory, i.e. same as in the local file system.

Ex: a. #hadoop fs -rm hdfstest/*

Removes all files from the directory.

b. #hadoop fs -rm hdfstest/file1.txt

Removes the file file1.txt from the directory.

18) hadoop fs -rmr: To remove a directory recursively.

Ex: #hadoop fs -rmr hdfstest

(hdfstest is the directory name)

19) hadoop fs -mv: Used to move files from one directory to another directory in HDFS; after the move, the files are no longer available in the source directory.
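As a sketch of the usage (the directory names are assumed for illustration), this moves every file from hdfstest1 into hdfstest:

Ex: #hadoop fs -mv hdfstest1/* hdfstest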

20) hadoop fs -cp: To copy files from one directory to another directory.

Ex: #hadoop fs -cp hdfstest1/* hdfstest

(copies all files from hdfstest1 into hdfstest)

21) hadoop fs -touchz: It will create an empty (zero-length) dummy file.

Ex: #hadoop fs -touchz hdfstest/file1.txt

(file1.txt is created as an empty file in hdfstest)
