Apache Pig is meant for processing huge amounts of data stored in HDFS.
The processing is carried out in Apache Pig by applying transformation operators such as LOAD, FILTER, and FOREACH…GENERATE.
Hence, Apache Pig is often called a transformation language or a data flow language.
The data passes through a series of such transformations to achieve the desired functionality.
Note: Apache Pig is an abstraction layer, a high-level language on top of Hadoop, as every Pig statement is internally converted into one or more MapReduce jobs.
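A minimal sketch of such a data flow in Pig Latin (the file name and field names here are hypothetical, assumed for illustration):

```pig
-- Load a tab-separated file from HDFS with three fields.
emps = LOAD 'employees.txt' AS (name:chararray, age:int, salary:float);

-- Keep only the records that satisfy a condition.
adults = FILTER emps BY age >= 18;

-- Project just the columns we need.
result = FOREACH adults GENERATE name, salary;

-- Print the transformed relation to the console.
DUMP result;
```

Each statement defines a relation that feeds the next one, which is why Pig is described as a data flow language.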
1. In MapReduce, to process data we have to write the driver code, the mapper code, and the reducer code (if required), irrespective of the business logic being applied, whereas in Apache Pig the same functionality can be achieved using a scripting language with far fewer lines of code.
2. MapReduce expects Java programming skills, whereas in Apache Pig even a non-Java programmer can write jobs using simple scripting.
3. Roughly 200 lines of MapReduce code can be expressed in about 10 lines of Pig code.
4. In MapReduce, we have to follow a multi-step development process (compiling the code, packaging it as a JAR, and deploying it to the cluster), whereas in Apache Pig it is very easy to run a script without all these steps.
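To illustrate point 3, the classic word-count job, which takes a full mapper, reducer, and driver class in Java MapReduce, fits in a handful of Pig Latin statements ('input.txt' and the output path are hypothetical):

```pig
-- Load each line of the input file as a single string.
lines  = LOAD 'input.txt' AS (line:chararray);

-- Split every line into words, one word per record.
words  = FOREACH lines GENERATE FLATTEN(TOKENIZE(line)) AS word;

-- Group identical words together and count each group.
grouped = GROUP words BY word;
counts  = FOREACH grouped GENERATE group AS word, COUNT(words) AS cnt;

-- Write the (word, count) pairs back to HDFS.
STORE counts INTO 'wordcount_output';
```

Pig compiles this script into the equivalent MapReduce jobs behind the scenes, so the developer never writes mapper or reducer classes by hand.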
Pig runs as a client-side application.
If you want to run Pig on a Hadoop cluster, there is nothing extra to install on the cluster itself: Pig launches jobs and interacts with HDFS (or other Hadoop file systems) from your workstation.
Installation is straightforward, and Java 6 is a prerequisite.
Download a stable release from http://pig.apache.org/release.html and unpack the tarball in a suitable place on your workstation:
% tar xzf pig-x.y.z.tar.gz
It is convenient to add Pig's bin directory to your command-line path. For example:
% export PIG_INSTALL=/home/tom/pig-x.y.z
% export PATH=$PATH:$PIG_INSTALL/bin
You also need to set the JAVA_HOME environment variable to point to a suitable Java installation.
Run the command pig -help to get usage instructions.