This blog discusses Oozie and how it can be used to manage Hadoop workloads, beginning with the components required to install and run Oozie.
To install and run Oozie using an embedded Tomcat server and an embedded Derby database, you need:
Unix (tested on Linux and Mac OS X)
Java 1.6+
Hadoop 0.20 and 1.0.0
ExtJS 2.2 library (optional, to enable the Oozie web console)
Note: The Java 1.6+ bin directory should be in the command path.
Note: The ExtJS library is not bundled with Oozie because it uses a different license. It is recommended to use a dedicated Unix user (e.g., oozie) to run the Oozie server.
The oozie-setup.sh, oozie-start.sh, oozie-run.sh, and oozie-stop.sh scripts all run only under the Unix user that owns the Oozie installation directory.
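Because all four scripts must be invoked by the owning user, a quick pre-flight check can save confusion. Below is a minimal sketch (not from the original article); the install path and the `mkdir` stand-in are assumptions for illustration:

```shell
# Assumed install path; the mkdir is only a stand-in for a real install.
OOZIE_HOME=${OOZIE_HOME:-$HOME/oozie}
mkdir -p "$OOZIE_HOME"

# Compare the directory's owner with the current user before running
# oozie-start.sh / oozie-stop.sh etc.
owner=$(ls -ld "$OOZIE_HOME" | awk '{print $3}')
if [ "$owner" = "$(id -un)" ]; then
  echo "ok to run the oozie scripts"
else
  echo "switch to user '$owner' first"
fi
```

If the owners differ, switch to the owning user (for example with `su`) before starting or stopping the server.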
Use the oozie-setup.sh script to add the Hadoop JARs and the ExtJS library to Oozie:
$ bin/oozie-setup.sh -hadoop 0.20.200 ${HADOOP_HOME} -extjs /tmp/ext-2.2.zip
To start Oozie as a daemon process, run:
$ bin/oozie-start.sh
To start Oozie as a foreground process, run:
$ bin/oozie-run.sh
Check the log file logs/oozie.log to ensure Oozie started properly.
To check the status of Oozie using the Oozie command-line tool:
$ bin/oozie admin -oozie http://localhost:11000/oozie -status
The Oozie status should be NORMAL; it can also be verified from the Oozie web console.
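For scripting, the same check can be automated. The sketch below parses a captured status line; the `STATUS` value is sample text standing in for the CLI's output, not live data:

```shell
# Sketch of a scripted health check. STATUS is sample text here; in a live
# setup it would come from something like:
#   STATUS=$(bin/oozie admin -oozie http://localhost:11000/oozie -status)
STATUS="System mode: NORMAL"

case "$STATUS" in
  *NORMAL*) echo "oozie is healthy" ;;
  *)        echo "oozie is not ready" >&2; exit 1 ;;
esac
```

Exiting non-zero on anything other than NORMAL makes the check easy to drop into a cron job or deployment script.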
Copy and expand the oozie-client TAR.GZ file bundled with the distribution.
Add the bin/ directory to the PATH.
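For example (the install location /opt/oozie-client is an assumption; use wherever you expanded the client tarball):

```shell
# Assumed location of the expanded oozie-client tarball.
OOZIE_CLIENT_HOME=/opt/oozie-client

# Append its bin/ directory to the PATH so the `oozie` CLI resolves
# from any directory.
export PATH="$PATH:$OOZIE_CLIENT_HOME/bin"
```

Add the export line to your shell profile (e.g., ~/.bashrc) to make it permanent.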
Note: The Oozie server installation already includes the Oozie client; the standalone client needs to be installed only on remote machines.
Expand the oozie-sharelib TAR.GZ file bundled with the distribution.
The share/ directory must be copied to the Oozie HOME directory in HDFS:
$ hadoop fs -put share share
Note: This must be done as the Oozie Hadoop (HDFS) user, and if the share directory already exists in HDFS, it must be deleted before copying it again.
Ravindra Savaram is a Technical Lead at Mindmajix.com. His passion lies in writing articles on the most popular IT platforms including Machine learning, DevOps, Data Science, Artificial Intelligence, RPA, Deep Learning, and so on. You can stay up to date on all these technologies by following him on LinkedIn and Twitter.