The Apache Hadoop project also has a contribution section that includes related projects from the wider Hadoop ecosystem.
Hadoop consists of two core components:
HDFS (Hadoop Distributed File System): Provides distributed, fault-tolerant storage across the cluster.
MapReduce: Provides the framework for distributed processing of the data stored in HDFS.
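To make the storage side concrete, the short Java sketch below writes a small file into HDFS and reads it back through the Hadoop FileSystem API. It is only an illustration: the NameNode URI hdfs://namenode-host:9000, the path /user/demo/sample.txt, and the class name HdfsReadWriteExample are placeholder assumptions, not values taken from this article.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReadWriteExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; in a real cluster this usually comes
        // from fs.defaultFS in core-site.xml rather than being hard-coded.
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode-host:9000"), conf);

        Path file = new Path("/user/demo/sample.txt");

        // Write a small file; HDFS splits it into blocks and replicates them across DataNodes.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("Hello from HDFS\n".getBytes("UTF-8"));
        }

        // Read it back; the NameNode supplies block locations, the DataNodes serve the bytes.
        try (BufferedReader in = new BufferedReader(new InputStreamReader(fs.open(file), "UTF-8"))) {
            System.out.println(in.readLine());
        }

        fs.close();
    }
}
```

The client code never talks to storage directly by hostname; it asks the NameNode where blocks live and then streams data to or from the DataNodes, which is what the daemon roles below describe.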
Hadoop comprises five separate daemons: the NameNode, Secondary NameNode, DataNode, JobTracker, and TaskTracker.
The nodes in a Hadoop cluster fall into two categories:
Master Nodes: Run the NameNode, Secondary NameNode, and JobTracker daemons.
Slave Nodes: Run the DataNode and TaskTracker daemons; every slave node runs both of these daemons.
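As a rough illustration of this master/slave split, the sketch below uses the classic (Hadoop 1.x) mapred API to ask the JobTracker for its cluster status, which reports how many TaskTracker daemons are alive on the slave nodes. The address jobtracker-host:8021 and the class name ClusterStatusExample are assumptions made for the example, not details from this article.

```java
import org.apache.hadoop.mapred.ClusterStatus;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class ClusterStatusExample {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf();
        // Assumed JobTracker address; normally read from mapred.job.tracker in mapred-site.xml.
        conf.set("mapred.job.tracker", "jobtracker-host:8021");

        JobClient client = new JobClient(conf);
        ClusterStatus status = client.getClusterStatus();

        // Each live TaskTracker corresponds to one slave node running that daemon.
        System.out.println("Active TaskTrackers:  " + status.getTaskTrackers());
        System.out.println("Map task capacity:    " + status.getMaxMapTasks());
        System.out.println("Reduce task capacity: " + status.getMaxReduceTasks());

        client.close();
    }
}
```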