Balaswamy Vaddeman
268 followers - married to bigdata

Posts

How to clear Hortonworks administrator (HDPCA) certification?
In Hadoop administration, we have three certifications, provided by MapR (MCCA), Cloudera (CCA), and Hortonworks (HDPCA). I have cleared the Hortonworks administrator certification (HDPCA) and have also helped many of my friends clear the Hortonworks administrato...

Importing data from RDBMS to Hadoop using Apache Sqoop
In this article, we will learn how to import data from an RDBMS into HDFS using Apache Sqoop; here we will import data from Postgres. 1) Check that Postgres is running. We will use the service command to check the status of Postgres. The command belo...
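As a rough sketch of the kind of import the article walks through, assuming a Postgres database named salesdb with a customers table (host, database, user, and paths below are placeholders):

  # Check that the Postgres service is running
  service postgresql status

  # Import one table from Postgres into HDFS; -P prompts for the password
  sqoop import \
    --connect jdbc:postgresql://dbhost:5432/salesdb \
    --username sqoop_user \
    -P \
    --table customers \
    --target-dir /user/sqoop_user/customers \
    --num-mappers 1

The import writes one output file per mapper under the target directory; --num-mappers 1 keeps the example to a single file.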

Enabling NameNode HA using Apache Ambari
In this article we will learn how to enable high availability for the NameNode. NameNode high availability means there is more than one NameNode. One of the NameNodes will be active and will be responsible for serving user requests. The other NameNodes will be in stand by...
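Once the Ambari wizard finishes, the state of each NameNode can be confirmed from the command line; nn1 and nn2 are the default NameNode service IDs and may be named differently in your cluster:

  # Report which NameNode is active and which is standby
  hdfs haadmin -getServiceState nn1
  hdfs haadmin -getServiceState nn2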

Enabling Resource Manager HA using Ambari
In this article, we will learn how to enable Resource Manager (RM) high availability (HA) using Apache Ambari. With Resource Manager high availability, the Hadoop cluster will have two or more Resource Managers. One Resource Manager will be active and the other reso...
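After the wizard completes, the active/standby state of each Resource Manager can be checked in the same way as for the NameNodes; rm1 and rm2 are the default RM IDs and may differ in your cluster:

  # Report the HA state of each Resource Manager
  yarn rmadmin -getServiceState rm1
  yarn rmadmin -getServiceState rm2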

Log files in the Hadoop ecosystem
In this article, we will learn how to check the log files of Hadoop daemons and how to read the log files of applications and jobs. 1) Locate the log directory in Apache Ambari. First we need to know the log directory for the Hadoop daemons. We can use Apache Ambari to find...
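Two commands that go with this, assuming the usual HDP log location under /var/log/hadoop (the exact directory is whatever Ambari shows for the daemon) and a made-up application ID:

  # Tail the NameNode log on the NameNode host; the file name also
  # contains the local hostname
  tail -f /var/log/hadoop/hdfs/hadoop-hdfs-namenode-*.log

  # Fetch the aggregated logs of a finished YARN application
  yarn logs -applicationId application_1234567890123_0001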

Enabling rack awareness for Hadoop cluster
In this article, we will learn how to enable rack awareness in Hadoop clusters. Assume that the cluster has a large number of nodes and the nodes are placed in more than one rack. If we enable rack awareness, the replicas of a block will not all be stored in one rack, so that we ca...
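Under the hood (and outside Ambari's rack assignment screen), rack awareness comes down to a topology script referenced by the net.topology.script.file.name property in core-site.xml. A minimal sketch, where the subnets and rack names are invented:

  #!/bin/bash
  # topology.sh: print a rack path for every node address Hadoop passes in
  for node in "$@"; do
    case "$node" in
      10.0.1.*) echo "/rack1" ;;
      10.0.2.*) echo "/rack2" ;;
      *)        echo "/default-rack" ;;
    esac
  done

The mapping the NameNode ends up with can be checked with hdfs dfsadmin -printTopology.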

Creating and configuring home directory for a user in HDFS.
In this article, we will learn how to create a home directory in HDFS for a new user. Every user should have a home directory in HDFS if he/she wants to access HDFS. Some Hadoop jobs use the user's home directory to store intermediate/temporary data. Jobs will fa...
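A minimal sketch of the commands involved, run as the HDFS superuser; the user name newuser and the group are placeholders:

  # Create the home directory and hand it over to the new user
  sudo -u hdfs hdfs dfs -mkdir /user/newuser
  sudo -u hdfs hdfs dfs -chown newuser:hdfs /user/newuser
  sudo -u hdfs hdfs dfs -chmod 755 /user/newuser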

Decommissioning of Node manager in Hadoop cluster
In this article, we will learn how to decommission node managers in a Hadoop cluster. The decommissioning process ensures that running jobs are moved to other node managers without failing them. 1) Check the Ambari UI. If you are using HDP (Hortonwo...
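Outside Ambari, the same thing is done by listing the host in the YARN exclude file (the file referenced by yarn.resourcemanager.nodes.exclude-path in yarn-site.xml; the path below is a typical default) and refreshing the node list:

  # Add the host to the exclude file and tell the Resource Manager to re-read it
  echo "nm-host.example.com" >> /etc/hadoop/conf/yarn.exclude
  yarn rmadmin -refreshNodes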

Decommissioning datanodes in Hadoop cluster
In this article, we will learn how to decommission datanodes in a Hadoop cluster. Decommissioning a datanode ensures that its data is transferred to other nodes, so the existing replication factor is not disturbed. 1) Check NameNode...
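The manual equivalent is the HDFS exclude file (referenced by dfs.hosts.exclude in hdfs-site.xml; the path below is a typical default) followed by a refresh; progress can then be watched until the node shows as Decommissioned:

  # Add the host to the exclude file and ask the NameNode to re-read the host lists
  echo "dn-host.example.com" >> /etc/hadoop/conf/dfs.exclude
  hdfs dfsadmin -refreshNodes

  # Watch decommissioning progress
  hdfs dfsadmin -report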

Installation and configuration of Ranger using Apache Ambari
In this article, we will learn how to install and configure Apache Ranger using Apache Ambari. Apache Ranger is a security technology that provides policy-based security for the Hadoop ecosystem tools. 1) Log in to the Ambari GUI. 2) Click on Add Service. Clic...
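Once the wizard finishes, a quick way to confirm that Ranger Admin is up is to hit its REST API; 6080 is the default Ranger Admin port, admin/admin are the stock credentials (change them), and the host name is a placeholder:

  # List the services registered in Ranger via its public REST API
  curl -u admin:admin http://ranger-host.example.com:6080/service/public/v2/api/service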