Must-have skills: Hadoop, Linux, RHEL, Ambari/Cloudera deployment platforms, Big Data, HDFS
Responsibilities:
- 3+ years of experience administering Hadoop.
- Installing HDP/CDP in a Linux environment.
- Experience with on-premise applications, with a thorough working knowledge of Linux commands.
- Deploying and maintaining Hadoop clusters.
- Performing health checks on the Hadoop cluster and monitoring that it is up and running at all times.
- Analysing storage data volumes and allocating space in HDFS.
- Managing resources in a cluster environment, including adding new nodes and removing unused ones.
- Configuring the NameNode to ensure its high availability.
- Implementing and administering Hadoop infrastructure on an ongoing basis.
- Deploying required hardware and software in the Hadoop environment, as well as expanding existing environments.
- Installing and configuring software.
- Understanding the networking and security layers of the application and the network.
- Monitoring performance and fine-tuning the cluster regularly.
- Managing and optimizing disk space for data handling.
- Installing patches and upgrading software as and when needed.
- Automating manual tasks to improve efficiency.
- Creating Linux users for Hadoop and its ecosystem components; setting up Kerberos principals is also part of Hadoop administration.
- Monitoring the connectivity and security of the Hadoop cluster.
- Managing and reviewing log files in Hadoop.
- Managing and monitoring the HDFS file system.
- Communicating with other development, administration, and business teams, including the infrastructure, application, network, database, DS, and business intelligence teams.
- Coordinating with application teams and installing operating system and Hadoop-related updates as and when required.
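The health-check and HDFS space-management duties above are typically carried out with standard Hadoop command-line tools. A minimal sketch, assuming a running Hadoop 3.x cluster and an authenticated HDFS admin user (the directory path `/data/project` is a hypothetical example):

```shell
# Report overall cluster health: live/dead DataNodes, capacity, remaining space
hdfs dfsadmin -report

# Check HDFS for missing or corrupt blocks
hdfs fsck / -files -blocks

# Analyse storage volume per top-level directory before allocating space
hdfs dfs -du -h /

# Allocate space to a project directory by setting a space quota (e.g. 10 TB)
hdfs dfsadmin -setSpaceQuota 10t /data/project

# After adding a node to (or listing one in) the excludes file, apply the change
hdfs dfsadmin -refreshNodes
```

These commands require a live cluster, so they are shown here only as a reference for the routine checks described above; Ambari and Cloudera Manager expose equivalent health and capacity views through their web UIs.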
Qualifications
Bachelor's degree in computer science, information systems, or a related field.