We are looking for an experienced Hadoop Administrator to manage, optimize, and support enterprise-level Big Data platforms running on Cloudera or Hortonworks distributions.
The ideal candidate will ensure platform stability, performance, and security across complex data ecosystems.
Key Responsibilities
Install, configure, and maintain Hadoop clusters (HDFS, YARN, Hive, Spark, Kafka, ZooKeeper).
Perform capacity planning, performance tuning, upgrades, and patch management.
Manage user access, Kerberos authentication, Ranger/Sentry policies, and data security controls.
Monitor system health and troubleshoot performance or node-related issues using Cloudera Manager or Ambari.
Implement backup, recovery, and disaster recovery strategies.
Automate administrative tasks via Shell/Python scripting and support CI/CD integration (a minimal scripting sketch follows this list).
Collaborate with data engineering and DevOps teams to optimize data workflows and ensure system reliability.
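To illustrate the kind of scripting automation described above, here is a minimal Python sketch that polls HDFS capacity via the standard hdfs dfsadmin -report command and warns when usage crosses a threshold. The threshold value and the alerting behaviour are illustrative assumptions, and the exact report formatting can vary between Hadoop versions; this is a sketch of the task, not a production tool.

# Minimal HDFS capacity check: assumes the hdfs CLI is on PATH for the
# executing user (e.g. the cluster admin account).
import re
import subprocess
import sys

USAGE_THRESHOLD_PCT = 80.0  # assumed alert threshold; adjust per cluster policy

def hdfs_used_percent() -> float:
    """Run the dfsadmin report and extract the cluster-wide 'DFS Used%' value."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    # The first 'DFS Used%' line in the report is the cluster-wide summary.
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    if not match:
        raise RuntimeError("Could not find 'DFS Used%' in dfsadmin report output")
    return float(match.group(1))

if __name__ == "__main__":
    used = hdfs_used_percent()
    if used >= USAGE_THRESHOLD_PCT:
        # In practice this would page or email the on-call administrator.
        print(f"WARNING: HDFS usage at {used:.1f}% (threshold {USAGE_THRESHOLD_PCT}%)")
        sys.exit(1)
    print(f"HDFS usage at {used:.1f}% - OK")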
Required Skills
Strong expertise in Hadoop ecosystem management (Cloudera/Hortonworks).
Hands-on experience with Linux (RHEL/Ubuntu), Kerberos, Ranger/Sentry, Jenkins, and Git.
Proficient in monitoring tools (Cloudera Manager, Geneos, CloudWatch).
Familiar with AWS (S3, EMR, IAM, VPC) and hybrid data environments.
Scripting experience in Shell/Bash/Python for automation.
Preferred Certifications: Cloudera Essentials for CDP