Hadoop Cluster Administrator - Kafka, Spark, Java, Scala
My client is a growing Data and Analytics service provider whose client base consists of global, high-profile organisations. Due to company growth, they are seeking a commercially minded and driven Hadoop Cluster Administrator to ensure their platform can be scaled internationally.
- 6 month contract
- Competitive day rate
The following are key responsibilities related to the role:
- Work alongside the Analytics team to build and manage our Hadoop platform and deliver the data required
- Liaise with the Ops team, advising them on tasks such as firewall and load-balancer changes
- Advise on and manage growth in scale as the customer base increases
- Take responsibility for keeping services online and available
- Monitor our security infrastructure, networks, and application tiers
The following skills and experience are essential:
- Strong experience supporting complex, large-scale Production environments
- Experience of Hortonworks HDP (Hortonworks Data Platform) in production environments
- Experience with data replication
- Knowledge of network security (FreeIPA, Kerberos, Ranger, Knox, etc.)
- Excellent understanding of Linux/UNIX administration
- Knowledge of relevant Hadoop ecosystem tools such as YARN, Kafka, Spark, Storm, Pig, HBase, HDFS, Flink, Hive, and Oozie
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you but you are looking for a new position, please contact us for a confidential discussion on your career.