Hadoop Developer

Posted 4 days ago by REED Business Support

An award-winning educational institution is looking for a Hadoop developer to join them and work on their greenfield Data Lake solution, taking on design and development responsibilities. If you want the opportunity to build something from the ground up and work on something really special, this could be the one for you.

Essential role requirements:

• At least 12 months' experience of Hadoop components, specifically Sqoop, Hive, Spark and Impala

• Extensive experience in modelling data for the purposes of analytics, including the definition of star schemas

• At least 2 years’ experience in the use of graphical ETL tools such as Talend and Informatica

• At least 1 year's experience in one or more of Java, Python, R and Scala, with proven experience in their use within the context of Spark and Spark MLlib

• Extensive experience of SQL and SQL performance tuning

• Experience of producing design documentation and specifications for data modelling and data transformation work

Desirable:

• Talend ETL/Talend Big Data

• Exposure to Oracle Warehouse Builder and PL/SQL

• An understanding of how to architect and maintain Hadoop clusters with knowledge of Cloudera Manager

• Experience of working in a DevOps environment and experience of agile frameworks such as Scrum

• Experience of data modelling toolsets such as Sparx EA

• Exposure to, or experience of, data mining and machine learning

Reference: 39495289

