Job Responsibilities

Role: Talend Developer

Job Description:

• Design, develop, validate, and deploy Talend ETL processes
• Must have used Hadoop (Pig, Hive, Sqoop) on the MapR distribution
• Responsible for the documentation of all Extract, Transform and Load (ETL) processes
• Maintain and enhance ETL code; work with the QA and DBA teams to resolve performance issues
• Collaborate with the Data Warehouse team to design and develop required ETL processes and to performance-tune ETL programs/scripts
• Work with business partners to develop business rules and business-rule execution logic
• Perform process improvement and re-engineering with an understanding of technical problems and solutions as they relate to the current and future business environment
• Design and develop innovative solutions for demanding business situations
• Analyze complex distributed production deployments and make recommendations to optimize performance
Essential Skills:
• Minimum 6 years’ previous Data Management experience
• Java proficiency is a must
• Minimum 2 years’ ETL experience; Talend and Big Data strongly preferred, though experience with Informatica or DataStage may be considered as an alternative
• Proficiency with MapR Hadoop distribution components and custom packages
• Proven understanding of and related experience with Hadoop, HBase, Hive, Pig, Sqoop, Flume, and/or MapReduce
• Excellent RDBMS (Oracle, SQL Server) knowledge for development using SQL and PL/SQL
• Basic UNIX OS and shell scripting skills
• Strong initiative with the ability to identify areas of improvement with little direction
• Team player excited to work in a fast-paced environment; Agile experience preferred
• Bachelor’s degree in computer science/data processing or equivalent

Apply now