Ab Initio (with Hadoop) Developer – 6 to 9 Years – Hyderabad
Duties & Responsibilities:
• Develop key modules independently using PDL and metaprogramming constructs across all Ab Initio components
• Develop Hive scripts, Unix shell scripts, and AutoSys jobs as needed
• Identify key business processes and deliver ETL processes to load physical data models
• Operationalize Ab Initio applications using Control Center
• Participate in day-to-day execution of the architecture strategy
• Source large volumes of data from diverse data platforms into the Hadoop platform
• Support or design complex ETL production environments
• Perform release management, including implementation planning
• Review functional requirement specifications with Business and Quality Analysts to align project objectives.
• Work within the project to develop, communicate, and mentor others on design/implementation standards, guidelines, and best practices
Skills, Experience & General Information Required:
• Hands-on ETL development (data transformations and data movement) using Ab Initio, with Acquire>It, BRE, ACE, Express>It, Control Center, DQE (Data Quality Environment), and Metadata Portal as key tools
• Expertise in the Hadoop ecosystem (HDFS with MapR preferred, Hive, HBase, MapReduce) for scalability, distributed computing, and high-performance computing
• Extensive HiveQL and SQL experience with multiple databases and ETL development
• Strong Unix shell scripting skills, plus application development and implementation experience
• Experience with scheduling/automation tools such as AutoSys, as well as code promotion tools
• System Development Life Cycle and application production support experience
• Ability to work with large volumes of data to derive business intelligence
• Experience in Ab Initio and big data (Hadoop) integration
• Experience with Python, Apache Spark, Core Java, or J2EE development is an additional plus
• Strong communication skills and a willingness to learn new technologies and applications are a must!