Ab Initio with Hadoop Developer

About Capgemini

With more than 211,300 people in over 40 countries, Capgemini is one of the world's foremost providers of consulting, technology and outsourcing services. The Group reported 2018 global revenues of EUR 13.2 billion. Together with its clients, Capgemini creates and delivers business, technology and digital solutions that fit their needs, enabling them to achieve innovation and competitiveness. A deeply multicultural organization, Capgemini has developed its own way of working, the Collaborative Business Experience™, and draws on Rightshore®, its worldwide delivery model.

  • Bachelor's degree in a quantitative field such as Engineering, Computer Science, Statistics, or Econometrics, and a minimum of 10 years of experience
  • Minimum 5 years' experience working with and developing big data solutions
  • Expertise in the following Ab Initio tools: GDE (Graphical Development Environment), Co>Operating System, Control Center, Metadata Hub, Enterprise Meta>Environment, Enterprise Meta>Environment Portal, Acquire>It, Express>It, Conduct>It, Data Quality Environment, and Query>It
  • Hands-on experience writing shell scripts, complex SQL queries, and Hadoop commands, and using Git
  • Ability to write abstracted, reusable code components; programming experience in at least two of the following languages: Scala, Java, or Python

Desired Characteristics

  • Strong business acumen, critical thinking, creativity, and performance-tuning experience
  • Experience developing with Hive, Sqoop, Spark, Kafka, and HBase on Hadoop; familiarity with Ab Initio, Hortonworks, ZooKeeper, and Oozie is a plus
  • Willingness to learn new technologies quickly; superior oral and written communication skills, as well as the willingness to collaborate across teams of internal and external technical staff, business analysts, software support, and operations staff
Job Responsibilities

Title – Ab Initio with Hadoop Developer

Location – Chicago, IL


We are seeking a Senior Data Engineer to be part of our scrum teams and perform functional system development for Hadoop applications for our Enterprise Data Lake initiative. This high-visibility, fast-paced, key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with our critical systems.

Essential Responsibilities:

  • Participate in the agile development process; develop functional and technical specifications from business requirements for the commercial platform
  • Ensure application quality and adherence to performance requirements
  • Help create project estimates and plans; represent the engineering team in project meetings and solution discussions
  • Participate in the code review process; work with team members to achieve business results in a fast-paced and quickly changing environment
  • Pair up with data engineers to develop cutting-edge analytic applications leveraging Big Data technologies (Hadoop, NoSQL, and in-memory data grids); mentor and influence up and down the chain of command
  • Perform other duties and/or projects as assigned



Posted on:

August 19, 2019

Experience level:

Experienced (non-manager)

Education level:

Bachelor's degree or equivalent

Contract type:

Permanent Full Time



Business units:



Financial Services

