Data Engineer – Melbourne

A global leader in consulting, technology services and digital transformation, Capgemini is at the forefront of innovation to address the entire breadth of clients’ opportunities in the evolving world of cloud, digital and platforms. Building on its strong 50-year heritage and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. It is a multicultural company with 200,000 team members in over 40 countries. In 2017 the Group achieved €12.8 billion in revenue. Learn more about us at www.capgemini.com.

Our Insights and Data team helps our clients make better business decisions by transforming an ocean of data into streams of insight. Our clients are among Australia’s top-performing companies, and they choose to partner with Capgemini for a very good reason – our exceptional people.

Due to continued growth within Capgemini’s Insights & Data practice, we intend to recruit a number of Data Engineers with relevant consulting and communication skills. If you are already working in a consultancy role, or have excellent client-facing skills gained within large organizations, we would like to discuss our consultant opportunities with you.

About the role

The Data Engineer will expand and optimise our clients’ data and data pipeline architecture, as well as their data flow and collection for cross-functional teams. Your responsibilities include:

  • Build robust, efficient and reliable data pipelines that ingest and process data from diverse sources into the Hadoop data platform.
  • Design and develop real-time streaming and batch processing pipeline solutions.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Design, develop and implement data pipelines for data migration & collection, data analytics and other data movement solutions.
  • Work with stakeholders including the Product Owner and data analyst teams to assist with data-related technical issues and support their data infrastructure needs.
  • Collaborate with Architects on architecture definition and technology selection.

About you

You will have the ability to optimise data systems and build them from the ground up. You will support software developers, database architects, data analysts and data scientists on data initiatives, and will ensure that the data delivery architecture remains consistent and optimal across ongoing projects.

Essential skills and experience

  • 2+ years of proven working experience as a Big Data engineer, preferably building data lake solutions that ingest and process data from various source systems on the AWS cloud
  • Experience with multiple Big Data technologies and concepts such as HDFS, NiFi, Kafka, Hive, Spark, Spark Streaming, HBase, EMR and Redshift on AWS
  • Experience in one or more of Java, Scala, Python and Bash.
  • Ability to work in a team within a diverse, multi-stakeholder environment
  • Experience working in a fast-paced Agile environment
  • BS in Computer Science, Statistics, Informatics, Information Systems or another quantitative field
  • Experience implementing test cases and test automation.
  • Experience applying DevOps, Continuous Integration and Continuous Delivery principles to build automated pipelines for deployment and production assurance on the data platform.
  • Willingness to share knowledge with immediate peers and to build communities and connections that promote better technical practices across the organisation.
  • Knowledge of and/or experience with Big Data integration and streaming technologies such as Kafka, NiFi and Flume (a brief illustrative sketch of this kind of streaming ingestion follows this list)
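
To give candidates a concrete feel for the streaming stack named above, here is a minimal, illustrative sketch of a Spark Structured Streaming job that consumes events from a Kafka topic and lands them on the data platform as Parquet. The broker address, topic name and S3 paths are placeholder assumptions, not details of any client engagement; it requires the Spark–Kafka connector package (org.apache.spark:spark-sql-kafka-0-10) on the classpath.

    # Illustrative sketch only: consume a hypothetical Kafka topic with Spark
    # Structured Streaming and write the raw payloads to Parquet.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("streaming-ingest-sketch").getOrCreate()

    # Read raw events from Kafka; the message value arrives as bytes.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "events-topic")                # placeholder topic
        .load()
        .select(col("value").cast("string").alias("payload"))
    )

    # Land the payloads as Parquet, with checkpointing for fault tolerance.
    query = (
        events.writeStream
        .format("parquet")
        .option("path", "s3://example-bucket/landing/events")               # placeholder path
        .option("checkpointLocation", "s3://example-bucket/checkpoints/events")
        .start()
    )

    query.awaitTermination()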

Ref: CAP/1398042/HA
Posted on: May 22, 2019
Experience level: Experienced (non-manager)
Education level: Bachelor's degree or equivalent
Contract type: Permanent
Location: Melbourne
Department: DS - I&D - Big Data
