Big Data Platform Engineer

A global leader in consulting, technology services and digital transformation, the Capgemini Group is at the forefront of innovation to address the entire breadth of clients’ opportunities in the evolving world of cloud, digital and platforms. Building on its strong 50-year heritage and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. It is a multicultural company of over 200,000 team members in more than 40 countries. The Group reported 2018 global revenues of EUR 13.2 billion. People matter, results count. Learn more about us at 


Let’s talk about the team:

Our Insights and Data team helps our clients make better business decisions by transforming an ocean of data into streams of insight. Our clients are among Australia’s top-performing companies, and they choose to partner with Capgemini for a very good reason – our exceptional people. Due to continued growth within Capgemini’s Insights & Data practice, we intend to recruit a Big Data Platform Engineer with relevant consulting and communication skills. If you are already working in a consultancy role, or have excellent client-facing skills gained within large organizations, we would like to discuss our consultant opportunities with you. 

Let’s talk about your qualifications and experience:  

The successful candidate will be responsible for the following:

  • Implementing, managing and administering a Cloudera-based Hadoop infrastructure
  • Working closely with the data science, BI and application teams to ensure that all big data applications are highly available and performing as expected
  • Monitoring and maintaining the health of Hadoop clusters
  • Capacity planning, resource management and security management
  • Performance tuning of jobs and clusters
  • Managing and reviewing Hadoop log files
  • Performing backup and recovery tasks

To be considered for this role, you must have: 

  • At least one end-to-end Big Data implementation project using Cloudera
  • Experience in administration of Hive, HBase, Spark, YARN, Impala, Kafka, etc.
  • Security administration – Kerberos, Active Directory, Ranger, Sentry, Cloudera Navigator
  • Experience in handling incidents and service requests
  • Strong scripting (Shell/Python) experience
  • DevOps knowledge/experience will be highly regarded
  • Good communication skills 

Please note, we can only consider Australian Citizens for this role.


What happens next and what can we offer you?

Interested? Passionate people are Capgemini’s Ace of spades. We believe that every one of us is an architect of positive futures. We invite you to join us to discover a career that will challenge, support and inspire you. Working at Capgemini, you’ll find the rewards are more than just financial. Not only will you work alongside inspiring colleagues with a world of experience, but you’ll also have access to great benefits including salary continuance insurance, paid parental leave, education assistance, salary packaging and the ability to purchase additional leave, as well as discounts on entertainment, financial and wellbeing services, travel and shopping. Talk to us about working part-time or full-time. 

Ranked among Ethisphere’s 2019 Most Ethical Companies in the World (for the 7th year running!), our seven values are at the heart



Posted on:

July 10, 2019

Experience level:


Education level:

Bachelor's degree or equivalent

Contract type:





DS - I&D - Big Data

