About our team:
Our Insights and Data team helps our clients make better business decisions by transforming an ocean of data into streams of insight. Our clients are among Australia’s top performing companies and they choose to partner with Capgemini for a very good reason – our exceptional people!
About our role:
You will expand and optimise our clients’ data and data pipeline architecture; build, optimise, and maintain conceptual and logical database models; and optimise data flow and collection for cross-functional teams.
- You will design and implement a big data platform on the AWS cloud to support a variety of use cases, including data science, machine learning, BI and reporting.
- You will define the strategy and architecture for migrating data from legacy systems to new big data solutions, including data pipelines for data collection, data analytics and other data movement solutions.
- You will assess and propose the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using big data ETL, database replication, application integration, SQL and NoSQL technologies.
- You will provide technical leadership to the project team across design-to-deployment activities, provide guidance, perform reviews, and prevent and resolve technical issues.
- You will work with stakeholders, including the Executive, Product, Data and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- You will assess and propose methods to improve system performance by conducting tests, troubleshooting and integrating new elements.
- You will define security and backup procedures.
Most importantly about you:
- You have demonstrable experience working with big data technologies such as Hadoop, Spark, Hive, HBase, Avro/Parquet and Sqoop.
- You have extensive experience supporting large production Hadoop platforms.
- You are experienced in SQL-based, NoSQL-based and MPP datastores.
- You are able to architect and design both streaming and batch use cases in a big data platform.
- You are an expert in building Big Data/Data Lake solutions using AWS services – S3, Kinesis, Lambda, EMR, Glue, Athena, Redshift, AWS CLI, etc.
- You develop a vision for information delivery and management and drive execution of the roadmap, including enterprise data architecture, big data, analytics and data management.
- You can mentor and support other team members to achieve team outcomes and facilitate a motivated and hard-working environment.
- You are great at building relationships with internal and/or external clients.
- You have previously written and produced technical documentation and knowledgebase articles.
- You think strategically about business, product, and technical challenges in an enterprise environment.
- You have demonstrable experience designing highly available, highly scalable production systems.
- Familiarity with the Elastic Stack, including Elasticsearch, Logstash and Kibana, is beneficial.
- You are familiar with ETL tools such as Informatica (PowerCenter, BDM) or Talend.
- Knowledge of architecture frameworks such as TOGAF will be highly regarded.
What we’re offering:
Working at Capgemini, you’ll find the rewards are more than just financial. Not only will you work alongside inspiring colleagues with a world of experience, but you will also have access to great benefits including salary continuance insurance, education assistance, salary packaging, the ability to purchase additional leave, as well as discounts on entertainment, financial and wellbeing services.