- Experience working in a complex cloud environment: Google Cloud Platform, AWS, or Azure.
- Ideally, you will have experience working with Python and Kubernetes.
- You have experience building data pipelines and ETL frameworks, both batch and real-time, using Python and GCP capabilities such as Apache Beam, Dataflow, and Data Fusion.
- You have experience using Terraform to build Infrastructure as Code.
- Experience working with big data technologies (Spark, Cloud SQL, BigQuery), preferably in an enterprise environment.
- Experience with Airflow is desirable
- You will have experience with automated CI/CD and testing tools and approaches.
- Experience working with operations and architecture groups to develop scalable and supportable solutions is desirable but not mandatory.
- You understand customer needs and use that understanding to make sound judgments.
- You pay attention to detail and demonstrate problem-solving capability to develop and deliver quality solutions.
- Understanding of the current-state landscape and relevant technologies.