Job Description
- Ideally you will have a background in software engineering, along with experience working in a complex cloud environment (preferably Google Cloud Platform, AWS, or Azure)
- Ideally you will have experience working with Python and Kubernetes
- You have experience building batch and real-time data pipelines and ETL frameworks using Python and GCP capabilities such as Apache Beam, Dataflow, and Data Fusion
- You have experience using Terraform to build Infrastructure as Code
- Experience working with big data technologies (Spark, Cloud SQL, BigQuery), preferably in an enterprise environment
- Candidates should have 5 to 10 years of IT experience
Primary Skills
- You will have experience with automated CI/CD and testing tools and approaches
- Experience working with operations and architecture groups to develop scalable, supportable solutions (desired but not mandatory)
- Understands customer needs in order to make sound judgments
- Pays attention to detail and demonstrates problem-solving capability to develop and deliver quality solutions
Secondary Skills
- Experience with Airflow is desirable
- Understanding of the current-state technology landscape and relevant technologies