AWS (with PySpark) | 4 to 6 years | Pune & Bengaluru

Job Description
  • The DataOps Engineer is a highly skilled expert in AWS Cloud
  • The DataOps role collaborates with software developers, system operators, and other IT staff members
  • The role will often cross the boundaries between software development, testing, and operations teams, and keep existing networks in mind while designing, planning, and testing
  • Typical work includes designing tools for managing infrastructure and writing clean, reusable, simple code
  • Developing code with extensive test coverage and performing continuous deployment in a professional software engineering environment
  • Working on various platforms with different programming languages and supporting the production cluster management system
  • Configuring server images and optimizing task performance in coordination with the engineers
Primary Skills
  • AWS-native design patterns, processes, and best practices
  • Expertise in PySpark / Python scripting
Secondary Skills
  • Nice to have: experience implementing DevOps practices and tools in areas such as continuous integration; ability to set up and work with EC2 instances
  • Experience building automated test pipelines and using Splunk, CloudWatch, and CloudTrail for alerts and notifications
  • Developer background: experience with one of the following languages or frameworks: Java, Spring, Ruby, Scala, J2EE
  • Build and deploy on platforms such as Buildkite, Jenkins, Travis CI, Bamboo, GitLab, TeamCity, Artifactory, Nexus, etc.
  • Experience working with or implementing monitoring and logging solutions such as ELK, Prometheus, Splunk, AppDynamics, Sumo Logic, New Relic, etc.
  • Experience with branching models in Git
Qualifications
The following qualifications would be advantageous:
  • Preferably an AWS Certified professional

Ref: 491225

Posted on: August 21, 2020

Experience level: Experienced

Contract type: Permanent

Location: Pune