AWS | 6 to 9 years | Hyderabad & Pune

Job Description
  • Minimum of 4 years' experience in a combination of data modeling and data architecture
  • Design, implement, and support data pipelines and ETL processes for the collection, storage, processing, and transformation of data
  • Deep understanding of and hands-on experience with modern data processing technologies in the AWS stack (Lambda, Python, DynamoDB, RDS, etc.)
  • Advanced SQL knowledge and experience working with relational databases, including SQL query authoring, as well as working familiarity with a variety of databases
  • Analyze and translate business requirements into functional and technical requirements and design
  • Develop data models and architect the data flow for the insurance application products
  • Understand and document project and product data
  • Capture and document requirements for metadata and transaction data elements
  • Coordinate between different teams and sources of information
  • Participate in project meetings and capture data requirements
Primary Skills
  • AWS (Lambda, AWS Glue, Step Functions for orchestration/automation, similar to Airflow)
  • Python / PySpark
Secondary Skills
  • DevOps
  • Snowflake

Ref: 476805

Posted on: September 30, 2020

Experience level: Experienced

Contract type: Permanent

Location: Hyderabad

Department: Financial Services