Snowflake Data Engineer

Job Overview

The purpose of this role is to lead the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS. This will include: data and analytics requirements definition, source data analysis and profiling, logical and physical design of the data lake and data warehouse, and design of data integration and publication pipelines.


  • Making recommendations on the use of ETL vs. ELT techniques and on the configuration, design, and administration of the Snowflake environment.
  • Analyzing the current analytics environment and making recommendations for appropriate data warehouse modernization and migration to the cloud.
  • Developing Snowflake deployment and usage best practices.
  • Helping educate the rest of the team on the capabilities, best practices, and limitations of Snowflake.


Required Skills & Experience

  • Must have a total of 2+ years of experience working as a Snowflake Data Engineer, preferably with 8 years in Data Warehouse, ETL, and BI projects.
  • Must have prior experience with end-to-end Snowflake cloud data warehouse implementations as well as end-to-end implementations of traditional data warehouses.
  • Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementation of complex stored procedures, and standard DWH and ETL concepts
  • Expertise in advanced Snowflake concepts, such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of when to use these features
  • Expertise in deploying Snowflake features such as data sharing
  • Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modelling techniques using Python
  • Experience with data migration from RDBMS platforms to the Snowflake cloud data warehouse
  • Deep understanding of relational and NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling)
  • Experience with data security, data access controls, and their design
  • Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
  • Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management


  • Education level – Degree in Computer Science, Information Systems or related field
  • Experience in Agile / Scrum projects advantageous
  • 5 to 8 years of industry experience


Posted on:

July 30, 2021

Experience level:

Experienced Professional

Contract type:

Permanent Full Time

Business units:

I and D Global Business Line


Big Data & Analytics