Test Data Engineer

Job Title: ETL Tester

Location: NYC (remote until COVID ends)

Job Description:

•    Overall 8 years of experience in IT
•    Minimum of 6 years of ETL, data warehouse, and data hub testing experience
•    Experience writing complex SQL and PL/SQL statements to test ETL code against the data mapping requirements provided, and performing extensive data analysis to identify defects (see the illustrative query after this list)
•    Strong data warehouse and BI concepts
•    Experience testing with Informatica or related ETL tools, Oracle databases, and BI tools such as OBIEE, plus experience working with UNIX/Linux
•    Experience with CCAR report testing is a huge plus
•    Experience working on large-scale enterprise data warehouse, data integration, data migration, and upgrade projects
•    Experience testing data warehouse systems and defining test approaches for data warehouse projects
•    Experience creating test plans, test cases, and engineering best practices for software test engineering, covering both manual and automated testing
•    Experience coordinating testing activities and optimizing test cycles in collaboration with project teams
•    Experience conducting and running defect triage meetings with project teams
•    At least 3-5 years of experience working with financial services applications
•    Strong command of the STLC process
•    Test strategy and planning, test estimation, effective project and people management skills, and exposure to knowledge management
•    Solid time management and prioritization skills
•    Excellent verbal and written communication skills
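
As a rough illustration of the SQL-based ETL testing called out above, the sketch below shows a minimal source-to-target reconciliation check. The table and column names (stg_customers, dw_customers, customer_id, email) are hypothetical placeholders for whatever the data mapping document specifies, and the syntax assumes an Oracle-style database as referenced in this posting.

    -- 1) Row-count reconciliation between the staging source and the warehouse target
    SELECT
        (SELECT COUNT(*) FROM stg_customers) AS source_count,
        (SELECT COUNT(*) FROM dw_customers)  AS target_count,
        (SELECT COUNT(*) FROM stg_customers)
          - (SELECT COUNT(*) FROM dw_customers) AS count_diff
    FROM dual;

    -- 2) Attribute-level check: rows whose mapped column disagrees between source and target
    SELECT s.customer_id,
           s.email AS source_email,
           t.email AS target_email
    FROM   stg_customers s
    JOIN   dw_customers  t ON t.customer_id = s.customer_id
    WHERE  NVL(s.email, '~') <> NVL(t.email, '~');
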
Additional good-to-have skills:
•    Passion for providing technical solutions and working with different testing tools
•    Experience with or knowledge of big data tools such as Spark, NiFi, Kafka, Denodo, and Hive; NoSQL databases such as HBase, Cassandra, and MongoDB; and BI tools such as Power BI, Zeppelin, etc.
•    Experience testing data integration pipelines built from heterogeneous source systems such as transactional databases and file systems, and formats such as JSON, delimited, COBOL, Parquet, and Avro
•    HDFS APIs and web services
•    Experience with source control tools such as GitLab or GitHub and building DevOps CI/CD pipelines or similar
•    Experience using data governance, metadata, and data lineage tools such as Schema Registry, Atlas, ABACUS, etc.
•    Experience with data masking, tokenization, and detokenization processes, and with testing them (see the sketch after this list)
•    Experience handling multiple assignments; a team player
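
Along the same lines, the sketch below is one hedged example of how masking/tokenization testing might be expressed in SQL: it flags target rows where a column that should have been tokenized still carries the raw source value. All object names (stg_customers, dw_customers, ssn, ssn_token) are illustrative assumptions, not part of this posting.

    -- Hypothetical masking/tokenization verification check
    SELECT s.customer_id,
           s.ssn       AS raw_ssn,
           t.ssn_token AS target_value
    FROM   stg_customers s
    JOIN   dw_customers  t ON t.customer_id = s.customer_id
    WHERE  t.ssn_token = s.ssn      -- tokenization left the raw value in place
       OR  t.ssn_token IS NULL;     -- or the tokenized column was left empty
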

Disclaimer 

Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.  

This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.  

Click the following link for more information on your rights as an Applicant – http://www.capgemini.com/resources/equal-employment-opportunity-is-the-law  

Applicants for employment in the US must have valid work authorization that does not now, and will not in the future, require sponsorship of a visa for employment authorization in the US by Capgemini.

Ref:

1112132

Posted on:

December 24, 2021

Experience level:

Experienced (non-manager)

Contract type:

Permanent Full Time

Business units:

FS

Department:

Financial Services