Work closely with client technical heads, business heads, and business analysts to understand and document business and technical requirements and constraints.
Create data collection, extraction, and transformation frameworks for structured and unstructured data.
Experience architecting data pipelines and solutions for both streaming and batch integrations using tools/frameworks such as AWS Glue ETL, AWS Lambda, Google Cloud Dataflow, Azure Data Factory, Spark, Spark Streaming, etc.
Organize data into formats and structures that optimize reuse and efficient delivery to business teams, analytics teams, and system applications.
Integrate data across the data lake, data warehouse, and system applications to ensure the consistent delivery of information across the enterprise.
Experience with column-oriented database technologies (e.g., BigQuery, Redshift, Vertica), NoSQL database technologies (e.g., DynamoDB, Bigtable, Cosmos DB), and traditional relational database systems (e.g., SQL Server, Oracle, MySQL).
Perform data gap analysis and data profiling, identify data lineage, and interpret and document data flow within the data pipelines.
5-7 years of experience with data engineering, migration, DW/BI, and reporting projects using query and scripting languages such as PL/SQL, SAS SQL, HQL, and Python.
Expert in Hadoop, Hive, Sqoop, Python, PySpark, Spark (Scala), Kafka, Redshift/Spectrum, Snowflake, and other related big data technologies, with relevant project experience.
Demonstrable experience with enterprise-level data platforms, including implementation of end-to-end data pipelines, with hands-on experience on at least one of the leading public cloud data platforms (Amazon Web Services, Azure, or Google Cloud).
Proven experience managing data warehouses and ETL pipelines; strong analytical skills and the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Demonstrated experience with data integration, data governance, and business intelligence technologies is considered an asset.
Strong understanding of data modeling principles and techniques (dimensional, data vault, etc.) based on established methodologies (e.g., Kimball).
Working experience handling big data and performing data manipulation.
Experience designing and migrating batch and streaming data from on-premises Hadoop/Oracle to Azure ADLS.
Able to create HQL scripts and work on Hive tables for data analysis.
Expert SQL development skills with ability to write complex efficient queries for data integration.
Working experience with cloud-native technologies.
Candidates should be flexible and willing to work across this delivery landscape, which includes but is not limited to Agile application development, support, and deployment.
Software Engineers perform requirements analysis. They then design, develop or maintain the physical application (components) or the application environment, based on the Software Architecture (models and principles). Activities include coding, integrating, implementing, installing or changing frameworks and standard components, or technical and functional application management. A Software Engineer also develops languages, methods, frameworks and tools, and/or undertakes activities in support of server-based databases in development, test and production environments.
Required Skills and Experience:
You have mastered several Software Engineering areas, applications or database environments. You act as a Software Engineering stream leader with technical delivery ownership within a (limited) number of technology areas. You contribute to bids or client proposals based on your technological expertise. You also act as a team leader with delivery ownership, and guide individuals and groups towards desired outcomes. You actively participate in at least one community and contribute to community discussions.
• Qualification: 7-10 years of experience (minimum 3 years of relevant experience in the role); Bachelor's degree.
• Certification: Should hold SE Level 1 certification and be seeking Level 2.
• Should be proficient in Business Analysis, Business Knowledge, Software Engineering, Testing, Data Management, Architecture Knowledge and Technical Solution Design.
Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.
This is a general description of the Duties, Responsibilities and Qualifications required for this position. Physical, mental, sensory or environmental demands may be referenced in an attempt to communicate the manner in which this position traditionally is performed. Whenever necessary to provide individuals with disabilities an equal employment opportunity, Capgemini will consider reasonable accommodations that might involve varying job requirements and/or changing the way this job is performed, provided that such accommodations do not pose an undue hardship.
Click the following link for more information on your rights as an Applicant – http://www.capgemini.com/resources/equal-employment-opportunity-is-the-law
Capgemini is a global leader in consulting, digital transformation, technology and engineering services. The Group is at the forefront of innovation to address the entire breadth of clients’ opportunities in the evolving world of cloud, digital and platforms. Building on its strong heritage of more than 50 years and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. Today, it is a multicultural company of 270,000 team members in almost 50 countries. With Altran, the Group reported 2019 combined revenues of €17 billion.
Visit us at www.capgemini.com. People matter, results count.