Big Data Architect

About Capgemini

With 180,639 employees in over 40 countries in Europe,
North America, Latin America and Asia-Pacific, the Group reported consolidated
revenues of €11,915 million in 2015. Together with its clients, Capgemini
creates and delivers business and technology solutions that fit their needs and
drive the results they want. A deeply multicultural organization, Capgemini has
developed its own way of working, the Collaborative Business Experience™, and
draws on Rightshore®, its worldwide delivery model.

Learn more about us at http://www.capgemini.com/.

 

Rightshore® is a trademark belonging to Capgemini.

 

Capgemini is an Equal Opportunity Employer encouraging
diversity in the workplace. All qualified applicants will receive consideration
for employment without regard to race, national origin, gender
identity/expression, age, religion, disability, sexual orientation, genetics,
veteran status, marital status or any other characteristic protected by law.

 

Big Data Architect

 

Location: Charlotte, NC

 

Duties & Responsibilities:

  • Design and implement scalable Big Data architecture solutions for various application needs.
  • Analyze multiple sources of structured and unstructured data to propose and design data architecture solutions for scalability, high availability, fault tolerance, and elasticity.
  • Develop conceptual, logical, and physical designs for various data types and large volumes.
  • Implement some or all of the Big Data systems in distributed cloud environments.
  • Implement security and encryption best practices for Big Data environments.
  • Architect, design, and implement high-performance, large-volume data integration processes, databases, storage, and other back-end services in fully virtualized environments.
  • Collaborate with customer teams to formulate the problem, recommend a solution approach, and design a data architecture.
  • Work closely with the product management and development teams to rapidly translate the understanding of customer data and requirements into products and solutions.
  • Participate in Rapid Application Development and Agile processes to deliver new cloud platform services and components.
  • Set architectural vision and direction across a matrix of teams.
  • Develop prototypes and proofs of concept (POCs).

Skills, Experience & General Information Required:

 

Basic Technical Requirements

 

Desirable Skills

  • 10+ years of experience as a technology leader designing and developing data architecture solutions, with 3+ years specializing in Big Data architecture or data analytics.

  • Strong, in-depth experience in data modeling and with business intelligence systems (dimensional modeling, data mining, predictive analytics).

  • Knowledge of and experience with Big Data technologies such as Hadoop, NoSQL, MapReduce, and other industry Big Data frameworks.

  • Broad-based architecture acumen: database architecture, ETL, SOA, cloud, etc.

  • Data Platform experience (MapR).

  • Comfortable in multi-terabyte production environments.

  • Highly proficient with large data sets and clusters.

     

    Hands-On Technical Competencies

  • Good experience with NoSQL (MongoDB/Cassandra).

  • Good experience with Hive scripts, MapReduce, and handling incremental data.

  • Good experience with Spark, Pig, Sqoop, Kafka, and Flume for extracting near-real-time data and event messaging.

  • Good experience offloading/loading data between RDBMS and HDFS.

  • Java is a big plus.

  • Experience working with cloud storage solutions in AWS, Azure, etc. is a plus.

     

Qualifications

  • 5+ years - Data Integration design and implementation

  • 3+ years - Large-scale solution architecture and design skills in Big Data and ETL platforms

  • 3+ years - Big Data/AWS/Azure cloud implementation

  • Experience in the insurance industry is a plus