Technical Specialist - Big Data


With more than 180,000 people in over 40 countries, Capgemini is a global leader in
consulting, technology and outsourcing services. The Group reported 2015 global
revenues of EUR 11.9 billion. Together with its clients, Capgemini creates and
delivers business, technology and digital solutions that fit their needs,
enabling them to achieve innovation and competitiveness. A deeply multicultural
organization, Capgemini has developed its own way of working, the Collaborative
Business Experience™, and draws on Rightshore®, its worldwide delivery model.


Rightshore® is a trademark belonging to Capgemini.

Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All
qualified applicants will receive consideration for employment without regard
to race, national origin, gender identity/expression, age, religion,
disability, sexual orientation, genetics, veteran status, marital status or any
other characteristic protected by law.

This is a general description of the Duties, Responsibilities and Qualifications
required for this position. Physical, mental, sensory or environmental demands
may be referenced in an attempt to communicate the manner in which this
position traditionally is performed. Whenever necessary to provide individuals
with disabilities an equal employment opportunity, Capgemini will consider
reasonable accommodations that might involve varying job requirements and/or
changing the way this job is performed, provided that such accommodations do
not pose an undue hardship.

Location: Littleton, CO

Skills: Hadoop, Spark, Kafka, Scala, Data streaming, HDFS

- Gather and process raw data at scale (including writing scripts, etc.).
- Technical expertise on Hadoop, Spark, Scala.
- Work closely with our engineering team to integrate your innovations and algorithms into our production systems.
- Process unstructured data into a form suitable for analysis, and then do the analysis.
- Support business decisions with ad hoc analysis as needed.
- Experience in the core storage domain and Linux systems.
- Experience using technologies in the Hadoop ecosystem such as Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop, Kafka.
- Experience building stream-processing systems.
- Experience with integration of data from multiple data sources.
- Experience with NoSQL databases, such as Cassandra.
- Experience with various messaging systems, such as Kafka or RabbitMQ.
- Experience with Cloudera/Hortonworks.
- Storage, Linux, or virtualization certification is a plus.
- Technical architecture of storage products.
- Design, develop, and implement Hadoop-based big data solutions.
- Experience applying interpersonal abilities to client-facing systems implementation.
- Experience with high-availability configurations, Hadoop cluster connectivity and tuning, and Hadoop security configurations.
- Good understanding of relational databases and solid SQL skills.
- Work directly with clients to achieve their business goals.
- Identify gaps and opportunities for improvement of existing solutions.
- Define and develop APIs for integration with various data sources in the enterprise.
- Be an active, hands-on team member, collaborating with other developers and architects in developing client solutions.
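For candidates unfamiliar with the MapReduce model named above, the following is a minimal sketch of the map/shuffle/reduce word-count pattern that Hadoop and Spark implement at scale. It is written in plain Python with no framework dependencies purely for illustration; the function names and sample input are hypothetical, not part of this role's toolchain.

```python
# Minimal sketch of the MapReduce pattern (plain Python, no Hadoop/Spark needed).
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the raw input.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a single count.
    return {word: sum(values) for word, values in groups.items()}

raw = ["big data needs big clusters", "data streaming at scale"]  # hypothetical input
counts = reduce_phase(shuffle_phase(map_phase(raw)))
print(counts["big"], counts["data"])  # prints: 2 2
```

In Hadoop or Spark the same three phases run distributed across a cluster, with HDFS supplying the input splits and the framework handling the shuffle over the network.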