Big Data Developer

About Capgemini

With more than 170,000 people in over 40 countries, Capgemini is one of the world's foremost providers of consulting, technology and outsourcing services. The Group reported 2014 global revenues of EUR 10.5 billion. Together with its clients, Capgemini creates and delivers business and technology solutions that fit their needs and drive the results they want. A deeply multicultural organization, Capgemini has developed its own way of working, the Collaborative Business Experience™, and draws on Rightshore®, its worldwide delivery model.

Learn more about us at http://www.capgemini.com/.

 

Rightshore® is a trademark belonging to Capgemini.

Capgemini is an Equal Opportunity Employer encouraging diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, national origin, gender identity/expression, age, religion, disability, sexual orientation, genetics, veteran status, marital status or any other characteristic protected by law.

 

Must-Have Skills:

Minimum of 1.5 to 2 years of Big Data (Hadoop) experience.

5+ years of overall IT experience in Data Management, Data Warehousing, Business Intelligence, and Analytics.

Multiple (2+) project implementations with Big Data technologies such as Hadoop, NoSQL, and MPP Data Warehouse platforms.

Hands-on experience designing, developing, and maintaining software solutions in a Hadoop production cluster.

Demonstrated broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed in Hadoop production environments.

Demonstrable, relevant hands-on experience with a Big Data platform based on the Hadoop ecosystem, including HDFS operations, Hive, Pig, HCatalog, Oozie workflows, and NoSQL data stores, particularly key-value stores such as HBase.

Hands-on experience with Apache Spark, Flume, Sqoop, Scala, and Kafka is a must.

Experience with build tools (Maven, sbt, Ant, or any other build tool), code migration techniques, and production deployment strategies.

Knowledge of working in a UNIX environment with a good amount of shell scripting.

 

Preferred Skills:

Knowledge of data warehousing and the ETL layer is preferred.

Knowledge of Spring, Hibernate, Core Java, and MapReduce is a plus.

Location: NYC, NY