Scala/Spark Developer for Global Insurance Client

Job Overview

Capgemini is building a big data hub for our client using the latest Hadoop framework and Spark. Data from various sources will be ingested into the data hub, where it will be cleaned, transformed, and used for analysis.
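The ingest → clean → transform flow described above could look roughly like the following in Spark. This is a minimal, hypothetical sketch only: the source format, paths, and column names are illustrative assumptions, not details from the client project.

```scala
// Hypothetical pipeline sketch; paths and column names are illustrative.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object DataHubPipeline {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DataHubPipeline")
      .getOrCreate()

    // Ingest: read raw records from a source system (CSV assumed here).
    val raw: DataFrame = spark.read
      .option("header", "true")
      .csv("/data/raw/source_a")

    // Clean: drop incomplete rows and normalise a text column.
    val cleaned = raw
      .na.drop()
      .withColumn("name", trim(lower(col("name"))))

    // Transform: aggregate into a shape suitable for analysis.
    val summary = cleaned
      .groupBy(col("category"))
      .agg(count("*").as("record_count"))

    // Load: persist the result into the data hub as Parquet.
    summary.write.mode("overwrite").parquet("/data/hub/summary")

    spark.stop()
  }
}
```

In a real project each stage would typically be factored into its own function so it can be unit-tested in isolation, in line with the TDD expectations listed under Responsibilities.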

Responsibilities

  • Develop Scala and Spark code on a day-to-day basis.
  • Write unit tests and practice Test-Driven Development.
  • Manage code using Git/Bitbucket and participate in code reviews.
  • Package and prepare code for deployment.
  • Prepare the functional design and documentation of your code.
  • Participate in requirements gathering and in team meetings on coding best practices.

Requirements

  • Scala programming: 5+ years
  • Apache Spark development: 3+ years
  • Familiarity with Git (Bitbucket) and SDLC tooling (Bamboo, Artifactory, SonarQube) is preferred
  • UNIX/network/shell scripting: 3+ years
  • Hadoop knowledge is preferred

Ref:

2019-JP-132

Posted:

October 18, 2019

Experience level:

Experienced (non-manager)

Education level:

Bachelor's degree or equivalent

Contract type:

Permanent

Work location:

Tokyo

Department:

IT Solutions
