Hadoop Developer for Global Insurance Client

Job Overview
Our client is a leading insurance company in Japan that is building a big data hub on the latest Hadoop framework. Data from various sources will be ingested into the data hub, where it will be cleaned, transformed, and used for analysis. To integrate different source systems, such as mainframes and Oracle databases, the team will use Informatica PC/DQ/PWX (PowerCenter, Data Quality, PowerExchange) and MDM tools.

Responsibilities

  • Requirements analysis
  • Data ingestion using Sqoop and automated Python scripts (see the sketch after this list)
  • Data transformation using Spark and HiveQL
  • Troubleshooting and optimization of complex queries
  • Supporting Informatica developers with Hadoop-related issues
  • Basic Hadoop administration and job monitoring
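
As a rough illustration of the ingest-and-transform flow described above, the sketch below wraps a Sqoop import from Oracle in a Python script and then cleans the landed data with Spark SQL (HiveQL-style). This is a minimal sketch only; the connection string, credentials, table names, and HDFS paths are hypothetical placeholders, not details from this posting.

```python
#!/usr/bin/env python3
"""Minimal sketch: Sqoop ingestion followed by a Spark SQL transformation.

All names (Oracle DSN, source table, HDFS paths, Hive table) are
hypothetical placeholders; a real job would read them from configuration.
"""
import subprocess

from pyspark.sql import SparkSession

# --- Ingestion: pull a table from Oracle into HDFS with Sqoop ---
# Assumes sqoop and the Oracle JDBC driver are available on the edge node.
sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:oracle:thin:@//oracle-host:1521/ORCL",  # hypothetical DSN
    "--username", "etl_user",
    "--password-file", "/user/etl/.oracle_pw",  # keeps credentials off the command line
    "--table", "POLICIES",                      # hypothetical source table
    "--target-dir", "/data/raw/policies",
    "--as-parquetfile",
    "--num-mappers", "4",
]
subprocess.run(sqoop_cmd, check=True)

# --- Transformation: clean the raw data with Spark SQL (HiveQL-style) ---
spark = (
    SparkSession.builder
    .appName("policy_ingest_transform")
    .enableHiveSupport()  # lets Spark read and write Hive tables
    .getOrCreate()
)

raw = spark.read.parquet("/data/raw/policies")
raw.createOrReplaceTempView("raw_policies")

# Simple cleaning rules for illustration; real rules would come from requirements analysis.
cleaned = spark.sql("""
    SELECT policy_id,
           TRIM(UPPER(region))            AS region,
           CAST(premium AS DECIMAL(12,2)) AS premium
    FROM raw_policies
    WHERE policy_id IS NOT NULL
""")

cleaned.write.mode("overwrite").saveAsTable("datahub.policies_clean")  # hypothetical Hive table
spark.stop()
```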

Requirements

  • In-depth understanding of distributed environments
  • Working experience with the Hadoop framework, including HDFS, Hive, HBase, MapReduce, Pig, Oozie, and Tez
  • Experience with RDBMSs and SQL
  • Experience working with large data sets and with distributed computing (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.)
  • Experience with data ingestion using Sqoop; understanding of CDC technologies and Apache NiFi
  • Experience with Unix/Linux shell scripting, Python, and Java
  • Data transformation using Spark, including streaming (see the sketch after this list)
  • Experience in ETL
  • Good communication skills
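
For the streaming requirement above, a minimal sketch of a Spark Structured Streaming transformation is shown below. The Kafka topic, record schema, and output paths are hypothetical, and running it would require the spark-sql-kafka connector package on the cluster.

```python
"""Hypothetical sketch: streaming transformation with Spark Structured Streaming.

Topic name, schema, broker address, and paths are placeholders, not part of
the posting. Requires the spark-sql-kafka connector on the classpath.
"""
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("claims_stream").getOrCreate()

# Expected shape of each JSON event (hypothetical).
schema = StructType([
    StructField("claim_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read a Kafka topic of claim events (hypothetical source).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "claims")
    .load()
)

# Parse the JSON payload and keep valid records only.
claims = (
    events.select(from_json(col("value").cast("string"), schema).alias("c"))
    .select("c.*")
    .where(col("claim_id").isNotNull())
)

# Append cleaned records to the data hub as Parquet.
query = (
    claims.writeStream
    .format("parquet")
    .option("path", "/data/clean/claims")
    .option("checkpointLocation", "/data/checkpoints/claims")
    .start()
)
query.awaitTermination()
```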

Ref: 2019-JP-135
Posted: October 28, 2019
Experience level: Experienced (non-manager)
Education level: Bachelor's degree or equivalent
Contract type: Permanent
Work location: Tokyo
Department: IT Solutions
