Hadoop Developer for Global Insurance Client

Job Overview
Our client is a leading insurance company in Japan that is building a big data hub using the latest Hadoop framework. Data from various sources will be ingested into the data hub, where it will be cleaned, transformed, and used for analysis. To integrate different source systems such as mainframes and Oracle databases, we will use Informatica PC/DQ/PWX (PowerCenter, Data Quality, PowerExchange) and MDM tools.
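As an illustration of the ingestion work involved, a Sqoop import from an Oracle source can be automated with a small Python wrapper along the lines of the sketch below; the connection string, password file, table name, and target directory are hypothetical placeholders, not project specifics.

```python
# Minimal sketch: automating a Sqoop import of one Oracle table into HDFS.
# All connection details, paths, and names below are illustrative only.
import subprocess

def sqoop_import(table: str, target_dir: str) -> None:
    """Run a Sqoop import for a single Oracle table."""
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL",  # hypothetical host/service
        "--username", "etl_user",
        "--password-file", "/user/etl/.oracle_pw",  # keep credentials out of the script
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", "4",  # parallel map tasks for the import
    ]
    subprocess.run(cmd, check=True)  # raise if the import fails

if __name__ == "__main__":
    sqoop_import("POLICIES", "/data/raw/policies")
```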

Responsibilities

  • Requirement analysis
  • Data ingestion using Sqoop and automated Python scripts
  • Data transformation using Spark and HiveQL (see the sketch after this list)
  • Troubleshooting and optimization of complex queries
  • Helping Informatica developers with Hadoop issues
  • Basic Hadoop administration and job monitoring
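
As referenced above, a transformation step of this kind might look like the following minimal PySpark sketch; the database, table, and column names (insurance_hub, policies_raw, policies_clean, policy_id, premium) are hypothetical.

```python
# Minimal sketch: cleanse a raw Hive table with Spark and query it with HiveQL.
# Database, table, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("policy-cleanse")
    .enableHiveSupport()  # allows reading/writing Hive tables and running HiveQL
    .getOrCreate()
)

# Read a raw table landed by the ingestion job
raw = spark.table("insurance_hub.policies_raw")

# Basic cleansing: drop duplicate keys, enforce types, remove null keys
clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("premium", F.col("premium").cast("decimal(12,2)"))
       .filter(F.col("policy_id").isNotNull())
)

# Persist the curated table for downstream analysis
clean.write.mode("overwrite").saveAsTable("insurance_hub.policies_clean")

# The same data can then be queried with HiveQL through spark.sql(...)
spark.sql("""
    SELECT product_line, COUNT(*) AS n_policies
    FROM insurance_hub.policies_clean
    GROUP BY product_line
""").show()
```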

Requirements

  • In-depth understanding of distributed environments
  • Working experience with the Hadoop framework, including HDFS, Hive, HBase, MapReduce, Pig, Oozie, and Tez
  • Experience with RDBMSs and SQL
  • Experience working with large data sets and with distributed computing (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.)
  • Experience with data ingestion using Sqoop; understanding of CDC (change data capture) technologies and Apache NiFi
  • Experience in Unix/Linux shell scripting, Python, and Java
  • Data transformation using Spark, including Spark Streaming
  • Experience in ETL
  • Good communication skills

Ref: 2019-JP-135

Posted on: October 27, 2019

Experience level: Experienced (non-manager)

Education level: Bachelor's degree or equivalent

Contract type: Permanent

Location: Tokyo

Department: IT Solutions
