Hadoop Developer for Insurance Client

Responsibilities

  • Providing engineering support for incoming projects and operations
  • Designing, developing, and maintaining Scala/Spark applications on the Hadoop ecosystem
  • Supporting database performance tuning in consultation with the client
  • Supporting engineering security inspections
  • Basic Hadoop administration and job monitoring

Requirements

  • Sound understanding of the Hadoop ecosystem
  • Hands-on experience with Hadoop components: HDFS, Hive, HBase, MapReduce, Phoenix, Solr, Oozie
  • 3+ years of experience in Scala, with strong coding expertise in Scala and Spark 2.0
  • Good understanding of database concepts, RDBMS, and SQL
  • Experience with database performance tuning for Oracle and SQL Server
  • Experience with Unix/Linux shell scripting, Python, and Java
  • Good knowledge of Git and sbt, and of IDEs such as Eclipse or IntelliJ
  • Experience with data transformation using Spark, including streaming

Language Proficiency:

  • Fluent English; Japanese is a plus

Location: Tokyo

Ref:

2020-JP-117

Posted:

May 12, 2021

Experience level:

Experienced (non-manager)

Education level:

Bachelor's degree or equivalent

Contract type:

Permanent

Work location:

Tokyo

Department:

Computers/Software