Data Modeler for Global Insurance Company

Requirements and Qualifications:

Responsibilities

  • Understand and translate business needs into data models supporting long-term solutions
  • Work with the Application Development team to implement data strategies, build data flows and develop data models
  • Create logical and physical data models using best practices to ensure high data quality and reduced redundancy
  • Optimize and update logical and physical data models to support new and existing projects
  • Maintain logical and physical data models along with corresponding metadata
  • Develop data models following standard naming conventions and best practices to ensure consistency with company standards
  • Recommend opportunities for reuse of data models in new environments/projects
  • Evaluate data models and physical databases for variances and discrepancies
  • Validate business data objects for accuracy and completeness
  • Analyze data-related system integration challenges and propose appropriate solutions
  • Guide System Analysts, Engineers, Programmers and others on project limitations and capabilities, performance requirements and interfaces
  • Review modifications to existing databases to improve efficiency and performance
  • Past experience with the ACORD Reference Architecture as an enterprise architecture framework for the insurance industry
  • At least one implementation converting logical data models to columnar schemas in HBase or a similar setup to enable data processing and querying patterns
  • Discover source-database metadata, including value patterns and distributions, key candidates, foreign-key candidates, and functional dependencies

Requirements

  • Strong data modeling skills using Erwin (required)
  • Experience working with HDFS, HBase, Solr and RDBMSs (Oracle/SQL Server)
  • Proven knowledge of and experience with data structures and data management
  • Experience with SQL and NoSQL databases
  • Extensive knowledge of the Hadoop ecosystem and Spark is a plus
  • Exposure to new platforms and the ability to adapt to a continuously evolving ecosystem of multiple moving components, including Hive, HBase, Zeppelin, Spark, Livy, Solr, Qlik Sense, Power BI and other BI tools
  • Basic engineering skills, such as a background in mathematics or statistics, and analytical thinking (able to drive reporting and analytical requests)
  • Knowledge of graph theory and graph databases is a plus

Language Proficiency:

  • Business-level English; Japanese is good to have

Location: Tokyo

Ref:

2020-JP-145

Posted on:

May 6, 2021

Experience level:

Experienced (non-manager)

Education level:

Bachelor's degree or equivalent

Contract type:

Permanent

Department:

Computers/Software