Hadoop Enterprise Architect - Global Life Insurance Client

Duties and Responsibilities:
- Strong data management / data strategy experience, covering design and architecture of Data Warehouse, Data Mart, Data Quality, Cloud, Big Data, and Data Modelling solutions/concepts.
- Bring industry best practices and thought leadership in the data management / big data space.
- Bring in-depth expertise in the Data Architect discipline to support the program; lead the consulting engagement from onsite.
- Support solutioning and proposals for RFIs/RFPs.
- Support the client on technology evaluations.
- Liaise with key stakeholders on critical decisions.

Requirements and Qualifications:
Experience:
• 10+ years of experience in data technology, data architecture, or enterprise architecture roles.
• Experience leading a small-to-medium team of 3 to 10 members.
• 5+ years of experience with big data technologies (Hadoop, Hive, WebHadoop).
• 3+ years of experience in system development and project delivery.

Skills:
• Strong knowledge of data models, with hands-on data modeling experience.
• ETL knowledge is mandatory (e.g., Informatica, IBM DataStage, DataSpider).
• Understanding of index-organized tables and materialized views.
• Big data technology: Hadoop, Hive, WebHadoop.

Preferable to have:
• Understanding of overall insurance business semantics.
• Experience designing insurance New Business / Policy Admin / Claims systems or data models.
• Experience with customer portal and CRM data models.
• Experience and knowledge of migrating from DB2/Oracle to PostgreSQL on AWS Aurora.
• Data replication/consolidation/migration experience.
• Project management skills.

Ref:

2022-JP-392

Posted:

June 16, 2022

Experience level:

Experienced (non-manager)

Education level:

Bachelor's degree or equivalent

Contract type:

Permanent

Work location:

Tokyo

Department:

Computers/Software