The client is a leading insurance company in Japan that is building a big data hub on the latest Hadoop framework. Data from various sources will be ingested into the data hub, then cleaned, transformed, and used for analysis.
Roles and Responsibilities:
- Install and configure Hortonworks clusters
- Apply proper architecture guidelines to ensure highly available services
- Plan and execute major platform software and operating system upgrades and maintenance across physical environments
- Develop and automate processes for maintenance of the environment
- Implement security measures for all aspects of the cluster (SSL, disk encryption, role-based access via Apache Ranger policies)
- Ensure proper resource utilization between the different development teams and processes
- Design and implement a toolset that simplifies provisioning and support of a large cluster environment
- Review performance stats and query execution/explain plans; recommend changes for tuning Apache Hive queries
- Create and maintain detailed, up-to-date technical documentation
Requirements and Qualifications:
- Two years of experience working with Apache Hadoop
- In-depth knowledge of Apache Hadoop and MapReduce
- Experience with Apache HBase and Hive
- Experience with Linux
- Ability to write Linux shell scripts
- Ability to troubleshoot problems and quickly resolve issues
- A master’s degree in computer science or computer information systems is an advantage, but not mandatory
Proficiency in Japanese is a must. English is also required, at least at business level in both reading and writing.