Role: Hadoop Admin
Exp: 6 to 9 Years
• We are looking for a Hadoop Admin to design and develop consumer-centric, low-latency analytic applications leveraging Big Data technologies for our Enterprise Data Lake initiative.
Responsibilities:
• Outlining job flows.
• Managing Hadoop log files.
• Scheduling and supervising Hadoop jobs using a scheduler.
• Performing cluster coordination services via ZooKeeper.
• Assisting with MapReduce programs running on the Hadoop cluster.
• Pre-processing data using Hive and Pig.
• Designing, developing, installing, configuring, and maintaining Hadoop.
• Preserving security and data privacy.
• Supporting high-speed querying.
• Managing and deploying HBase.
• Helping build new Hadoop clusters.
• Recommending and establishing best practices.
• Automating deployment and configuration using open-source frameworks.
• Acting as the subject matter expert for Big Data platforms and technologies.
• Working across IT teams to ensure code quality, performance, and scalability of deployed data products.
• Performing other duties and/or special projects as assigned.
Requirements:
• Bachelor's degree with a minimum of 2 years of IT experience in a quantitative field (such as Engineering, Computer Science, Statistics, or Econometrics), or, in lieu of a degree, a High School Diploma/GED and 5 years of experience in a quantitative field with programming (Java/J2EE) and data-management experience.
• Minimum 2 years of experience deploying BI and analytics solutions using Big Data technologies (such as MapReduce, Kafka, and HBase) in complex, large-scale environments (preferably 20 TB+).
• Minimum 2 years of experience in at least 3 of the following: Pig, Sqoop, MapReduce, Kafka, Spark, Java.
• Experience with the Hortonworks Hadoop 2.4.x distribution.
• Demonstrated excellent planning and organizational skills
• Engaging personality with experience collaborating across teams of internal and external technical staff, business analysts, software support, and operations staff.
• Experience with Agile project management methods and practices.
• Familiarity with traditional BI solution architecture encompassing ETL, CEP, DW, and BI reporting (preferably in a Unix/Linux and Oracle environment).