Hadoop Developer (Kafka) – 4 to 6 yrs – Hyderabad
– Work closely with the Solution Architect and Infrastructure Architect to deliver technologies and services including CDC, Kafka, NiFi, and other emerging technologies to product teams.
– Partner with the software development team to implement best practices and optimize the performance of data applications.
– Participate in the analysis of new requirements and develop solutions and services to support them.
– Research new Big Data technologies, assessing their maturity and alignment with business and technology strategy.
– Work closely with the data architecture and infrastructure architecture teams to finalize the solution design document.
– Build and set up the Hortonworks DataFlow platform and CDC service based on the design document.
• Experience operating CDC, Kafka, NiFi, Kinesis, distributed stores (such as HBase, Hive, Presto), and file systems (such as S3/HDFS)
• Previous experience scaling up big data technologies with zero downtime would be a plus
• Knowledge of the deployment and maintenance of Java back-end applications