Big Data + ETL Developer (4 to 6 yrs) - Pune

Job Responsibilities

Job Role: Big Data + ETL Developer

Experience: 4 to 6 yrs

• Hands-on experience with Big Data technologies.
• Design and implement workflows using Unix/Linux scripting to perform data ingestion and ETL (Ab Initio) on a Big Data platform.
• Excellent understanding of Hadoop architecture and its components, including HDFS, JobTracker, TaskTracker, high availability, MapReduce, Spark RDDs/programming, Hive, Pig, Kafka, and Flume.
• Must have excellent programming knowledge of Spark.
• Provide hands-on leadership for the design and development of ETL (Ab Initio) data flows using Big Data Ecosystem tools and technologies.
• Lead analysis, architecture, design, and development of data warehouse and business intelligence solutions.
• Define Cloud Data strategies, including designing multi-phased implementation roadmaps.
• Work independently, or as part of a team, to design and develop Big Data solutions.
• Certification in AWS (S3, Redshift, Elastic MapReduce (EMR)) or Cloudera is a plus.

Ref:

200540

Posted on:

July 17, 2018

Experience level:

Experienced (non-manager)

Education level:

Bachelor's degree or equivalent

Contract type:

Permanent

Location:

Pune

Department:

Financial Services
