Big Data Developer

Spark Developer – 4 to 6 Years – Mumbai location

Hi there,

We have openings for a Hadoop Developer with 4 to 6 years of total experience at the Mumbai location.

Contact: Varun (varun-shah.kapadia@capgemini.com)

Job Responsibilities
  • Data modeling experience with ERwin or an equivalent tool; normalized, denormalized, star/snowflake, and Kimball/Inmon design concepts.
  • 3 to 5 years of strong hands-on, real-time big data development experience with Hortonworks, Hadoop, Hive, Spark, PySpark, Scala, Sqoop, Kafka, Druid, deep SQL, and data analysis (see the illustrative sketch after this list).
  • Statistics background; data architecture on big data technologies.
  • Experience working in multiple Data Warehouse roles: analysis (desired), reporting (desired), ETL, and DBA.
  • Experience working in the financial industry is a plus.
  • 3 years working on logical and physical data models; 6 years working in a Data Warehouse environment (big data / Hadoop); 2 years working on Data Governance / Data Management covering topics such as data quality, data stewardship, data security, metadata, and data retention (desired).
  • Ability to work with huge volumes of data to derive business intelligence: analyze data, uncover information, derive insights, and propose data-driven strategies.
  • Working knowledge of languages and tools such as Java, Pig, Hive, Python, R, Spark, Scala, Kafka, Confluent, NiFi, and Spark Streaming.
  • Knowledge of installing, configuring, maintaining, performance tuning, and securing Hadoop and its ecosystem.
  • Solid hands-on experience in architecting database and NoSQL database solutions.
  • Primary skills: Hadoop, Hive, Spark, PySpark, Scala, R, Sqoop, Kafka, Druid, Hortonworks, NiFi, Java, Pig, and Python.
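
For illustration only, a minimal PySpark Structured Streaming sketch of the kind of real-time Kafka-to-Spark pipeline the responsibilities above describe. The broker address, the "events" topic, and the event schema are assumptions made for the example, not details from this posting.

# Minimal sketch, assuming a local Kafka broker, a hypothetical "events" topic,
# and the spark-sql-kafka connector package available on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

# Hypothetical event schema; a real pipeline would follow the agreed data model.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read the Kafka topic as a streaming DataFrame.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "events")
       .load())

# Parse the JSON payload and print the parsed rows to the console.
parsed = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

query = parsed.writeStream.format("console").outputMode("append").start()
query.awaitTermination()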

Ref: 363643

Posted on: January 23, 2020

Experience level: Experienced

Contract type: Permanent

Location: Bangalore
