Job Location: Chennai
Experience: 5+ Years
Primary Skill: Hadoop
Requirements
- Good understanding of distributed computing principles
- Experience working on Big Data projects in the telecom space
- Extensive development expertise in Spark and other Big Data processing frameworks (Hadoop MapReduce, Storm, etc.); a brief illustrative sketch follows this list
- Experience with the installation, configuration, and implementation of Big Data frameworks
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- End-to-end understanding of the data management lifecycle
- Ability to assess the platform and propose recommendations and optimization approaches
- Experience with Big Data ML toolkits, such as SparkML or H2O
- Knowledge of various ETL techniques and frameworks, such as Flume
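As a rough illustration of the Spark development and Hive-style querying skills listed above, the sketch below aggregates telecom call records with PySpark. It is purely illustrative and not part of the role description; the HDFS paths, view name, and column names (msisdn, call_duration) are hypothetical placeholders.

```python
# Minimal PySpark sketch: batch processing plus a Hive-style SQL query.
# Paths, view name, and columns (msisdn, call_duration) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("telecom-cdr-aggregation")
    .enableHiveSupport()  # allows querying Hive-managed tables if available
    .getOrCreate()
)

# Load call detail records from a (hypothetical) landing zone in HDFS.
cdrs = spark.read.parquet("hdfs:///data/telecom/cdrs/")

# DataFrame-style aggregation: total call minutes per subscriber.
usage = (
    cdrs.groupBy("msisdn")
        .agg(F.sum("call_duration").alias("total_seconds"))
        .withColumn("total_minutes", F.col("total_seconds") / 60)
)

# Equivalent Hive-style SQL over a temporary view.
cdrs.createOrReplaceTempView("cdrs")
usage_sql = spark.sql("""
    SELECT msisdn, SUM(call_duration) / 60 AS total_minutes
    FROM cdrs
    GROUP BY msisdn
""")

# Persist the aggregated result back to HDFS.
usage.write.mode("overwrite").parquet("hdfs:///data/telecom/usage_summary/")
spark.stop()
```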