Requirements
- Solid experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)
- Experience developing backend components in Python/Java/C++, including RESTful APIs, with knowledge of JSON-based messaging formats
- Experience in Java and/or Python programming
- Good understanding of software development methodologies and processes
- Expertise with Hadoop ecosystem technologies and tools (HDFS, YARN, HBase, Spark, Flume, Kafka, etc.)
- Experience automating standardized Hadoop deployments (Ansible, Puppet, Forge, etc.)
- ETL experience (programming and modeling) preferred
- Experience with microservices-based architecture
Experience: 8+ years
Work location: Hyderabad