Roles and Responsibilities:
- BTech or MTech in Computer Science or a related technical discipline (or equivalent).
- 9-10 years of experience working in a distributed product development environment, with 3+ years of experience designing data pipelines, ETL, complex event processing, and analytics components using big data technologies (Hadoop/Scala/Spark/Cassandra/MongoDB/Kafka/Pig/Hive).
- Should have hands-on experience developing two enterprise products using big data technologies, both live in production.
- Should have excellent technical knowledge of distributed platforms, acquired through hands-on development experience.
- Extremely strong in one technology area (big data or distributed systems design), with technical breadth across multiple areas.
- In addition to big data product development experience, should have developed at least 2-3 enterprise products with strong SI capabilities, from conceptualization through go-live and post-go-live support.