Roles and Responsibilities
1. He/she will be responsible for building strong technical depth in the team, working on end-to-end product development, architecture frameworks, and product features.
2. He/she should be a thought leader who can challenge the status quo and lead ideation and prototyping activities to tune the current product stack. A key part of the responsibility would be to enhance the resilience of the product in production, identifying NFRs and getting them incorporated into the product. The architect will work closely with agile engineering teams to take the product to market.
3. Provide technical leadership in the Big Data space (Hadoop stack: MapReduce, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores such as Cassandra, HBase, etc.). Work closely with product management and the development team to understand functional requirements, propose an architecture and NFRs, and develop proofs of concept to validate technical unknowns.
4. Work along with the senior architect in the team and play a key role in developing architecture frameworks, evaluating NFRs, and proposing and developing technical improvements.
5. Should be hands-on and build key modules in the product - frameworks and features. Work with the development team to build a part of the product and take it to production. Be the technical SME in the team and lead design reviews and code reviews to ensure that code quality is impeccable. Support troubleshooting the product in production for high-severity issues impacting customers.
6. Analyze the current solution stack, proactively identify architecture improvement opportunities, prepare proposals and prototypes, guide the team in building the NFRs for increased production stability, automate processes as much as possible, and reduce manual maintenance effort.
7. Evaluate and recommend the Big Data technology stack for the platform.
8. Use technical influence to drive deliveries, innovation, and engineering standards/best practices across Engineering.
9. Drive significant technology initiatives end to end and across multiple layers of architecture.
10. Drive operational excellence through root cause analysis and continuous improvement for Big Data technologies and processes.
11. Provide technical leadership and be a role model to data engineers.
Skills & Knowledge - Technical / Functional and Managerial
- 9-10 years of experience working in a distributed product development environment, with 3+ years of experience designing data pipelines, ETL, complex event processing, and analytics components using big data technology (Hadoop/Scala/Spark/Cassandra/MongoDB/Kafka/Pig/Hive). Should have hands-on experience developing two enterprise products using big data technologies, and the products should be live in production.
- Should have excellent technical knowledge of distributed platforms acquired via hands-on development experience: extremely strong in one technology area (big data, distributed systems design) with technical breadth across multiple areas. Apart from the big data product development experience, should have developed at least 2-3 enterprise products with strong SI capabilities, from conceptualization to go-live and post-go-live support.
- Should have exposure to the complete product development lifecycle, including production support, troubleshooting performance issues in production, and stability NFRs.
- Should have prior experience working with product management to prioritize roadmap items for delivery, working closely with PM teams to build the product and take it to market, and participating in pre-sales workshops to understand and address market gaps.
- Should have excellent communication, presentation, and negotiation skills. Should be assertive, able to resolve conflicts, and able to gain consensus in a cross-functional team.
- Full knowledge of Hadoop Architecture and HDFS is a must
- Proven track record of architecting and building large-scale, distributed big data solutions.
- Hands-on experience with big data technologies (Hadoop, Spark), NoSQL stores such as Cassandra, CouchDB, and HBase, and relational databases such as SQL Server and MySQL.
- Demonstrable excellence in innovation, problem-solving, analytical skills, data structures, and design patterns
- Designed and built the full lifecycle of a big data solution, including requirement analysis, technical architecture design, solution design, solution development, testing, and deployment.
- Experience with enterprise Java and web applications.
- Strong design and coding skills
- Past experience working on agile teams; Scrum Master experience or PMI-ACP certification is a plus.
- Hands-on knowledge of Spark/Scala, Cassandra, and MongoDB is a plus.
- Knowledge of travel domain is a plus
Interested candidates can apply or feel free to reach me at 9035004698.