Industry - IT
Category - IT & Systems
Job Type - Permanent
Job Description - Define the business roadmap for a next-generation Big Data platform and deliver competitive products/services on the cloud. We are looking for architects who have worked in a Big Data data lake R&D environment and on cloud-based analytics layers, and who have contributed to maximizing query performance, designed and implemented ACID transactions and scalable metadata handling, and built solutions unifying streaming and batch data processing.
Client Details
Our client is one of the world's largest multi-billion-dollar global technology companies. They are hiring a Chief Architect - Data Lake for one of their biggest R&D centres.
Description
- Define the short-term and long-term business roadmap for a next-generation Big Data platform that addresses enterprise customers' pain points on the cloud and delivers competitive products/services.
- Drive technology innovation and patents in the Big Data and cloud computing domains.
- Actively engage with headquarters in business plan and roadmap preparation; drive optimizations and enhancements to Big Data cloud Data Lake service solutions.
- Architect and design next-generation Big Data solutions for enterprise customers across public and hybrid cloud offerings.
- Understand the challenges of enterprise Big Data customers; design and innovate technologies and solutions that leverage the Cloud Data Lake by driving various POCs in collaboration with internal and external partners.
- Guide a team of senior architects working on Big Data kernel projects.
Profile
- 10+ years of relevant experience and a very good understanding of the Big Data ecosystem; experience in business development, systems engineering, and product/platform architecture.
- Experience building and optimizing cloud data lakes for enterprise Big Data use cases.
- Expert in core Java; hands-on with design and implementation.
- Experienced in open-source-based commercial project delivery processes and engineering methods.
- Excellent analytical and problem-solving skills.
- Familiar with data lake platforms such as Snowflake, Databricks Delta Lake, Azure Data Lake, Qubole, Google BigQuery, etc.
- Experience with the architecture, design, and development of large-scale distributed systems.
- Experience with the internals of core Big Data components: Spark, Presto, Hadoop, HBase, Hive, Kafka, ES. Preferably has contributed to the design and code of one or more of these components in the open-source community.
Job Offer
- Opportunity to architect the core cloud kernel for the world's leading cloud company
- Excellent remuneration and learning opportunities for the right candidate
For a confidential discussion about this role, please contact Jyoti Agrawal on +91 080 6826 6822.
The Apply button will redirect you to the client's website. Please apply there as well.