Industry - IT
Category - IT & Systems
Job Type - Permanent
Job Description:
- Define the business roadmap for a next-generation Big Data platform and deliver competitive products/services on the cloud. We are looking for architects who have worked in a Big Data data lake R&D environment and on cloud-based analytics layers, and who have contributed to maximizing query performance, designed and implemented ACID transactions and scalable metadata handling, and built solutions unifying streaming and batch data processing.
- Drive technology innovation and patents in the Big Data and cloud computing domain.
- Actively engage in business plan and roadmap preparation with headquarters; drive optimizations and enhancements to Big Data cloud data lake services and solutions.
- Architect and design next-generation Big Data solutions for enterprise customers and for public and hybrid cloud offerings.
- Understand the challenges of enterprise Big Data customers; design and innovate technologies and solutions that leverage the cloud data lake, driving various POCs in collaboration with internal and external partners.
- Guide a team of senior architects working on Big Data kernel projects.
Requirements:
- Experience in building and optimizing cloud data lakes for enterprise Big Data use cases.
- Expert in core Java; hands-on with design and implementation.
- Experienced in open-source-based commercial project delivery processes and engineering methods.
- Excellent analytical and problem-solving skills.
- Familiar with data lake platforms such as Snowflake, Databricks Delta Lake, Azure Data Lake, Qubole, and Google BigQuery.
- Experience with large-scale distributed systems architecture, design, and development.
- Experience with the internals of core Big Data components (Spark, Presto, Hadoop, HBase, Hive, Kafka, Elasticsearch); preferably has contributed to the design and code of one or more of these components in the open-source community.
- Excellent remuneration and learning opportunities for the right candidate.