Job Views: 42 | Applications: 3 | Recruiter Actions: 2
Posted in: IT & Systems
Job Code: 1067007

Abzooba - Azure Solution Architect

10 - 15 Years | Pune
Posted 2 years ago

Primary Skills (Must have):

- At least 10 years of experience working with one or more ETL/Big Data tools and cloud computing platforms, preferably Azure.

- Strong working experience with Azure services such as ADF, ADLS Gen2, Azure Databricks, Synapse Analytics, Event Hub, Cosmos DB, and Azure Stream Analytics.

- At least 5 years of experience in Data warehousing and designing solutions on Modern Data Lake.

- At least 5 years of experience working in Big Data Technologies

- Strong understanding of and hands-on experience with the Big Data stack (HDFS, Sqoop, Hive, Java, etc.)

- Big Data solution design and architecture

- Design, sizing, and implementation of Big Data platforms.

- Experience extracting data from feeds into a Data Lake using Kafka and other open-source components (a minimal ingestion sketch follows this list).

- Understanding of data ingestion patterns and experience building pipelines.

- Experience working on production-grade projects with terabyte- to petabyte-scale data sets.
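For illustration only, here is a minimal PySpark Structured Streaming sketch of the kind of Kafka-to-data-lake ingestion mentioned above. It assumes a Spark 3.x cluster with the Kafka connector and ADLS Gen2 access already configured; the broker, topic, container, and path names are hypothetical placeholders rather than anything specified by the role.

```python
# Minimal sketch, assuming Spark 3.x with the spark-sql-kafka connector on the
# classpath and ADLS Gen2 credentials configured on the cluster.
# Broker, topic, and storage paths below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-datalake").getOrCreate()

# Read the raw feed from Kafka as a streaming DataFrame.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to string before landing in the lake.
events = raw.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

# Append into the raw zone of the data lake (Parquet here; Delta is also common).
query = (
    events.writeStream
    .format("parquet")
    .option("path", "abfss://raw@examplelake.dfs.core.windows.net/events/")
    .option("checkpointLocation", "abfss://raw@examplelake.dfs.core.windows.net/_checkpoints/events/")
    .start()
)
query.awaitTermination()
```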

Secondary skills (Good to have):

- Experience with any other cloud platform such as AWS or GCP is an added advantage.

- Working knowledge of implementing streaming data solutions.

- Experience implementing data governance.

- Experience with cloud data warehouses such as Snowflake (a minimal sketch of loading Snowflake from Spark follows this list).
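For illustration, the Snowflake point above could look something like the following minimal sketch: pushing a curated data-lake table into Snowflake with the Snowflake Spark connector. It assumes the spark-snowflake connector is installed on the cluster; the account URL, credentials, warehouse, table, and lake path are hypothetical placeholders.

```python
# Minimal sketch, assuming the Snowflake Spark connector (spark-snowflake) is
# available on the cluster. All connection values and names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake-to-snowflake").getOrCreate()

# Read a curated table from the data lake (hypothetical path).
curated = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")

# Connection options for the Snowflake connector (placeholder values).
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "********",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

# Write the curated data into a Snowflake table.
(
    curated.write
    .format("net.snowflake.spark.snowflake")  # "snowflake" is a shorthand alias on Databricks
    .options(**sf_options)
    .option("dbtable", "ORDERS")
    .mode("overwrite")
    .save()
)
```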

Roles & Responsibilities:

Total Experience: 10+ years

- Proficient in Spark (Python/Scala)

- Hands-on with Databricks Spark

- Experience designing and implementing a modern data lake on the Azure platform (an illustrative batch-curation sketch follows this list).

- Proficient with data-focused Azure services such as ADF, ADLS, Azure Databricks, Synapse Analytics, Azure SQL, Event Hub, Event Grid, Azure Stream Analytics, Power BI, and Azure Functions, as well as data security considerations and best practices on the Azure platform.

- Proficient with the latest Big Data tools/technologies such as Hive, Hadoop, YARN, NiFi, Kafka, and Spark Streaming.

- Proficiency in any of the programming languages: Scala, Python, or Java

- Good to have: working knowledge of Presto.

- Good to have: experience with Apache NiFi.

- Must have conceptual knowledge of Data Lake and ETL

- Has implemented complex projects dealing with considerable data sizes (TB/PB) and high complexity in production environments.

- Demonstrated strength in data modeling, ETL development, and data warehousing.
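For illustration, a minimal Databricks-style batch-curation sketch in the spirit of the responsibilities above: read raw data from the lake, apply light cleansing, and publish a Delta table for downstream Synapse / Power BI consumption. It assumes Delta Lake is available on the cluster; the paths and the order_id, amount, and order_ts columns are hypothetical.

```python
# Minimal batch-ETL sketch, assuming a Databricks cluster with Delta Lake.
# Paths and column names (order_id, amount, order_ts) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("curate-orders").getOrCreate()

# Read raw ingested data from the lake's raw zone.
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders/")

# Light cleansing: drop rows missing key fields and derive a partition column.
curated = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("order_date", to_date(col("order_ts")))
)

# Publish to the curated zone as a partitioned Delta table for Synapse / Power BI.
(
    curated.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```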

Required Skillset:

- Good analytical and problem-solving skills for design, creation, and testing of programs.

- Good communication skills to interact with team members and support personnel, and to provide technical guidance and expertise to customers and management.

- Good interpersonal skills to interact with customers and team members.

- Should be ready to explore or work with Cloud technologies.

- Ability to work in a self-directed work environment.

- Ability to work in a team environment.
