Highlights:
- One of the top partners of Databricks, AWS, Microsoft, and Qlik.
- Awarded "Rising Star" at the APJ Databricks Partner Awards 2023.
- Winner of Qlik's "Most Enabled Partner Award - APAC."
- Recognized among the "Most Trusted AI Solutions Providers" by The Enterprise World, and a registered member of the NASSCOM community.
Get to know more about us on our website: http://www.exponentia.ai/ and Life @Exponentia.
Job Details:
Position: Data Architect - Azure
Experience: 8+ Years
Location: Pune
Required Skills: Azure Delta Lake (ADL), Azure Databricks (ADB), PySpark, SQL, Python
Role Overview:
Exponentia.ai is seeking a skilled and experienced Data Architect with expertise in Azure to join our dynamic team. The Data Architect is responsible for designing the technical solution and overseeing its implementation on Azure/AWS and Databricks, including the high-level and low-level designs, data models, and related aspects of the solution. The role involves client-facing work on solutions and architecture approvals, as well as conducting code reviews to ensure best practices are implemented.
Job Responsibilities:
- Interact with stakeholders to understand the data landscape, conduct discovery exercises, and develop proofs of concept and demonstrate them to stakeholders.
- Engage with the sales team and clients early in the project lifecycle; the engagement may be planned as presales or discovery workshops, as required.
- Design the data solution, including the delta lake, ELT/ETL, data models, data quality management, DataOps practices, and data governance, and guide the implementation team.
- Conduct code reviews, upskill team members where needed, and ensure best practices are implemented.
- Apply design best practices in data modelling (logical and physical) and in ETL pipelines (streaming and batch) using cloud-based services.
Technical Skills:
- More than 4 years of experience developing data lakes and data warehouses on Azure.
- Proven skills in Azure data services such as ADLS, Azure Data Factory, Synapse, SQL Server, SSIS, and Databricks/Apache Spark, along with DevOps using Terraform and proficiency in Python.
- Solid understanding of data modeling, data quality, DevOps, and data governance practices, as well as system architectures and design patterns, with the ability to design and develop applications using these principles.
- Familiarity with data warehousing concepts and technologies.
- Excellent communication and interpersonal skills.
- Ability to work collaboratively in a fast-paced, dynamic environment.
Personality Traits:
- Good collaboration and communication skills
- Excellent problem-solving skills, with the ability to structure the right analytical solutions.
- Strong sense of teamwork, ownership, and accountability
- Analytical and conceptual thinking
- Ability to work in a fast-paced environment with tight schedules.
- Good presentation skills with the ability to convey complex ideas to peers and management.
Education/Qualification:
MBA/BE/MCA/MS/M.Tech/B.Tech.
Why join Exponentia:
- Expand your knowledge and work with cutting-edge technologies.
- Opportunity to work and collaborate with some of the best minds in the field.
- Learn from industry experts.
- Get certified in the latest technologies and platforms.
- Get access to networks of OEM partners and business leaders who are setting new standards at the cutting edge of technology.