Posted By

Job Views:  
105
Applications:  25
Recruiter Actions:  3

Posted in

IT & Systems

Job Code

1236359

Decision Point - Manager - Data Engineering

Posted 1 year ago

- You will lead and manage project delivery and be responsible for meeting project and team goals.

- Build and support data ingestion and processing pipelines. This entails extracting, loading, and transforming data from a wide variety of sources using the latest data frameworks and technologies.

- Design, build, test, and maintain machine learning infrastructure and frameworks to empower data scientists to rapidly iterate on model development.

- Own and lead client engagement and communication on technical projects. Define project scopes and track project progress and delivery.

- Plan and execute the project architecture and allocate work across the team.

- Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with increasing data volume.

- Partner with software engineering teams to drive completion of multi-functional projects.

Requirements:
- Minimum 4 years of overall experience in data engineering, including 2+ years leading a team and managing projects.

- Strong knowledge of advanced SQL, data warehousing concepts, and data mart design.

- Strong experience in Python; PySpark/Spark experience is a plus.

- Hands-on knowledge of at least one cloud platform (AWS/Azure/GCP). A Data Engineering or Solution Architect certification is a plus.

- Experience working with a global team and remote clients.

- Hands-on experience building data pipelines on various infrastructures.

- Knowledge of statistical and machine learning techniques, with hands-on experience integrating machine learning into data pipelines.

- Ability to work hands-on with the team's data engineers on the design and development of solutions using the relevant big data technologies and data warehouse concepts.

- Experience setting up and maintaining data warehouses (Google BigQuery, Redshift, Snowflake) and data lakes (GCS, AWS S3, etc.) for an organization.

- Experience building data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow.

- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB.

- Strong problem-solving and communication skills.
