Discipline - Business Operations
Industry - Business Analysis
Job Responsibilities:
- Represent the organization as the principal customer contact and provide project leadership
- Work with customers and product line management to identify, refine, and translate customer needs into concrete technical requirements
- Demonstrate leadership and team-building abilities
- Design and implement data platforms that meet large-scale, high-performance, and scalability requirements, integrating data from multiple sources, managing structured and unstructured data, and blending in existing warehouse structures
- Take end-to-end responsibility for the Hadoop life cycle within the project
- Act as the bridge between data scientists, engineers, and project needs
- Conduct in-depth requirements analysis and take sole ownership of selecting the working platform
- Full knowledge of Hadoop Architecture and HDFS is a must
- Ensure the chosen Hadoop solution is deployed without hindrance
- Interact with management and senior customer personnel on matters requiring coordination across organizational lines
- Collaborate with cross-functional teams (infrastructure, network, database, application) on activities such as new hardware/software deployment, environment setup, and capacity uplift
- Work with various teams to set up new Hadoop users, security, and platform governance
- Work with project team members to propagate knowledge and efficient use of the Hadoop tool suite, and participate in technical communications within the team to share best practices and learn about new technologies and other ecosystem applications
- Ability to adapt quickly to a changing environment
- Candidate should be a self-motivated, independent, detail-oriented, responsible team player
Required Skill Set:
- 10+ years of experience in IT/Analytics industry.
- Experience in big data technologies (Hadoop ecosystem components, Spark, Columnar Databases, Java, Python, Machine Learning, Web Services, CI/CD and Integration tools)
- Experience leading teams and/or managing workloads for team members
- Data warehouse modernization experience on a Hadoop cluster is a must
- Understand the current data mart and warehouse architecture and migrate it effectively to a Hadoop cluster with efficient processing
- Experience in data migration from relational databases to Hadoop HDFS, as well as CI/CD and automation tools
- Experience supporting implementations and driving them to a stable state in production
- Good understanding of file formats such as Parquet, ORC, JSON, and SequenceFile
- Good understanding of handling streaming or real-time data
- Openness to working on proofs of concept, learning and operating new tools, and innovating on new ideas
- Excellent communication skills
- Ability to manage client interactions effectively
- Ability to lead a medium-sized team
- Application performance tuning and troubleshooting
The Apply button will redirect you to the company's website. Please apply there as well.