Responsibilities:
- Understand architecture requirements and ensure effective design for handling large-scale databases, analyzing data patterns to support sound, accurate business decisions.
- HDFS implementation, support and maintenance.
- Team diligently with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
- Development, validation, and support activities.
- Maintain a good understanding of the technology and domain.
- Lead the team toward its desired goals.
- Ensure continual knowledge management.
- Adhere to organizational guidelines and processes.
Technical and Professional Requirements:
- At least 8 years of hands-on design and development experience with Big Data technologies: Pig, Hive, MapReduce, HDFS, HBase, YARN, Spark, Oozie, Java, and shell scripting.
- Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
- Must have strong programming knowledge of Core Java or Scala.
- Must have hands-on experience designing, implementing, and building applications or solutions using Core Java/Scala.
- Strong understanding of Hadoop fundamentals.
- Strong understanding of RDBMS concepts, with good knowledge of writing SQL and of interacting programmatically with RDBMS and NoSQL databases such as HBase.
- Strong understanding of Hadoop file formats such as Parquet.
- Proficient with application build and continuous integration tools: Maven, SBT, Jenkins, SVN, Git.
- Experience working in Agile environments and with the Rally tool is a plus.
- Knowledge of JavaBeans, annotations, logging (log4j), and generics is a plus.
- Experience in the financial domain is preferred.
Karthik - Senior Level and Leadership hiring