Opportunity for all those who are desperately looking for a job change and are able to attend a face-to-face interview this Saturday. (It may change your career.)
JD:
- Good understanding of data warehouse concepts and design patterns
- Strong experience with Core Java, or experience with HDFS, MapReduce, and other tools in the Hadoop ecosystem
- Strong knowledge of and hands-on experience with the MapReduce programming model and high-level languages like Pig or Hive
- Experience with NoSQL data stores like HBase and Cassandra
- Understands various configuration parameters and helps arrive at values for optimal cluster performance
- Knowledge of configuration management / deployment tools like Puppet or Chef
- Experience setting up cluster monitoring and alerting tools like Ganglia, Nagios, etc.
- Experience in setting up cross-data-center replication
- Mentors the team on performance optimization strategy and creates best practices
- Understands how a security model using Kerberos and an enterprise LDAP product works, and helps implement it
- Strong experience with big data technology platforms
- Must have worked on 3-4 Hadoop projects
- Should have Hadoop cluster sizing experience
- Should have knowledge of other big data tools: Pig, Hive, HBase, Ambari
- Should have excellent communication and presentation skills
Punam Poddar