• Good understanding of data warehouse concepts and design patterns
• Strong experience with Core Java, or with HDFS, MapReduce, and other tools in the Hadoop ecosystem
• Strong knowledge of and hands-on experience with the MapReduce programming model and high-level languages such as Pig and Hive
• Experience with NoSQL data stores such as HBase and Cassandra
• Hands-on experience with Solr and indexing of structured and unstructured data on Hadoop clusters
• Understands Hadoop configuration parameters and helps determine values for optimal cluster performance
• Knowledge of configuration management and deployment tools such as Puppet or Chef
• Experience setting up cluster monitoring and alerting with tools such as Ganglia and Nagios
• Understands how the Hadoop security model works, including Kerberos and enterprise LDAP integration, and helps implement it
• Exposure to additional Hadoop security projects such as Apache Knox and Apache Sentry
• Experience setting up cross-data-center replication
• Mentors the team on performance optimization strategies and establishes best practices