• Bachelor’s degree or foreign equivalent required. One year of relevant work experience will also be considered in lieu of each year of education.
• At least 4 years of experience in DW/BI, including a minimum of 2 years in Hadoop Administration (Cloudera/Hortonworks/Pivotal distributions) and 1 year in Hadoop MapReduce, Pig, Hive, Sqoop, HBase, Spark, Scala, and Impala.
• At least 1 year with monitoring and configuration-management tools such as Nagios, Ganglia, Chef, and Puppet.
• Should have experience with Cloudera Manager, Ambari, or Pivotal Command Center.
• Require experience with capacity planning and performance tuning in large-scale Hadoop environments.
• Should have experience with Hadoop cluster security implementations such as Kerberos, Knox, and Sentry.
• Require experience with cluster design, configuration, installation, patching, upgrading, and High Availability support.
• At least 2 years’ experience with the overall Hadoop ecosystem (HDFS, MapReduce, Pig/Hive, HBase, etc.).
• Monitor Hadoop cluster connectivity and security.
• Manage and review Hadoop log files.
• File system management and monitoring.
• HDFS support and maintenance.
• At least 3 years of experience working with Big Data tools: MapReduce, Pig, Hive, Sqoop, Spark, Scala, and NoSQL.
• At least 2 years of experience in Hadoop cluster administration.
• Strong analytical skills.
• Data modelling, design & implementation based on recognized standards.
• Software installation and configuration.
• Database backup and recovery.
• Database connectivity and security.
• Performance monitoring and tuning.
• Disk space management.
• Software patches and upgrades.
• Automate manual tasks.
• Experience with, and desire to work in, a global delivery environment.