• Bachelor’s degree, or foreign equivalent, from an accredited institution is required. Three years of progressive experience in the specialty will also be considered in lieu of each year of education.
• At least 4 years of experience in the IT industry.
• At least 4 years of experience in requirements gathering, analysis, architecture, design, data modeling, and development of data warehouse (DW) and ETL solutions using Ab Initio.
• At least 3 years of experience in the architecture, design, and implementation of large, complex Big Data and Hadoop solutions involving Ab Initio or similar products.
• At least 4 years of strong hands-on Java coding experience.
• At least 2 years of experience with the Ab Initio ETL tool, including hands-on HMFS work on a Big Data Hadoop platform.
• At least 2 years of experience implementing ETL/ELT processes with Big Data tools such as Hadoop, MapReduce, HDFS, Pig, and Hive.
• At least 1 year of hands-on experience with NoSQL databases (e.g., key-value stores, graph databases, document databases).
• At least 2 years of solid experience in performance tuning and shell/Perl/Python scripting.
• Experience with Spark
• Experience integrating data from multiple data sources.
• Knowledge of various ETL techniques and frameworks
• At least 2 years of experience in project life-cycle activities on development and maintenance projects.
• Ability to work on a team in diverse, multi-stakeholder environments.
• Experience in the finance and banking domain.
• Strong analytical skills.