Job Title: Big Data Developer
Duration: 6 Months
Location: Atlanta, GA
Must have Java/Scala, Kafka, Spark, and Hadoop experience.
• 7-10 years of experience building data analytics platforms using Big Data technologies from the Hadoop ecosystem
• Ability to architect and build Big Data infrastructure
• Proficient in Hadoop, MapReduce, HDFS, and Hive
• Experience with real-time analytics using stream-processing frameworks such as Spark or Storm (mandatory)
• At least two implementations of analytics on streaming data
• Experience with Spark, Spark Streaming and Spark SQL
• Solid experience working with the Kafka messaging system
• Good knowledge of Big Data querying tools such as Pig, Hive, and Impala
• Knowledge of Sqoop, Flume, and NiFi preferred
• Experience with Cassandra, MongoDB, and similar platforms preferred
• Experience implementing Master Data Management or Customer 360 analytics using Hadoop technologies preferred
• Java/Scala knowledge required.
• Excellent communication skills with both technical and business audiences
Please submit only profiles that match these requirements.