Can you help build new data pipelines, identify existing data gaps, and provide automated solutions that deliver advanced analytical capabilities and enrich data for applications supporting our client's operations team?
Do you have a strong analytical mind that can work through complicated problems?
Are you a self-starter who works with minimal supervision, collaborates well in a team with diverse skill sets, and has a basic understanding of cybersecurity domains?
Do you have a strong foundation of understanding and work experience with ...
The Hadoop/Big Data field
Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, SOLR, Indexers, etc.
Programming experience in languages such as Java, Scala, Python, PHP, or shell scripting
End-to-end design and build of near-real-time and batch data pipelines
SQL and Data modelling (very strong experience)
Your foundation of work must include ...
Responsibility for obtaining data from the System of Record and establishing a real-time data feed to provide analysis in an automated fashion
Experience working in an Agile development process and a deep understanding of the various phases of the Software Development Life Cycle
Experience using source code and version control systems such as SVN and Git
Deep understanding of the Hadoop ecosystem and strong conceptual knowledge in Hadoop architecture components
Ability to comprehend customer requests and provide the correct solution
Desire to dive into potential issues and resolve them
Bachelor's degree required.